This video about Gitlab was posted recently and is a very interesting case study on how a company can function normally while having all of its employees working remotely.
Living outside of the big cities or big technology/financial hubs can feel synonymous with being left out of everything that's going on and being far away from many opportunities for your own professional development. In those hubs the ecosystem is vibrant, with lots of events and meetups, focused either on networking or on sharing knowledge about a given topic.
Unfortunately for those living far away from these centers, even though remote working is more prevalent nowadays, we are still not able to attend and be part of these things regularly.
That is why I see with great pleasure the recent growth of this kind of activity in Funchal. Institutions like the University of Madeira, the Madeira Interactive Technologies Institute, Startup Madeira and some other people have recently been organizing several of these events, promoting an entrepreneurial mindset and a culture of innovation. This might make the city's ecosystem more dynamic and help explore and open new opportunities, especially related to technology and the digital economy, in an island that is somewhat dependent on tourism.
As an example, in the last few weeks we've had an informal business networking event called "Afterwork", which has been happening with some regularity; we had several public talks in different fields given by experts at M-ITI; and we also had a "marketing for startups" workshop given by Felipe Ávila Da Costa from Infraspeak (previously UPTEC).
Of course this should be just the start of a long walk, but it is good to see some movement in this area. I see much potential in other areas, not tied to location or raw materials, to be developed and "shipped" to the rest of the world from Madeira; we just have to switch to a doing mindset and explore the existing global opportunities while living here.
And let's get real: who wouldn't prefer to work and live on a beautiful island with great weather all year round, instead of enduring crazy cold winters and burning hot summers?
Also, have I mentioned the great co-working spaces we have here? Perhaps I will cover them in a future post.
So certain parts of the world have already entered the year 2017 (this system doesn't sound that great, but I will leave that discussion for another occasion), and we here in Europe are making preparations to start the new year in a few hours.
I am not fond of those traditional new year resolutions that everyone makes; they always seem destined to fail. But yesterday I found a challenge on HackerNews that is very interesting and looks like a great push to be more productive during the whole year.
This post explains it a little better, but in brief, everyone who takes on the "1ppm Challenge" must build and ship a different project every month during the next year. For it to work out in that restricted time-frame, the projects must have a clear objective and focus on solving a well-defined problem. They also must cut all the clutter and be an MVP (minimum viable product), since we only have roughly 4 weeks for each one.
In the original challenge the projects can be anything, but I will restrict mine to software projects (at least 10 out of 12). My goal with this is to improve my skills in shipping new products and to focus on what matters most at a given moment. I'm realistic about the challenge, and having a 100% success rate will be hard, so at the end of the year I will evaluate my performance this simple way: number_of_finished_projects/12.
By number_of_finished_projects I mean every project that meets all the goals defined for it. Since I don't yet have 12 good ideas I really want to work on during the next year, the project for each month will be announced in a new post before the beginning of every month, and the challenge log will be updated.
So let's see what score I achieve at the end of the year. To get things started, here is the description of the project for the next month:
Audio and video capture monitor
Description: This project's aim is to let people know when their computer's camera and microphone are being used by any program. Every time a program starts to use one of these devices, the user gets an alert. For now it will be Linux only.
- Must be written using Rust
- Must detect when the camera or the microphone is active (being used)
- Must alert the user
- Provide a log for all activity (optional)
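The detection requirement can be prototyped before writing the real thing. On Linux, one possible approach is to scan /proc and look for processes holding an open file descriptor to the device node. Here is a rough sketch of that idea (in Python for brevity; the project itself will be in Rust, and the device path is just an example):

```python
import os
from pathlib import Path

def processes_using(device="/dev/video0"):
    """Return the PIDs of processes that currently have `device` open.

    Works by walking /proc/<pid>/fd and resolving each descriptor's
    symlink. Linux only; silently skips processes we cannot inspect.
    """
    pids = []
    for proc in Path("/proc").glob("[0-9]*"):
        try:
            for fd in (proc / "fd").iterdir():
                if os.readlink(fd) == device:
                    pids.append(int(proc.name))
                    break
        except (PermissionError, FileNotFoundError, OSError):
            continue  # process exited, or not ours to inspect
    return pids
```

Polling this function and alerting when the list goes from empty to non-empty would cover the first two requirements; a kernel-notification mechanism would be nicer, but this is enough to validate the concept.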
So, I’m looking forward to know how this challenge will play out. Tomorrow it is time to start. Hope for a better 2017 for you all.
Yesterday I watched the above documentary on the National Geographic Channel. It is a good piece of work and it calls attention to very pertinent issues that have been on the agenda for many years/decades. Yet we haven't been able to overcome the lobbies and established interests that maintain the status quo and keep their "money machines" running with disregard for future consequences. One thing we already know for sure is that there is no going back and we will pay the price. Now, the question that remains is: "what will the price be?".
You should watch it; I definitely recommend it. It reminded me of another great documentary called "Home" (you should watch it too), released in 2009 (damn, 7 years and we are still stuck), which is less focused on climate change and addresses mankind's impact on the planet, especially over the last 100 years.
I really hope that we can start seeing real progress soon.
A few weeks ago the first edition of Pixels Camp (aka Codebits 2.0) took place in Lisbon, an event that I try to attend whenever it happens (see previous posts about it). It is the biggest technology-focused event/conference in Portugal, with close to 1000 attendees.
This year the venue changed to LX Factory. Even though the place is really cool, it is not as well located as the previous venue, at least for people who don't live in Lisbon and arrive at the airport. The venue was well decorated and had a cool atmosphere, giving you the feeling that it was the place to be. However, this year there was less room for the teams working on the projects and not everybody was able to get a table/spot (it seemed to me that the venue was a little smaller than the previous one).
Of the dozens of great talks given on the event's 4 stages, many of which I was not able to see since I was competing in the 48h programming competition, below are two that I really liked:
Chrome Dev Tools Masterclass
IPFS, The Interplanetary Filesystem
If you are curious, you can find the rest on their YouTube channel.
All this is great, but the main activity of Pixels Camp is the 48h programming competition, and this year we had another great batch of cool projects being developed (60 in total, if I remember correctly).
As usual I entered the contest, this time with fellow Whitesmithians Rui and Pedro. We chose to develop a GPS-based game, you know, since it seemed to be a popular thing this summer and we thought the medium still has great potential for really entertaining stuff.
The idea was already a few years old but had never been implemented, and at its core it was quite simple. It took some ideas from the classic game "Pong" and adapted them to be played in a fun way while navigating through a real-world area.
We called it PonGO. Essentially, the users must agree on a playing field, such as a city block, a city or an even bigger area; then they connect their phones and the ball starts rolling. The players have to move around with their phones (which they use to see the map and track everyone's position), trying to catch the ball and throw it to the other side of the map. The player who does this the most times wins the game. Here is a sketch we did while discussing the project:
As you can see in the above image, which would be on the phone's screen, the player (in yellow) got close enough to the ball to play it; now he has to send it toward one of the opposite sides (marked in green). The other players (in blue) will have to run to catch the ball before it gets out. Spread across the map you can see some power-ups that give players special capabilities.
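At the heart of the "catch the ball" mechanic there is just a geodesic distance check between two GPS coordinates. The sketch below illustrates the idea with the haversine formula; it is not code from the actual game, and the function names and the 20-meter reach are made up for the example:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def can_play_ball(player, ball, reach_m=20):
    """True if the player's position is within `reach_m` meters of the ball."""
    return haversine_m(player[0], player[1], ball[0], ball[1]) <= reach_m
```

The game loop would run this check against each player's live position and, when it passes, let that player redirect the ball toward one of the opposite sides.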
That’s it, it might seem easy but doing it in less that 48h is not. We ended with a working version of the game but the power ups were not implemented due to time constrains. Here are some screenshots of the final result(we used the map view instead of the satellite view so it might look a little different):
At the end, it was a great event as usual and I would also like to congratulate some of my coworkers at Whitesmith that took home the 7th place in the competition. Next year I hope to be there again (and you should too).
Today I stumbled on this blog post about a poll for EU-FOSSA. I'm not familiar with all aspects of this pilot project; however, from the information I could gather, it seems to be a really great idea.
Most of us regularly use, to a certain degree, several pieces of free (as in freedom) software on a daily basis. Many of these projects are essential to ensuring the security of our communications, documents and work. European institutions and countries make use of these tools as well, so why not spend a little time and money to make sure they meet certain quality goals and are free of major bugs that could undermine the safety of their users?
This would also raise the public's trust in these tools, so they can become standards over their proprietary counterparts, which we are unable to review and modify according to our needs, raising many security questions.
One of its components is a sample review of one open-source project, and until the 8th of July you can give your opinion on which one. Go there, it only takes 1 minute, and it will help them understand that this is an important issue. Here is the link
A few weeks ago the "Decentralized Web Summit" took place in San Francisco. Even though there was a video stream available at the time, I wasn't able to watch it, but later I saw some excerpts of it. One of the talks that caught my attention was about a new thing called ZeroNet. It seemed to be some kind of network where the assets and contents of websites are fetched from your peers, while introducing clever mechanisms to give the owners control and allow the existence of user-generated content. It borrows concepts from both Bitcoin and BitTorrent, but for a better explanation, below is an introduction by the creator of this technology:
The presentation is very high level, so on the website I found some slides with more details about how it works, and I must say it is very interesting from a technical perspective; it even has an address naming system (".bit") if you don't want gibberish in the address bar.
Watching the video, things seemed to be working pretty well (for something being presented for the first time), so I decided to join the network and give it a try. For those using Docker it happens to be pretty easy, just run:
$ docker run -d -v <local_data_folder>:/root/data -p 15441:15441 -p 43110:43110 nofish/zeronet
then your node will be available at: http://127.0.0.1:43110/
After using it for 2 weekends, I have to say the level of polish of this project is amazing: all the pre-built apps work pretty well and are easy to use, the websites load super fast (at least compared with my expectations) and changes show up in real time. The most interesting aspect of all was the number of people trying and using it.
You may ask: what are the great advantages of using something like this? Based on what I've seen during these few days, there are 3 points/use cases where this network shines:
- Websites cannot be taken down; as long as there are peers serving them, they will be online.
- Zero infrastructure costs (or pretty close) to run a website there; you create and sign the content, and it gets delivered by the peers.
- Websites that you have visited remain available while you are offline.
So, to test this network further, I will run an experiment. During the next few weeks/months I will mirror this blog and make the new content available on ZeroNet, starting with this post. The address is:
Note: In this initial stage it might not be always available, since at the moment I’m the only peer serving it from my laptop.
To know more about it, check the repository on Github.
As I have already written before (twice), I'm a big supporter of an "ancient" and practically dead technology, at least as many like to call it, that can still be found on the Internet. It is RSS, a very useful standard that is one of the foundations for publishing content to the Web in an open way (the way it initially was supposed to be).
Today, following a recent blog post by Seth Godin about reading more blogs and teaching an easy way to get their new content, I want to get back to the subject and address a few thoughts I have on the matter. Without publicizing a single solution, I want to explore and extend a few points made in that article.
First, I want to start with a basic explanation of how it works, at least for the user. The basic idea is that content creators, be they professionals or hobbyists, alongside the content displayed on their website, also publish a structured file, not meant to be read by humans, with information about that content (and sometimes with parts of that content). The usefulness of these files is that other programs can watch them and get a linear view of what was published over time. This way, people who want to follow or consume that content can use cool little apps that track their favorite authors and let them know when there is new stuff, along with other features, such as keeping track of what you have already read.
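To make the idea concrete, here is a minimal sketch of what such a structured file looks like and how a feed reader consumes it, using only Python's standard library (the feed content itself is made up for the example):

```python
import xml.etree.ElementTree as ET

# A minimal, made-up RSS 2.0 document, as a website might publish it
# alongside its human-readable pages.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com</link>
    <item>
      <title>Hello RSS</title>
      <link>https://example.com/hello-rss</link>
      <pubDate>Mon, 05 Dec 2016 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

def list_entries(feed_xml):
    """Return (title, link) pairs for every item in the feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    return [(item.findtext("title"), item.findtext("link"))
            for item in channel.findall("item")]
```

A feed reader basically does this in a loop: re-fetch the file periodically, diff the list of items against what it saw last time, and flag anything new as unread.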
If you look for the icon shown above, you will find it on many websites; it is hugely used in many areas, from newspapers to blogs, from podcasts to iTunes, etc.
Overall it is used broadly and lets people do cool things with it. However (and this is where I start to converge on the topic of Seth's post), most big players have no interest in an open web, and slowly, over time, they have started dropping support for it and undermining its usefulness, because they want people to publish and consume content only inside their platforms, locking everyone's content into their services. Two examples are given, Google and Facebook, but I'm sure there are others.
There are many benefits of following your favorite authors using these feeds, such as:
- You control the content that you read. Let's face it, letting any middleman manipulate what kind of information you have access to is never a good thing, and it is not uncommon.
- You are not stuck with a single interface where the content gets lost over time. You can choose your app and organize the content your own way.
- Clear distinction between what you have already read and what you haven't.
- Even if the content gets taken down, you might have your own copy. There is no risk of a service going out of business and everybody losing all their content.
One thing I’ve been seeing more often, is many people writing such nice content but their blog or platform does not expose RSS feeds (like this one). This saddens me because I do not remember every good source, if I can’t add it to my feed reader, I will often forget to check for new content. It is relatively easy to add it, most systems support it, and if it is a custom one there are plenty of libraries available to help you with that.
I comprehend the need to make the person visit the site in order to monetize the content, in these cases there is always the option to only add to the feed the title and a small excerpt, the reader will follow the link to reach the remaining of the content.
Adding support for time zones to your website, so that users can work in their own timezone, is a "must" nowadays. So in this post I'm going to show you how to implement a simple version of it. Even though Django's documentation is very good and complete, the only example given shows how to store the timezone in the user's session after detecting (somehow) the user's timezone.
What if the user wants to store his timezone in his settings and have it used from then on, every time he visits the website? To solve this, I'm going to take the example given in the documentation and, together with the simple django-timezone-field package/app, implement this feature.
First we need to install the dependency:
$ pip install django-timezone-field==2.0rc1
Add to the INSTALLED_APPS of your project:
INSTALLED_APPS = [
    ...,
    'timezone_field',
    ...,
]
Then add a new field to the user model:
from django.contrib.auth.models import AbstractUser
from timezone_field import TimeZoneField

class User(AbstractUser):
    timezone = TimeZoneField(default='UTC')
Handle the migrations:
$ python manage.py makemigrations && python manage.py migrate
Now we will need to use this information, based on the Django’s documentation example we can add a middleware class, that will get this information on every request and set the desired timezone. It should look like this:
from django.utils import timezone

class TimezoneMiddleware(object):
    def process_request(self, request):
        if request.user.is_authenticated():
            # Render dates/times using the timezone saved on the user model
            timezone.activate(request.user.timezone)
        else:
            timezone.deactivate()
Add the new class to the project middleware:
MIDDLEWARE_CLASSES = [
    ...,
    'your.module.middleware.TimezoneMiddleware',
    ...,
]
Now it should be ready to use: all your forms will convert the received input (in that timezone) to UTC, and templates will convert from UTC to the user's timezone when rendered. For different conversions and more complex implementations, check the available methods.
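Conceptually, what the templates do after timezone.activate() is a plain timezone conversion of the UTC values stored in the database. A small sketch of that conversion with Python's standard zoneinfo module, independent of Django (the function name and the sample zone are just illustrative):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_user_timezone(utc_dt, tz_name):
    """Convert an aware UTC datetime to the user's timezone, mirroring
    what Django's template rendering does once a timezone is activated."""
    return utc_dt.astimezone(ZoneInfo(tz_name))

# A value as it sits in the database (always UTC)...
stored = datetime(2016, 6, 1, 12, 0, tzinfo=timezone.utc)
# ...shown to a user in Madeira (UTC+1 in June, due to summer time).
local = to_user_timezone(stored, "Atlantic/Madeira")
```

Forms go the other way: input is interpreted in the active timezone and normalized back to UTC before it is saved.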
One common problem for people trying to secure their email communications with PGP is that, more often than not, the other end doesn't know how to use these kinds of tools. I'll be honest: in the current state of things, the learning curve is too steep for the common user. This causes a great deal of trouble when you want to receive/send sensitive information in a secure manner.
I will give you an example: a software development team helping a customer build his web business or application may need to receive a wide variety of access keys for external services and APIs that are in the customer's possession and are required (or useful) for the project.
Let's assume the customer is not familiar with encryption tools. The probability of that sensitive material being shared in an insecure way is too high: he might send it through a clear-text email or post it in some shared document (or file). Both of the previous situations are red flags, either because the communication channel is not secure enough or because multiple copies of the information may end up in different places with doubtful security, all of them in clear text.
At our recent "Whitesmith Hackathon", one of the projects tried to address this issue. We thought of a more direct approach to this situation, based on the assumption that you will not be able to convince the customer to learn this kind of thing. We called it Hawkpost. Essentially, it's a website that makes use of OpenPGP.js, where you create unique links containing a form that the other person uses to submit any information; it is then encrypted in his browser with your public key (without the need to install any extra software) and forwarded to your email address.
You can test and use it at https://hawkpost.co, but the project is open-source, so you can change it and deploy it on your own server if you prefer. It's still in an early state at the moment, but we will continue improving the concept according to the feedback we receive. Check it out and tell us what you think.