Federated Tweets, or Toots

Recently there has been a big fuss about “Mastodon”, an open-source project that is very similar to Twitter. The biggest difference is that it is federated. So what does that mean?

It means that it works like email: there are several providers (called instances) where you can create an account (you can set up your own server if you wish), and accounts from different providers can communicate with each other, instead of all the information sitting in a single silo.

Of course, for someone who is in favor of an open web, this is a really important “feature”.

Another big plus is that the wheel wasn’t reinvented: this network is interoperable with the existing “GNU Social” providers (it uses the same protocol), so you can communicate and interact with people who have an account on an instance running that software. Think of it as two providers on the same network running different software packages (one in PHP, the other in Ruby) but speaking the same language over the network.

I haven’t tested it much yet, but it is a push for a decentralized solution (a rare thing nowadays) and I think it is a small step in the right direction. So I’ve set up an individual instance for myself, where I will regularly publish links to posts/articles/pages that I find interesting. Feel free to follow it at https://s.ovalerio.net, and if you know someone worth following on this network, let me know.

Here are a few links with more information:

List of instances where you can create an account

The wave of bloated desktop apps

Some time ago I complained about a mindset that was producing a form of “bloated” software (software that uses a disproportionate amount of resources to accomplish “simple” tasks). Later I even posted an example of a popular chat application consuming a ridiculous amount of memory, given its purpose.

One example of this phenomenon is the recent explosion of Electron desktop applications. Today the following post reached the front page of Hacker News, and it makes a good argument:

Electron is flash for the desktop

https://josephg.com/blog/electron-is-flash-for-the-desktop/

Making it in Madeira

Living outside of the big cities or major technology/financial hubs is sometimes synonymous with being left out of everything that’s going on and being far away from many opportunities for your own professional development. In those hubs the ecosystem is vibrant, with lots of events and meetups, either focused on networking or on sharing knowledge about a given topic.

Unfortunately, for those of us living far away from these centers, even though remote working is more prevalent nowadays, we are still not able to attend and be part of these things regularly.

That is why it is a great pleasure to see the recent growth of this kind of activity in Funchal. Institutions like the University of Madeira, the Madeira Interactive Technologies Institute, Startup Madeira and some other people have recently been organizing several of these events and promoting an entrepreneurial mindset and a culture of innovation. This might make the city’s ecosystem more dynamic and help open up new opportunities, especially related to technology and the digital economy, on an island that is somewhat dependent on tourism.

As an example, in the last few weeks we’ve had an informal business networking event called “Afterwork”, which has been happening with some regularity, several public talks given by experts at M-ITI in a range of fields, and a workshop on “marketing for startups” given by Felipe Ávila Da Costa from Infraspeak (previously UPTEC).

Of course this should be just the start of a long walk, but it is good to see some movement in this area. I see a lot of potential in other areas, not tied to location or raw materials, that could be developed and “shipped” to the rest of the world from Madeira; we just have to switch to a doing mindset and explore the existing global opportunities while living here.

Madeira
Photo by Tiago Aguiar (https://unsplash.com/@tiagoaguiar)

And let’s be real: who wouldn’t prefer to work and live on a beautiful island with great weather all year round, instead of enduring crazy cold winters and scorching hot summers?

Also, have I mentioned the great co-working spaces we have here? Perhaps I will do that in a future post.

Starting the “1ppm Challenge”

So, certain parts of the world have already entered the year 2017 (this system doesn’t sound that great, but I will leave that discussion for another occasion) and we, here in Europe, are making preparations to start the new year in a few hours.

I am not fond of the traditional New Year’s resolutions that everyone makes; they always seem destined to fail. But yesterday I found a challenge on Hacker News that is very interesting and looks like a great push to be more productive during the whole year.

This post explains it a little better, but in brief, everyone who takes on the “1ppm Challenge” must build and ship a different project every month during the next year. For it to work out in that restricted time frame, the projects must have a clear objective and focus on solving a well-defined problem. They also must cut all the clutter and be an MVP (minimum viable product), since we only have roughly 4 weeks for each one.

In the original challenge the projects can be anything, but I will restrict mine to software projects (at least 10 out of 12). My goal with this is to improve my skills in shipping new products and to focus on what matters most at a given moment. I’m realistic about the challenge and a 100% success rate will be hard, so at the end of the year I will evaluate my performance in this simple way: number_of_finished_projects/12.

By number_of_finished_projects I mean every project that meets all the goals defined for it. Since I don’t yet have 12 good ideas I really want to work on during the next year, each month’s project will be announced in a new post before the beginning of that month, and the challenge log will be updated.

So let’s see what score I achieve at the end of the year. To get things started, here is the description of next month’s project:


Audio and video capture monitor

Description: The aim of this project is to let people know when their computer’s camera and microphone are being used by any program. This way, every time a program starts using these devices, the user gets an alert. For now it will be Linux only (a rough sketch of one possible approach is shown after the goals below).

Goals:

  • Must be written using Rust
  • Must detect when the camera or the microphone is active (being used)
  • Must alert the user
  • Provide a log for all activity (optional)
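
To make the goals a bit more concrete, here is a rough, hypothetical sketch of one possible detection approach on Linux, and not a final design: it scans /proc/<pid>/fd and considers a device “in use” when some process holds a /dev/video* device or an ALSA capture device open. Everything here is illustrative; the real project may end up working very differently (hooking into PulseAudio or udev, for instance).

use std::fs;

// Hypothetical sketch only: returns true when some process on the system has a file
// open whose /proc/<pid>/fd symlink target satisfies `matches`. Inspecting other
// users' processes usually requires elevated privileges.
fn any_process_has_open<F: Fn(&str) -> bool>(matches: F) -> bool {
    let procs = match fs::read_dir("/proc") {
        Ok(p) => p,
        Err(_) => return false,
    };

    for entry in procs.flatten() {
        // Process entries in /proc are the purely numeric directories.
        let name = entry.file_name();
        let pid = match name.to_str() {
            Some(n) if n.chars().all(|c| c.is_ascii_digit()) => n,
            _ => continue,
        };

        let fds = match fs::read_dir(format!("/proc/{}/fd", pid)) {
            Ok(f) => f,
            Err(_) => continue,
        };

        for fd in fds.flatten() {
            if let Ok(target) = fs::read_link(fd.path()) {
                if matches(&target.to_string_lossy()) {
                    return true;
                }
            }
        }
    }
    false
}

fn main() {
    // Video4Linux camera devices.
    let camera = any_process_has_open(|p| p.starts_with("/dev/video"));
    // ALSA PCM capture devices end in "c" (e.g. /dev/snd/pcmC0D0c).
    let mic = any_process_has_open(|p| p.starts_with("/dev/snd/pcm") && p.ends_with('c'));

    // Placeholder "alert": the real tool would send a desktop notification and keep a log.
    if camera {
        println!("Alert: the camera appears to be in use");
    }
    if mic {
        println!("Alert: the microphone appears to be in use");
    }
}

A real version would of course run this check in a loop (or react to events) instead of once, but the core idea stays the same.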

So, I’m looking forward to seeing how this challenge will play out. Tomorrow it is time to start. I hope 2017 is a better year for you all.

Before the Flood

Yesterday I watched the above documentary on the National Geographic Channel. It is a good piece of work and it draws attention to very pertinent issues that have been on the agenda for many years, even decades. Yet we haven’t been able to overcome the lobbies and established interests that keep the status quo and their “money machines” running with disregard for future consequences. Something we already know for sure is that there is no going back and we will pay the price. Now, the question that remains is: “What will the price be?”

You should watch it; I definitely recommend it. It reminded me of another great documentary called “Home” (you should watch that one too), released in 2009 (damn, 7 years and we are still stuck), which is less focused on climate change and addresses mankind’s impact on the planet, especially over the last 100 years.

I really hope that we can start seeing real progress soon.

Pixels Camp 2016

A few weeks ago the first edition of Pixels Camp (aka Codebits 2.0) took place in Lisbon, an event that I try to attend whenever it happens (see my previous posts about it). It is the biggest technology-focused event/conference in Portugal, with close to 1000 attendees.

This year the venue changed to LX Factory. Even though the place is really cool, it is not as well located as the previous venue, at least for people who don’t live in Lisbon and arrive through the airport. The venue was well decorated and had a cool atmosphere, giving you the feeling that it was the place to be. However, this year there was less room for the teams working on their projects and not everybody was able to get a table/spot (it seemed to me that the venue was a little smaller than the previous one).

Of the dozens of great talks given on the event’s 4 stages, many of which I was not able to see since I was competing in the 48-hour programming competition, below are two that I really liked:

Chrome Dev Tools Masterclass

IPFS, The Interplanetary Filesystem

If you are curious, you can find the rest on their YouTube channel.

All this is great, but the main activity of Pixels Camp is the 48-hour programming competition, and this year we had another great batch of cool projects being developed (60 in total, if I remember correctly).

As usual I entered the contest, this time with fellow Whitesmithians Rui and Pedro. We chose to develop a GPS-based game, you know, since it seemed to be a popular thing this summer, and we thought the medium still has great potential for really entertaining stuff.

The idea was already a few years old but had never been implemented, and at its core it was quite simple. It took some ideas from the classic game “Pong” and adapted them to be played in a fun way while navigating through a real-world area.

We called it PonGO. Essentially, the players must agree on a playing field, such as a city block, a whole city or an even bigger area; then they connect their phones and the ball starts rolling. The players have to move around with their phones (which they use to see the map and track everyone’s position), trying to catch the ball and throw it to the other side of the map. The player who manages to do it the most times wins the game. Here is a sketch we did while discussing the project:

Initial Sketch

As you can see in the image above, which would be shown on the phone’s screen, the player (in yellow) got close enough to the ball to play it; now he has to send it toward one of the opposite sides (marked in green). The other players (in blue) will have to run to catch the ball before it gets out. Spread across the map you can see some power-ups that give players special capabilities.
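
At its core, this mechanic boils down to a simple proximity check between each player and the ball, computed from GPS coordinates. Below is a rough, hypothetical sketch of such a check using the haversine formula; it is not the actual hackathon code (that is linked further down) and the 15-meter “catch” radius is just an illustrative value.

const EARTH_RADIUS_M: f64 = 6_371_000.0;
const CATCH_RADIUS_M: f64 = 15.0; // illustrative value, not the one used in the game

struct Position {
    lat_deg: f64,
    lon_deg: f64,
}

// Great-circle distance between two GPS positions (haversine formula), in meters.
fn distance_m(a: &Position, b: &Position) -> f64 {
    let (lat1, lat2) = (a.lat_deg.to_radians(), b.lat_deg.to_radians());
    let dlat = (b.lat_deg - a.lat_deg).to_radians();
    let dlon = (b.lon_deg - a.lon_deg).to_radians();

    let h = (dlat / 2.0).sin().powi(2)
        + lat1.cos() * lat2.cos() * (dlon / 2.0).sin().powi(2);
    2.0 * EARTH_RADIUS_M * h.sqrt().asin()
}

fn can_play_ball(player: &Position, ball: &Position) -> bool {
    distance_m(player, ball) <= CATCH_RADIUS_M
}

fn main() {
    // Two nearby points in Funchal, purely for illustration.
    let ball = Position { lat_deg: 32.6497, lon_deg: -16.9084 };
    let player = Position { lat_deg: 32.6499, lon_deg: -16.9083 };
    println!(
        "distance: {:.1} m, can play: {}",
        distance_m(&player, &ball),
        can_play_ball(&player, &ball)
    );
}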

That’s it. It might seem easy, but doing it in less than 48 hours is not. We ended up with a working version of the game, but the power-ups were not implemented due to time constraints. Here are some screenshots of the final result (we used the map view instead of the satellite view, so it might look a little different):

In-game screenshots
In-game action

The code itself is a mess (it was a hackathon, what were you expecting?) and can be found here and here.

In the end, it was a great event as usual, and I would also like to congratulate some of my coworkers at Whitesmith, who took home 7th place in the competition. Next year I hope to be there again (and you should be too).

EU-Free and Open Source Software Auditing project

Today I stumbled upon this blog post about a poll for EU-FOSSA. I’m not familiar with all aspects of this pilot project; however, from the information I could gather, it seems to be a really great idea.

Most of us use, to a certain degree, several pieces of free (as in freedom) software on a daily basis. Many of these projects are essential to ensure the security of our communications, documents and work. European institutions and countries make use of these tools as well, so why not spend a little time and money to ensure they meet certain quality goals and are free of major bugs that could undermine the safety of their users?

This will also raise the public’s trust in these tools, so they can become standard over their proprietary counterparts, which we are unable to review and modify according to our needs, something that raises many security questions.

One of its components is a sample review of one open-source project, and until the 8th of July you can give your opinion on which one it should be. Go there, it only takes a minute, and it will help them understand that this is an important issue. Here is the link.

Test driving ZeroNet

A few weeks ago the “Decentralized Web Summit” took place in San Francisco. Even though there was a video stream available at the time, I wasn’t able to watch it, but later I saw some excerpts of it. One of the talks that caught my attention was about a new thing called ZeroNet. It seemed to be some kind of network where the assets and contents of websites are fetched from your peers, while introducing clever mechanisms to give the owners control and allow for user-generated content. It borrows concepts from both Bitcoin and BitTorrent, but for a better explanation, below is an introduction by the creator of this technology:

The presentation is very high level, so on the website I found some slides with more details about how it works, and I must say it is very interesting from a technical perspective. It even has an address naming system (“.bit”) if you don’t want gibberish in the address bar.

In the video things seemed to be working pretty well (for something being presented for the first time), so I decided to join the network and give it a try. For those using Docker it happens to be pretty easy, just run:

$ docker run -d -v <local_data_folder>:/root/data -p 15441:15441 -p 43110:43110 nofish/zeronet

Then your node will be available at: http://127.0.0.1:43110/

After using it for two weekends, I have to say the level of polish of this project is amazing: all the pre-built apps work pretty well and are easy to use, the websites load super fast (at least compared to my expectations) and changes show up in real time. The most interesting aspect of all was the number of people trying and using it.

You may ask: what are the great advantages of using something like this? Based on what I’ve seen during these few days, there are 3 points/use cases where this network shines:

  • Websites cannot be taken down: as long as there are peers serving a site, it will stay online.
  • Zero infrastructure costs (or pretty close) to run a website there: you create and sign the content, and it gets delivered by the peers.
  • Websites that you visit remain available while you are offline.

So, to test this network further, I will do an experiment: during the next few weeks/months I will mirror this blog and make the new content available on ZeroNet, starting with this post. The address is:

http://127.0.0.1:43110/1PLZ7PjfX91VSMmzU5revwswmrEkTz6Mpk

Note: In this initial stage it might not always be available, since at the moment I’m the only peer, serving it from my laptop.

To learn more about it, check the repository on GitHub.

Blogs, web feeds and the open web

As I have already written before (twice), I’m a big supporter of an “ancient” and practically dead technology, at least as many like to call it, that can still be found on the Internet. It is RSS, a very useful standard that is one of the foundations for publishing content to the Web in an open way (the way it was initially supposed to be).

Today, following a recent blog post by Seth Godin about reading more blogs and showing an easy way to get their new content, I want to get back to the subject and share a few thoughts I have on the matter. Without advocating a single solution, I want to explore and extend a few points made in that article.

First, I want to start with a basic explanation of how it works, at least for the user. The basic idea is that content creators, whether professionals or hobbyists, alongside the content displayed on their website, also publish a structured file that is not meant to be read by humans, with information about that content (and sometimes with parts of that content). The usefulness of these files is that other programs can watch them and get a linear view of what was published over time. This way, people who want to follow or consume that content can use cool little apps that track their favorite authors and let them know when there is new stuff, along with other features, such as keeping track of what you have already read.

RSS Logo

If you look for the icon shown above, you will find it on many websites; it is widely used in many areas, from newspapers to blogs, from podcasts to iTunes, etc.

Overall it is broadly used and lets people do cool things with it. However (and this is where I start to converge on the topic of Seth’s post), most big players don’t have an interest in an open web and slowly, over time, they started dropping support for it and undermining its usefulness, because they want people to publish and consume content only inside their platforms, locking everyone’s content into their services. Two examples are given, Google and Facebook, but I’m sure there are others.

There are many benefits to following your favorite authors using these feeds, such as:

  • You control the content that you read. Let’s face it, letting any middleman manipulate what kind of information you have access to is never a good thing, and it is not uncommon.
  • You are not stuck with a single interface where the content gets lost over time. You can choose your app and organize the content your own way.
  • A clear distinction between what you have already read and what you haven’t.
  • Even if the content gets taken down, you might have your own copy. There is no risk of a service going out of business and everybody losing all their content.

One thing I’ve been seeing more often is people writing really nice content whose blog or platform does not expose RSS feeds (like this one). This saddens me because I do not remember every good source; if I can’t add it to my feed reader, I will often forget to check for new content. It is relatively easy to add one: most systems support it, and if yours is a custom one there are plenty of libraries available to help you with that.

I understand the need to make people visit the site in order to monetize the content; in those cases there is always the option of adding only the title and a small excerpt to the feed, so the reader has to follow the link to reach the rest of the content.
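
To give an idea of how small such a structured file really is, here is a rough sketch that builds a minimal RSS 2.0 feed containing only a title, a link and a short excerpt per post, exactly the “excerpt only” approach mentioned above. Everything in it (names, URLs) is made up for illustration, and real code should escape XML entities or simply use one of the many existing feed libraries.

struct Post {
    title: String,
    link: String,
    excerpt: String,
}

// Builds a minimal RSS 2.0 document as a string. Illustrative only: no XML escaping,
// no publication dates, no GUIDs; a proper feed library handles all of that for you.
fn build_feed(site_title: &str, site_link: &str, posts: &[Post]) -> String {
    let items: String = posts
        .iter()
        .map(|p| {
            format!(
                "    <item>\n      <title>{}</title>\n      <link>{}</link>\n      <description>{}</description>\n    </item>\n",
                p.title, p.link, p.excerpt
            )
        })
        .collect();

    format!(
        "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<rss version=\"2.0\">\n  <channel>\n    <title>{}</title>\n    <link>{}</link>\n{}  </channel>\n</rss>\n",
        site_title, site_link, items
    )
}

fn main() {
    // Hypothetical blog and post, just to show the shape of the output.
    let posts = vec![Post {
        title: "Blogs, web feeds and the open web".to_string(),
        link: "https://blog.example.com/web-feeds".to_string(),
        excerpt: "Why RSS still matters for an open web...".to_string(),
    }];
    println!("{}", build_feed("Example Blog", "https://blog.example.com", &posts));
}

Pointing a feed reader at a file like this is all it takes for readers to be notified of new posts, while still having to visit the site for the full content.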