Categories: Personal

10 years

The first post I published on this blog is now 10 years old. This wasn’t my first website or even my first blog, but it’s the one that stuck for the longest time.

The initial goal was to have a place to share anything I might find interesting on the Web, a place that would allow me to publish my opinions on all kinds of issues (if I felt like it) and to publish information about my projects. I think you can still deduce that from the tagline, which has remained unchanged ever since.

From the start, hosting my own content was one of the priorities, so that I could control its distribution and ensure it remains universally accessible to anyone, without any locks on how and by whom it should be consumed.

The reasoning behind this decision was related to a trend that had started a couple of years earlier: the departure from the open web and the big migration to walled gardens.

Many people thought it was a harmless move, something that would improve the user experience and make life easier for everyone. But as with anything in life, over time we started to see the costs.

Today the world is different: using closed platforms that barely interact with each other is the rule, and the downsides have become evident. Users are spied on for profit, platforms decide what speech is acceptable, manipulation is more present than ever, big monopolies are now gatekeepers to many markets, etc. Summing up, information and power are concentrated in fewer hands.

Last week, this event set the topic for the post: a “simple chat app” that uses an open protocol to interact with different servers was unilaterally excluded/blocked from the market, without any chance to defend itself. A more extensive discussion can be found here.

The message I want to leave in this commemorative post is that we need to give decentralized and interoperable software another shot, using open protocols and technologies to put creators and users back in control.

If there is anything I would like to keep for the next 10 years, it is the ability to reach, interact and collaborate with the world without a huge corporation acting as a middleman and dictating its rules.

I will continue to put effort into making sure open standards are used on this website (such as RSS, Webmention, etc.) and that I’m reachable using decentralized protocols and tools (such as email, Matrix or the “Fediverse”). I think this is the minimum a person could ask for the next decade.

Categories: Software Development, Technology and Internet

Mirroring GitHub Repositories

Git by itself is a distributed version control system (a very popular one), but over the years organizations started to rely on certain internet services to manage their repositories, and those services eventually became the central/single source of truth for their code.

The most well known service out there is GitHub (now owned by Microsoft), which nowadays is synonymous with git for a huge number of people. Many other services exist, such as GitLab and Bitbucket, but GitHub gained notoriety above all others, especially for hosting small (and some large) open source projects.

These centralized services provide many more features that help with managing, testing and deploying software, functionality that is not directly related to the main purpose of git.

Relying on these central services is very useful, but as with everything in life, it is a trade-off. Many large open source organizations (such as KDE, GNOME, Debian, etc.) don’t rely on these companies, because the risks involved are not worth the convenience of letting these platforms host their code and other data.

Over time we have witnessed some of these risks materialize, such as a project (and all of its related data) being taken down without its owners having any chance to defend themselves (Example 1 and Example 2), very similar to what some content creators have been experiencing with YouTube (I really like this one).

When this happens, you or your organization don’t lose the code itself, since you almost certainly have copies on your own devices (thanks to git), but you lose everything else: issues, projects, automated actions, documentation and, essentially, the URL by which your project is known.

Since GitHub is just too convenient for collaborating with other people, we can’t simply leave. In this post I explain an easy approach to minimize the risks described above, which I implemented myself after reading many guides and tools made by others who have tried to address this problem before.

The main idea is to automatically mirror everything to a machine that I own and make it publicly available side by side with the GitHub URLs. The work will still be done on GitHub, but it can easily be switched over if something happens.

The software

To achieve the desired outcome I researched a few tools, and the one that seemed to fit all my requirements (works with git and is lightweight) was “Gitea”. Next I will describe the steps I took.

The Setup

This part was very simple: I just followed the instructions in the documentation for a Docker-based install. Something like this:

version: "3"

networks:
  gitea:
    external: false

services:
  server:
    image: gitea/gitea:latest
    container_name: gitea
    environment:
      - USER_UID=1000
      - USER_GID=1000
    restart: always
    networks:
      - gitea
    volumes:
      - ./gitea:/data                     # repositories, configuration and attachments live here
      - /etc/timezone:/etc/timezone:ro
      - /etc/localtime:/etc/localtime:ro
    ports:
      - "3000:3000"                       # web interface
      - "222:22"                          # SSH access to the repositories

If you are doing the same, don’t copy the snippet above. Take a look here for updated instructions.

Since my website isn’t expected to have much concurrent activity, using an SQLite database is more than enough. So after launching the container, I chose this database type and made sure I disabled all the functionality I won’t need.

Part of Gitea’s configuration page (Server and Third-Party Service Settings).
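These choices end up stored in Gitea’s app.ini file inside the data volume (with the compose file above it should sit at ./gitea/gitea/conf/app.ini on the host). As a reference, a minimal sketch of the database section, assuming the default paths used by the Docker image, looks like this:

[database]
DB_TYPE = sqlite3
PATH    = /data/gitea/gitea.db

Anything you later want to change without going through the web UI can be adjusted directly in this file.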

After this step, you should be logged in as an admin. The next step is to create a new migration from the top-right menu. We just need to choose the “GitHub” option and continue. You should see the screen below:

Creating a new GitHub migration/mirror in Gitea.

If you choose the “This repository will be a mirror” option, Gitea will keep your repository and wiki in sync with the original, but unfortunately it will not do the same for issues, labels, milestones and releases. So if you need that information, the best approach is to uncheck this field and do a normal migration. To keep that information updated, you will have to repeat this process periodically.

Once migrated, do the same for your other repositories.
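If you have more than a handful of repositories, the same can be done through Gitea’s migration API instead of the web form. The sketch below makes a few assumptions: the GitHub user name, Gitea URL and access token are placeholders for your own setup, the Gitea user owning the mirrors has ID 1 (usually the first admin account), pagination of the GitHub API is ignored, and field names may vary slightly between Gitea versions, so check the API documentation of your instance first.

#!/bin/sh
# Sketch: ask a local Gitea to mirror all public repositories of a GitHub user.
# GITHUB_USER, GITEA_URL and GITEA_TOKEN are placeholders, adjust to your setup.
GITHUB_USER="example-user"
GITEA_URL="http://localhost:3000"
# export GITEA_TOKEN=...  (an access token generated in your Gitea user settings)

curl -s "https://api.github.com/users/${GITHUB_USER}/repos?per_page=100" \
  | jq -r '.[] | .name + " " + .clone_url' \
  | while read -r name url; do
      # "mirror": true creates a mirror; set it to false for the full migration
      # described above (which also brings issues, labels, milestones and releases)
      curl -s -X POST "${GITEA_URL}/api/v1/repos/migrate" \
        -H "Authorization: token ${GITEA_TOKEN}" \
        -H "Content-Type: application/json" \
        -d "{\"clone_addr\": \"${url}\", \"repo_name\": \"${name}\", \"uid\": 1, \"mirror\": true}"
      echo "requested mirror for ${name}"
    done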

Conclusion

Having an alternative with a backup of the general GitHub data ended up being quite easy to set up. However, the mirror feature would be much more valuable if it included the other items available in the standard migration.

During my research for solutions I found Fossil, which looks very interesting and is something I would like to explore in the future, but at the moment all my repositories are based on git and, for practical reasons, that won’t change for the time being.

With this change, my public repositories can be found at:

Categories: Technology and Internet

Decentralized alternatives on the Internet

This post follows a recent discussion about decentralization at the first Madeira Tech Meetup of 2018. I have already touched on these subjects here on the blog a couple of times (for example, the one about ZeroNet and the one about Mastodon), but this time I will try to summarize the key points of the issue and give a list of awesome projects and applications that will help you reduce your dependency on several centralized services.

Nowadays most of the popular web services and applications that “we” use are centralized, meaning that the application and its data exist only on the machines of a single entity. As soon as this was considered normal, it led to another phenomenon: these “apps” close down any possible interaction that is not under their control, in order to keep the user locked in. A good example is social media and their chat applications, since closing the doors to open protocols and any form of interoperability allows the network effect to keep the users inside (why can’t someone with an account on a certain Google chat application talk with someone who uses Facebook Messenger? You know, like email works).

The Internet infrastructure at its core is decentralized; it was initially designed this way in order to be robust and continue working when there is a problem with any member of the network. For many years the applications built on top of it followed some of these principles and were happy to have a certain level of interoperability with other software, because it provided value. From the mid 2000s to this day things have changed, and many services and applications are as closed as they can be (the so-called walled gardens), because there are economic incentives to behave this way, holding the users and their data “hostage”.

Fortunately, the new rules of the European Union (the GDPR) will start being implemented and enforced in the next couple of months. Besides giving users back control of their data, they require that users be able to export it in a common format, which will allow them to move to competitors more easily.

However, this new right still threatens some comfortable walled-garden models. For example, social networks, exercise-tracking apps, and photo services will have to allow users to export their posts, rides, and photos in a common format. Smart competitors will build upload tools that recognize these formats; GDPR might therefore help to bridge the strategic moats of incumbents.

Source: Techies Guide to GDPR

But it doesn’t solve all the centralization problems. A better (and complementary) approach is to use applications and services that rely on open protocols and were built with interoperability in mind. If they are open source, if you can self-host them, or if they are completely distributed, even better.

So to give you a head start, here is a list of applications that address common uses of the Internet, are open source and decentralized, and can easily replace popular software you use nowadays:

  • Matrix – an open chat protocol and a collection of ready-to-use reference implementations that can serve as an alternative to any chat application; you can use an existing server or launch your own.
  • Mastodon – a Twitter alternative. Together with GNU Social and other software, it forms an open network where, regardless of which service or software you use, you are able to interact with everyone. You can also host your own server/instance.
  • IPFS – a content-addressing protocol (with a reference implementation) that tries to make the web more distributed and less reliant on central servers. Very useful for publishing public data from your own computer.
  • ZeroNet – an alternative distributed network where websites are served by the peers in the network rather than by central servers. One of the nice things here is that you can easily clone and publish websites using standard templates.
  • PeerTube – the name gives you a hint: it is a YouTube replacement that doesn’t depend on any central server. It is federated, and video is distributed to the users via “WebTorrent”.
  • Syncthing – a file synchronization application (an alternative to Dropbox) that keeps your data up to date across all your devices and doesn’t rely on central servers.
  • ownCloud/Nextcloud – also an alternative to Dropbox and similar services, where the user can self-host a server or use one available on the Internet. These servers are federated, which means you can share and work on the same folders even with users from another server or service.

There are many others like these available. I just highlighted the ones above because I think they are ready/stable and can “fight” the existing centralized services.

Using decentralized applications makes the Internet more democratic, more resistant to censorship and more resilient; it makes life harder for authoritarian entities and for others that try to centrally manipulate public perception (users are not automatically added to the famous “filter bubbles”); and it reduces the probability of massive companies maintaining monopolies in certain markets.

At first it might take a little bit of work, but the benefits are there and will be reaped later.

Categories: Technology and Internet

Test driving ZeroNet

A few weeks ago the “Decentralized Web Summit” took place in San Francisco. Even though there was a video stream available at the time, I wasn’t able to watch it, but later I saw some excerpts of it. One of the talks that caught my attention was about a new thing called ZeroNet. It seemed to be some kind of network where the assets and the contents of websites are fetched from your peers, while introducing clever mechanisms to give the owners control and to allow the existence of user-generated content. It borrows concepts from both Bitcoin and BitTorrent, but for a better explanation, below is an introduction by the creator of this technology:

The presentation is very high level, so on the website I found some slides with more details about how it works, and I must say it is very interesting from a technical perspective. It even has an address naming system (“.bit”) if you don’t want to have some gibberish in the address bar.

Watching the video, things seemed to be working pretty well (for something that was being presented for the first time), so I decided to join the network and give it a try. For those using Docker it happens to be pretty easy, just run:

$ docker run -d -v <local_data_folder>:/root/data -p 15441:15441 -p 43110:43110 nofish/zeronet

then your node will be available at: http://127.0.0.1:43110/

After using it for 2 weekends, I have to say the level of polish of this project is amazing: all the pre-built apps work pretty well and are easy to use, the websites load super fast (at least compared to my expectations) and changes show up in real time. The most interesting aspect of all was the number of people trying and using it.

You may ask: what are the great advantages of using something like this? Based on what I’ve seen during these few days, there are 3 points/use cases where this network shines:

  • Websites cannot be taken down; as long as there are peers serving a site, it will stay online.
  • Zero infrastructure costs (or pretty close) to run a website there: you create and sign the content, and it gets delivered by the peers.
  • Websites that you visit remain available while you are offline.

So, to test this network further, I will do an experiment: during the next few weeks/months I will mirror this blog and make the new content available on ZeroNet, starting with this post. The address is:

http://127.0.0.1:43110/1PLZ7PjfX91VSMmzU5revwswmrEkTz6Mpk

Note: At this initial stage it might not always be available, since at the moment I’m the only peer serving it, from my laptop.

To learn more about it, check the repository on GitHub.