Category: Personal

Posts about my own stuff and things I am involved in.

  • The books I enjoyed the most in 2024

    Another year went by, and another batch of books was consumed. Just like I did last year, I want to share the ones that I enjoyed the most.

    But what kind of metric is that? Truth be told, it is not an objective one. Last year, I clearly described it like this:

    I don’t mean they are masterpieces or references in a given field, what I mean is that I truly enjoyed the experience. It could be because of the subject, the kind of book, the writing style or for any other reason.

    What matters is that I was able to appreciate the time I spent reading them.

    And I still think it is what truly matters.

    So this year I will do the same: two more books that were entirely worth the money.

    “Broken Money”, by Lyn Alden

    This is a book about money (surprise, surprise). Not in the usual sense of telling the reader how to earn it or how to spend it. The focus is instead on what money is, what forms of it existed throughout history, how they were used and how each of those forms failed to fulfil its purpose at a given time.

    As the book progresses, it introduces the reader to important financial concepts, practices, and institutions that were born to fulfil certain needs, or to accomplish a desired outcome. It discusses their purposes and their problems.

    When describing the current state of affairs, the author focuses on how the existing financial system doesn’t serve all people equally. Example after example, we can see how some benefit from it, while others are harmed by it, over and over again.

    The book ends by taking a look at the internet age and exploring “alternatives” that are surfacing on the horizon.

    It had a real impact on how I see money and the financial system.

    “Masters of Doom”, by David Kushner

    Another great book that was a joy to read was “Masters of Doom”, and I guess that every kid from the 90s who touched a PC back then will know at least one of the games mentioned there.

    It tells the story of the people behind “id Software” and their journey throughout most of the decade, while they developed and released games such as Commander Keen, Wolfenstein 3D, Doom, and Quake.

    As a kid, I remember playing and enjoying some of those games: many hours of fun and excitement. I was too young to know or follow the stories and dramas of the game development industry, but I definitely hold great memories of the outcome.

    In the book you will find how they met, the ups, the downs, the drama, etc. You know, the whole rollercoaster that any new and successful company eventually goes through.

    While many other people were involved in making those games, and eventually in making the company prosper, the two main characters in this story are John Carmack and John Romero. Given their very distinct personalities, it is remarkable how far they were able to take this endeavor together.

    If you lived during that time, I guess you will enjoy the book.

  • Hawkpost enters “maintenance only” mode

    In practice, this already happened a couple of years ago; now we are just making it official.

    For those who don’t know, Hawkpost is a side project that I started while at Whitesmith back in 2016 (8+ years ago). I’ve written about it here on the blog on several occasions.

    To sum it up, it is a tool made to solve a problem that, at the time, I frequently saw in the wild while doing the typical agency/studio work: clients and most people shared credentials and other secrets for their projects through insecure means (in plain text on chats, emails, etc.). It bothered me to the point of trying to figure out a solution that was both easy to use for me and my coworkers, and obvious/transparent to people who simply don’t care about it.

    Awareness about encryption at the time, while making rapid progress, was not as widespread as it is today. Tools were not as easy to use.

    Hawkpost ended up being very useful for many people. It didn’t have to be perfect, it just needed to improve the existing state of affairs, which it did.

    Eight years later, things have changed. I no longer do agency work, I’ve changed workplaces, awareness has improved a lot, and many other tools have appeared in the market. Hawkpost’s development has stalled, and while it still has its users, we haven’t seen much overall interest in keeping the work going.

    To be honest, I don’t use it anymore. That’s because I have other tools at my disposal that are much better for the specific use-cases they address, and perhaps also better for Hawkpost’s original purpose.

    Here are some examples:

    • For sharing credentials within a non-technical team (if you really must): Use a proper team password manager such as Bitwarden or 1Password.
    • For sharing files and other sizable data: one good alternative is to use send (the successor of Firefox Send). It also has an official CLI client.
    • For sharing and working on encrypted documents: CryptPad has a whole range of applications where data is encrypted E2E.

    So, this week, we released version 1.4.0 of Hawkpost. It fixes some bugs, updates major dependencies and makes sure the project is in good shape to continue receiving small updates. The full list of changes can be found here.

    However, new features or other big improvements won’t be merged from now on (at least for the foreseeable future). The project is in “maintenance only” mode. Security issues, and anything that could make the project unusable, will be handled, but nothing else.

  • New Nostr and Lightning Addresses

    Bitcoin Atlantis is just around the corner. This conference, happening here in Madeira, is something unusual for us locals. The common pattern is that we have to fly to attend such conferences.

    I plan to attend the event, and I will be there with an open mindset, since there are always new things to learn. I’m particularly interested in the new technologies and advancements powering the existing ecosystem.

    So this post is an early preparation for the conference. I noticed that the community is increasingly gathering around a protocol called Nostr. I also noticed that a great deal of cool new applications are being built on top of the Lightning Network.

    To blend in, actively participate and make the most of my time there, I just set up my Nostr presence with a lightning address pointing to a Lightning Network wallet.

    This setup allows me to interact with others using a Twitter/X-like interface. Instead of likes and favorites, people can express their support by sending small bitcoin tips (zaps) directly to the authors by clicking/tapping a button.

    Before describing what I did, I will give a quick introduction to Nostr and the Lightning Network, to give the reader some context.

    Lightning Network

    Built on top of the Bitcoin blockchain, the Lightning Network allows users to send and receive transactions rapidly (a couple of seconds) and with minimal fees (less than a cent). These records are kept off-chain and are only settled later, when the users decide to make the outcome permanent.

    It works very well and led to the development of many websites, apps, and services with use cases that weren’t previously possible or viable. One example is quickly paying a few cents to read an article or paying to watch a stream by the second without any sort of credit.

    One of the shortcomings is that for every payment to happen, the receiving party needs to generate and share an invoice first. For certain scenarios, this flow is not ideal.

    Recently, a new protocol called LNURL was developed to allow users to send transactions without needing a pre-generated invoice first. The only thing that is needed is this “URL”, which can work like a traditional address.

    On top of this new protocol, users can create Lightning Addresses, which look and work just like e-mail addresses. So, you can send some BTC to joe@example.com, which is much friendlier than standard bitcoin addresses.
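
    Under the hood, the mapping defined by the specification is straightforward: the wallet splits the address at the “@” and requests a well-known URL on that domain (the same path you will see in my worker below). A minimal sketch of that translation in JavaScript, using a made-up address:

    // Translate a lightning address into the LNURL-pay endpoint that
    // wallets query before sending a payment.
    function lightningAddressToUrl(address) {
      const [name, domain] = address.split("@");
      return `https://${domain}/.well-known/lnurlp/${name}`;
    }

    // lightningAddressToUrl("joe@example.com")
    // => "https://example.com/.well-known/lnurlp/joe"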

    Nostr

    Twitter is centralized, Mastodon is not, but it is federated, which means you still need an account on a given server. Nostr, on the other hand, is an entirely different approach to the social media landscape.

    Just like in Bitcoin, users generate a pair of keys: one is the identifier and the other allows the user to prove their identity.

    Instead of having an account where the user stores and publishes their messages, on Nostr the user creates a message, signs it and publishes it to one or more servers (called relays).

    Readers can then look for messages from a given identifier on one or more servers.
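
    To make this flow more concrete, here is a small sketch of generating keys and signing a note, assuming the API of the nostr-tools JavaScript library (helper names and relay handling may differ between versions):

    import { generateSecretKey, getPublicKey, finalizeEvent } from "nostr-tools/pure";

    // The secret key signs messages; the public key is the identifier
    // that readers use to find them on relays.
    const sk = generateSecretKey();
    const pk = getPublicKey(sk);

    // A "kind 1" event is a short text note. Signing binds the content
    // to the author's key, so any relay can serve it without being trusted.
    const note = finalizeEvent({
      kind: 1,
      created_at: Math.floor(Date.now() / 1000),
      tags: [],
      content: "hello nostr",
    }, sk);

    // Publishing is just sending `note` to one or more relays
    // over a websocket connection.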

    One of the characteristics of this approach is that it is censorship resistant, and the individual is not at the mercy of the owner of a given service.

    The relationship between Nostr and Bitcoin’s Lightning Network, which I described above, is that the Bitcoin community was one of the first to adopt Nostr. Furthermore, many Nostr apps have already integrated support for tipping using the Lightning Network, more specifically using lightning addresses.

    My new contacts

    With that long introduction done, this post serves as an announcement that I will start sharing my content on Nostr as well.

    For now, I will just share the same things that I already do on Mastodon/Fediverse. But as we get closer to the conference, I plan to join a few discussions about Bitcoin and share more material about that specific subject.

    You can follow me using:

    npub1c86s34sfthe0yx4dp2sevkz2njm5lqz0arscrkhjqhkdexn5kuqqtlvmv9

    View on Primal

    To send and receive some tips/zaps, I had to set up a lightning address. Unfortunately, setting one up from scratch in a self-custodial way still requires some effort.

    The app that I currently use can provide one, but they take custody of the funds (which, in my humble opinion, goes a bit against the whole point). This address uses their domain, something like raremouse1@primal.net. It doesn’t look so good, especially if I want to switch providers in the future or have self-custody of my wallet.

    Looking at the Lightning Address specification, I noticed that it is possible to create some kind of “alias”, which allows us to easily switch later.

    To test, I just quickly wrote a Cloudflare worker that essentially looks like the following:

    // LNURL-pay endpoint of the custodial wallet provider.
    const url = 'https://primal.net/.well-known/lnurlp/raremouse1'
    
    export default {
      async fetch(request, env, ctx) {
        async function MethodNotAllowed(request) {
          return new Response(
            `Method ${request.method} not allowed.`,
            {
              status: 405,
              headers: {
                Allow: "GET",
              },
            });
        }
        // Only GET requests work with this proxy.
        if (request.method !== "GET")
          return MethodNotAllowed(request);
    
        // Fetch the original LNURL-pay response from the provider.
        let response = await fetch(url);
        let content = await response.json();
    
        // Swap the provider's address for the alias in the LNURL
        // metadata, so wallets querying the new address get a match.
        content.metadata = content.metadata.replace(
          "raremouse1@primal.net",
          "gon@ovalerio.net"
        );
    
        // Return the modified payload, reusing the status and
        // headers of the original response.
        return new Response(
          JSON.stringify(content),
          response
        )
      },
    };

    Then I just had to let it handle the https://ovalerio.net/.well-known/lnurlp/gon route.
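
    For reference, if the Worker is deployed with Wrangler, the route can be declared in wrangler.toml along these lines (the worker name and entry point here are illustrative):

    name = "lnurlp-alias"
    main = "src/index.js"
    compatibility_date = "2024-01-01"

    routes = [
      { pattern = "ovalerio.net/.well-known/lnurlp/gon", zone_name = "ovalerio.net" }
    ]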

    The wallet is still the one provided by Primal. However, I can now share with everyone that my lightning address is the same as my email: gon at ovalerio dot net. When I move to my own infrastructure, the address will remain the same.

    And this is it. Feel free to use the above script to set up your own lightning address and to follow me on Nostr. If any of this is useful to you, you can buy me a beer by sending a few sats to my new lightning address.

  • My setup to keep up with podcasts

    To be honest, I have a strong preference for written content. There is something about audio and video (podcasts and streams) that doesn’t fit very well with me or how I consume content when I’m at the computer.

    Nevertheless, there is a lot of great content that is only available through podcasts. So sometimes I need to tune in.

    Given that I like to go for a run once in a while, listening to a podcast is a great way to learn something while training. For this reason, I decided to be as organized with podcasts as I am with the written content that I consume.

    One great thing about the podcast ecosystem is that it is one of the few that haven’t totally succumbed to the pressures of centralization. Sure, there have been many attempts, but you can still follow and listen to most of your favorite producers without being tied to an app that belongs to an internet giant.

    You can use open-source software on your computer or phone to manage sources and listen to episodes, and then rely on RSS feeds to find out when new content is available. It works pretty well.

    After some trial and error (there are tons of apps for all platforms), today I’m sharing my current setup, one I’m pleased with.

    • For my phone and tablet: AntennaPod. It is open-source, lets you download the episodes to listen offline, and doesn’t require any account.
    • For my computers: Kasts. Almost the same as AntennaPod, but for standard Linux distros.

    Syncing

    The apps work great when used standalone. However, with multiple devices, I need the subscriptions to be the same, to know which episodes I already listened to, and to keep track of the status of the episode I’m currently listening to, so I can jump from one device to another.

    If I can’t have everything in sync, it becomes a total mess. Fortunately, this can be done using another piece of open-source software that is self-hostable.

    To keep all the devices on the same page, I use GPodder Sync on an existing Nextcloud instance. Both AntennaPod and Kasts have built-in support for it and work “flawlessly” together.
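
    I never had to touch the sync API directly, since both apps speak it natively, but it is plain HTTP. Here is a hedged sketch of checking the stored subscriptions; the endpoint path is my assumption based on the nextcloud-gpodder project, so verify it against your instance, and the credentials are placeholders:

    // Query the subscriptions stored by the Nextcloud GPodder Sync app
    // (path assumed from the nextcloud-gpodder project; verify it).
    // Nextcloud app passwords work with plain Basic auth.
    const base = "https://cloud.example.com"; // placeholder instance
    const auth = "Basic " + btoa("user:app-password");

    const res = await fetch(`${base}/index.php/apps/gpoddersync/subscriptions`, {
      headers: { Authorization: auth },
    });

    // Roughly: lists of feed URLs added and removed since a timestamp.
    console.log(await res.json());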

    This is it. If you think there is a better way to do this in an “open” way, please let me know, I’m always open to further enhancements.

  • The books I enjoyed the most in 2023

    We’ve reached the end of another year, and generally, this is a good time to look back, to evaluate what was done and what wasn’t, and eventually to plan ahead.

    While dedicating some time to the first task, it occurred to me that I should share some of this stuff. I doubt it will be useful to the readers of this blog (but you never know); however, it could be useful to me as notes that I leave behind, so I can come back to them later.

    Today I decided to write a bit about the books I’ve read and enjoyed the most during 2023. I don’t mean they are masterpieces or references in a given field, what I mean is that I truly enjoyed the experience. It could be because of the subject, the kind of book, the writing style or for any other reason.

    What matters is that I was able to appreciate the time I spent reading them.

    “The Founders”, by Jimmy Soni

    When the events described in the book happened, I wasn’t old enough to follow, comprehend and care about them. I certainly was already playing with computers, but this evolving thing called the Internet, and more precisely the web, was more like a playground to me.

    Nevertheless, the late 90s and early 00s were a remarkable period and many outstanding innovations were being developed and fighting their way through at that time.

    Internet payments were one of those things, and the story of how PayPal came to be is not only turbulent, as for any startup, but also fascinating. Its survival, in the end, is the result of immense talent, hard work, troublesome politics and luck.

    The team not only had to face hard technical challenges but also fight on multiple other fronts, including with themselves.

    One quote from the introduction tells a lot about how some arguments against change are timeless:

    … and sites like PayPal were often thought to be portals for illicit activity like money laundering or the sale of drugs and weapons. On the eve of its IPO, a prominent trade publication declared that the country needed PayPal “as much as it does an anthrax epidemic”.

    From the introduction of the book

    Nevertheless, they changed the way we do things and helped unleash a new era. An era that people like me, who always worked online and for businesses that depend on the Internet, were able to be part of due to many of these achievements. Not to mention the unusual number of individuals who were part of PayPal and would later start other world-changing businesses.

    “On Writing”, by Stephen King

    The second book is related to an entirely different field, but shares many similarities with the first. It tells us the story behind the name that authored so many well-known books over several decades.

    It is a book not only about the person and the challenges faced throughout his life, but also about how his path shaped his writing and “taste”.

    When I started the book, I was expecting something entirely different: more of a collection of rules and tips on how to write clearly. In the end, the book was much more than that. Of course, the important teachings are there, but it also provides you with a great story and context.

    I can’t say I learned all the tips the book has to offer, or that I’ve put them into practice, but the experience remains.

    As with all crafts, it requires time, dedication, and much more.

  • You can now follow this blog on the fediverse

    The possibilities of the ActivityPub protocol, and what it can bring to the table regarding interoperability in the social media landscape, are immense. It is especially welcome after a decade (and a half?) plagued by the dominance of centralized walled gardens that almost eradicated the diverse ecosystem that previously existed.

    It is used by many software packages, such as Mastodon, Peertube, Lemmy and others. For all of them, you can run your own instance if you don’t want to sign up for the existing providers (just like email).

    By speaking the ActivityPub “language”, these services can communicate with each other, forming a network that we call the Fediverse.

    Recently, WordPress joined the party by adding support through a plugin. Since this blog runs on WordPress, its content is now available to be followed on any of those networks.

    Together with the existing RSS feeds, which you can add to your newsreader, you now have another way of getting the latest content without having to directly visit the website.

    To accomplish that, you just have to search for content@blog.ovalerio.net on any instance in the Fediverse.
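
    What happens behind that search is a WebFinger lookup on the blog’s domain, which returns the ActivityPub actor for the account. A small sketch of what an instance does:

    // Resolve a Fediverse handle to its ActivityPub actor URL.
    const handle = "content@blog.ovalerio.net";
    const domain = handle.split("@")[1];

    const res = await fetch(
      `https://${domain}/.well-known/webfinger?resource=acct:${handle}`
    );
    const jrd = await res.json();

    // The "self" link points at the ActivityPub actor document.
    const actor = jrd.links.find(
      (l) => l.rel === "self" && l.type === "application/activity+json"
    );
    console.log(actor.href);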

    Keep in mind that my “real” account on that network is still dethos@s.ovalerio.net, where I share, much more frequently, short form content, such as links, that I find interesting.

  • Cleaning my follow list using “jacanaoesta”

    Last year we saw the rise of the Fediverse, mostly because of a series of external events that ended up pushing many people to try alternatives to their centralized platform of choice.

    Mastodon was clearly the software component that got the most attention and has been under the spotlight in the last few months. It wasn’t launched last year; in fact, Mastodon instances (servers) have been online since 2016, managed by its developers and other enthusiasts.

    I’ve been running my own instance since 2017, and since then, I’ve seen people come and go. I started following many of them, but some are no longer active. This brings us to the real topic of this post.

    Since I couldn’t find a place in the Mastodon interface that would allow me to check which users I follow are inactive, I decided to build a small tool for that. It also served as a nice exercise to put my Rust skills into practice (a language that I’m trying to slowly learn during my spare time).

    The user just needs to specify the instance and API key, plus the number of days for an account to be considered inactive if the default (180 days) is not reasonable. Then the tool will print all the accounts you follow that fit that criterion.

    Find people that no longer are active in your Mastodon follow list.
    
    Usage: jacanaoesta [OPTIONS] <instance>
    
    Arguments:
      <instance>
    
    Options:
      -k, --api-key      Ask for API key
      -d, --days <days>  Days since last status to consider inactive [default: 180]
      -h, --help         Print help information
      -V, --version      Print version information

    And this is an example of the expected output:

    Paste API Key here:
    Found 171 users. Checking...
    veracrypt (https://mastodon.social/@veracrypt) seems to be inactive
    ...
    fsf (https://status.fsf.org/fsf) seems to be inactive
    38 of them seem to be inactive for at least 180 days

    Without the -k option, the program tries to grab the API key from the environment variables instead of asking the user for it.
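
    The tool itself is written in Rust, but the core idea fits in a few lines. Here is a rough JavaScript sketch of the same logic against the Mastodon REST API, relying on the last_status_at field of account objects (pagination omitted; the instance URL and token variable name are placeholders):

    const instance = "https://example.social";
    const headers = { Authorization: `Bearer ${process.env.API_KEY}` };
    const cutoff = Date.now() - 180 * 24 * 60 * 60 * 1000;

    // Find our own account id, then the accounts we follow
    // (first page only; real code must follow the Link header).
    const me = await (await fetch(
      `${instance}/api/v1/accounts/verify_credentials`, { headers }
    )).json();
    const following = await (await fetch(
      `${instance}/api/v1/accounts/${me.id}/following`, { headers }
    )).json();

    for (const account of following) {
      // `last_status_at` is a date string, or null for accounts
      // that never posted.
      if (!account.last_status_at || Date.parse(account.last_status_at) < cutoff) {
        console.log(`${account.acct} seems to be inactive`);
      }
    }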

    Problem solved. If you want or need to give it a try, the code and binaries can be found here: https://github.com/dethos/jacanaoesta

    Note: After publishing this tool, someone brought to my attention that Mastodon does indeed have similar functionality in its interface. The difference is that it only considers accounts that haven’t published a status for 1 month as inactive (it’s not configurable).

    You can find it in “Preferences → Follows and Followers → Account Activity → Dormant”

    Screenshot of where to find the "dormant" functionality.
  • Shutting Down Webhook-logger

    A few years ago I built a small application to test Django’s websocket support through django-channels. It basically displayed, on a web page and in real time, all the requests made to a given endpoint (you could generate multiple of them) without storing anything. It was fun and very useful to quickly debug stuff, so I kept it running since that time.

    If you are interested in more details about the project itself, you can find a complete overview here.

    However, today Heroku, the platform where it was running, announced the end of the free tier. This tier has been a godsend for personal projects and experiments over the last decade, and Heroku as a platform initially set the bar really high regarding the developer experience of deploying those projects.

    “Webhook-logger” was the only live project I had running on Heroku’s free tier, and after some consideration I reached the conclusion it was time to turn it off. Its functionality is not unique and there are better options for this use case, so it is not worth the work required to move it to a new hosting provider.

    The code is still available in case anyone still wants to take a look or deploy it on their own.

  • worker-planet was awarded a swag box

    If you remember, back in June/July I worked on a small project to make it easy to build small community pages that aggregate content produced by many sources. As I shared in the post, worker-planet was built to run on “Cloudflare Workers”, without the need to manage a server yourself.

    A short time afterwards, I noticed that Cloudflare was running a challenge for developers to build on top of their tools/services. So I submitted worker-planet, since it already complied with all the requirements.

    Email received, informing me that one of the swag boxes was awarded.

    Today I found out that it was awarded one of the 300 prizes.

    That’s cool, at least I’m not the only one who finds it useful. Perhaps I should fix some of the existing issues and improve the project (like better instructions and more themes?).

    If you haven’t tried it, please do and let me know what you think. If you are better than me at building custom themes (the bar is very low), feel free to contribute one, I would appreciate it.

    I will try to release a new version early in 2022.

  • Tools I’m thankful for

    In the spirit of Thanksgiving, even though it isn’t a tradition here where I live, and following the same path as some posts I’ve read today, here are 5 software tools I’m thankful for.

    (Of course this is not a comprehensive list, but today these are the ones that come to my mind)

    Syncthing

    This tool basically lets us sync files across multiple devices without relying on a central service/server (unlike Dropbox or Google Drive). So we don’t have to rely on a 3rd party service to sync our documents; it is all done in a p2p fashion with high security standards.

    No Internet connection? No worries, it works through the local network as well.

    BorgBackup

    I use Borg for most kinds of backups. Many will argue that there are alternatives that do X and Y better, but Borg has all I need for this task and does its job very well. It is compatible with many services, has a nice CLI and several decent GUIs.

    Backups are critical, so we should rely on a mature tool that we can trust.

    VLC

    I don’t remember the last time I had “hair pulling” struggles when trying to play a video file or disc. VLC is one of the reasons why (perhaps the biggest of them). It’s a light and versatile Swiss army knife for dealing with video content and it handles whatever I am trying to do: watch a stream, open a video with a strange encoding or even convert between file formats.

    uBlock Origin

    Nowadays, mainstream websites are almost unbearable: they are slow, heavy and full of ads (displayed in many formats, with all kinds of tricks). Not to mention the huge effort they make to track you “everywhere you go”.

    This “little” browser extension takes care of blocking and removing a big chunk of that annoying content, making the pages faster while helping us avoid being followed online.

    Python

    To finish the list, a programming language and its interpreter. Throughout the last decade I ended up using several programming languages, both at my job and for personal projects, but there is one of them that I always fall back to, and it is a joy to use.

    Easy to read and to write, and available almost everywhere, it might not be a perfect fit for all tasks, but it allows you to do a lot, and quickly.

  • My picks on open-source licenses

    Sooner or later everybody that works with computers will have to deal with software licenses. Newcomers usually assume that software is either open-source (aka free stuff) or proprietary, but this is a very simplistic view of the world and wrong most of the time.

    This topic can quickly become complex and small details really matter. You might find yourself using a piece of software in a way that the license does not allow.

    There are many types of open-source licenses with different sets of conditions; while you can use some for basically whatever you want, others might impose some limits and/or duties. If you aren’t familiar with the most common options, take a look at choosealicense.com.

    This is also a topic that was recently the source of a certain level of drama in the industry, when companies that usually released their software and source code under very permissive licenses opted to change them, in order to protect their work from certain behaviors they viewed as abusive.

    In this post I share my current approach regarding the licenses of the computer programs I end up releasing as FOSS (Free and Open Source Software).

    Let’s start with libraries, that is, packages of code containing instructions to solve specific problems, aimed at being used by other software developers in their own apps and programs. In this case, my choice is MIT, a very permissive license which allows the code to be used for any purpose without creating any other implications for the end result (app/product/service). In my view this is exactly the aim an open-source library should have.

    The next category is “apps and tools”, that is, regular computer programs meant to be installed by end users on their computers. For this scenario, my choice is GPLv3. So I’m providing a tool, with its source code, for free, which users can use and modify as they see fit. The only thing I ask for is: if you modify it in any way, to make it better or to address a different scenario, please share your changes using the same license.

    Finally, the last category is “network applications”, which are computer programs that can be used through the network without having to install them on the local machine. Here I think AGPLv3 is a good compromise: it basically says that if end users modify the software and let their users access it over the network (so they don’t distribute copies of it), they are free to do so, as long as they share their changes using the same license.

    And this is it. I think this is a good enough approach for now (even though I’m certain it isn’t a perfect fit for every scenario). What do you think?

  • And… the blog is back

    You might have noticed that the website has been unavailable during the last week (or a bit longer than that). Well, the reason is quite simple:

    OVH Strasbourg datacenter burning
    OVH Strasbourg datacenter burning (10-03-2021)

    It took some time, but the blog was finally put online again, and new content should be flowing in soon.

    And kids, don’t forget about the backups, because the good old Murphy’s law never disappoints:

    Anything that can go wrong will go wrong

    Wikipedia
  • 10 years

    The first post I published on this blog is now 10 years old. This wasn’t my first website or even the first blog, but it’s the one that stuck for the longest time.

    The initial goal was to have a place to share anything I might find interesting on the Web, a place that would allow me to publish my opinions on all kinds of issues (if I felt like it) and to publish information about my projects. I think you can still deduce that from the tag line, which has remained unchanged ever since.

    From the start, being able to host my own content was one of the priorities, in order to control its distribution and ensure that it is universally accessible to anyone, without any locks on how and by whom it should be consumed.

    The reasoning behind this decision was related to a trend that started a couple of years earlier: the departure from the open web and the big migration to the walled gardens.

    Many people thought it was a harmless move, something that would improve the user experience and make life easier for everyone. But as with anything in life, with time we started to see the costs.

    Today the world is different: using closed platforms that barely interact with each other is the rule, and the downsides became evident. Users started to be spied on for profit, platforms decide what speech is acceptable, manipulation is more present than ever, big monopolies are now gatekeepers to many markets, etc. Summing up, information and power are concentrated in fewer hands.

    Last week, this event set the topic for the post. A “simple chat app”, which uses an open protocol to interact with different servers, was excluded/blocked from the market unilaterally, without any chance to defend itself. A more extensive discussion can be found here.

    The message I want to leave in this commemorative post is that we need to give another shot to decentralized and interoperable software, and use open protocols and technologies to put creators and users back in control.

    If there is anything that I would like to keep for the next 10 years, it is the capability to reach, interact and collaborate with the world without having a huge corporation acting as a middleman dictating its rules.

    I will continue to put effort into making sure open standards are used on this website (such as RSS, Webmention, etc.) and that I’m reachable using decentralized protocols and tools (such as email, Matrix or the “Fediverse”). I think this is the minimum a person could ask for the next decade.

  • The app I’ve used for the longest period of time

    What is the piece of software (app) you have used continuously for the longest period of time?

    This is an interesting question. More than 2 decades have passed since I got my first computer. Throughout all this time, my usage of computers evolved dramatically, and most of the software I installed back then no longer exists or is so outdated that there is no point in using it.

    Even the “type” of software changed; before, I didn’t rely on so many web apps and SaaS (Software as a Service) products, which dominate the market nowadays.

    The devices we use to run the software also changed; now it’s common for people to spend more time on certain mobile apps than on their desktop counterparts.

    In the last 2 decades, not just user needs changed, but also the communication protocols on the internet, the multimedia codecs and the main “algorithms” for certain tasks.

    It is true that many things changed; however, others haven’t. There are apps that were relevant at the time, that are still in use, and that I expect will still be around for many years.

    I spent some time thinking about my answer to the question, given I have a few strong contenders.

    One of them is Firefox. However, my usage of the browser was split by periods when I tried other alternatives. I installed it when it was initially launched and I still use it nowadays, but the continuous usage time doesn’t take it to the first place.

    I used Windows for 12/13 straight years before switching to Linux, but it is still not enough (I also don’t think operating systems should be taken into account for this question, since for most people the answer would be Windows).

    VLC is another contender, but as happened with Firefox, I started using it early and then kept switching back and forth with other media players throughout the years. The same applies to the “office” suite.

    The final answer seems to be Thunderbird. I’ve been using it daily since 2004, which means 16 years and counting. At the time I was fighting the ridiculously small storage limit I had for my “webmail” inbox, so I started using it to download the messages to my computer in order to save space. I still use it today for totally different reasons.

    And you, what is the piece of software or app you have continuously used for the longest period of time?

  • Observations on remote work

    A few days ago I noticed that I’ve been working fully remote for more than 2 years. To be honest, this now feels natural to me and not awkward at all, as some might think at the beginning or when they are introduced to the concept.

    Over this period, even though it was not my first experience (since I had already done it for a couple of months before), it is expected that one starts noticing what works and what doesn’t, how to deal with the shortcomings of the situation and how to make the most of its advantages.

    In this post I want to explore what I found out in my personal experience. There are already lots of articles and blog posts detailing strategies/tips on how to improve your (or your team’s) productivity while working remotely and describing the daily life of many remote workers. Instead of enumerating everything that has already been written, I will focus on some aspects which proved to have a huge impact.

    All or almost nothing

    This is a crucial one. With the exception of some edge cases, the most common scenario is that you need to interact and work with other people. So remote work will only be effective and achieve its true potential if everyone accepts that not every element of the team is present in the same building.

    The processes and all the communication channels should be available to every member of the team in the same way. This means that it should resemble the scenario where all members work remotely. We know people talk in person; however, work-related discussions, memos, presentations and any other kind of activity should be available to all.

    This way we don’t create a culture where the team is divided between first and second class citizens. The only way to maximize the output of the team is to make sure everyone can contribute with 100% of their skills. For that to happen, adequate processes and a matching mindset are required.

    Tools matter

    To build on the previous topic, one important issue is inadequate tooling. We need to remove friction and make sure working on a team that is spread across multiple locations requires no more effort, and doesn’t cause more stress, than it normally would in any other situation.

    Good tools are essential to make it happen. As an example, a common scenario is a bad video conference tool that is a true pain to work with, making people lose time at the beginning of the conference call because the connection can’t be established or nobody is able to hear the people on the other end. Combine that with the image/sound constantly freezing, and the frustration levels go through the roof.

    So good tools should make communication, data sharing and collaboration fluid and effortless, helping and not getting in the way. They should adapt to this environment (remote) and privilege this new way of working over the “standard”/local one (this sometimes requires some adjustments).

    Make the progress visible

    One of the things people often complain about regarding remote work is the attitude of colleagues/managers who aren’t familiar with this way of doing things and struggle with the notion of not seeing you there at your desk. In many places, what counts is the time spent in your chair and not the work you deliver.

    On the other side, remote workers also struggle to be kept in the loop; there are many conversations that are never written down or recorded, so they aren’t able to take part in them.

    It is very important to fix this disconnection, and based on the first point (“All or almost nothing”), the complete solution requires an effort from both parties. They should make sure that the progress being made is visible to everyone, keeping the whole team in the loop and able to participate. It can be a log, some status updates, sending some previews or even asking for feedback regularly; as long as it is visible and easily accessible, people will be able to discuss the most recent progress and everyone will know what is going on. It might look like some extra overhead, but it makes all the difference.

    Final notes

    As we can see, working remotely requires a joint effort from everybody involved and is not immune to certain kinds of problems/challenges (you can read more in this blog post), but if handled correctly it can provide serious improvements and alternatives to a given organization (of course there are jobs that can’t be done remotely, but you get the point). At least at this point, I think the benefits generally outweigh the drawbacks.