Category: Technology and Internet

Everything related to technology and the Internet. Tutorials, projects, news…

  • Upgrade your “neo-python” wallets

    Several weeks ago I started to explore the NEO ecosystem. For those who are not aware, NEO is a blockchain project that, just like Ethereum, aims to provide the tools and the platform to execute smart contracts and to create new types of decentralized applications. It has its pros and cons just like any other system, but that is outside the scope of this blog post.

    One of the defining characteristics of this “cryptocurrency” is the ability to develop those smart contracts in programming languages the user is already familiar with (although only a small subset of each language is available).

    So I searched for the available SDKs and found the neo-python project, which is wallet software and also a set of tools for developing with the Python programming language. The project is developed by a community of supporters of the NEO ecosystem called City of Zion.

    And now the real topic of the post begins: while learning the features and exploring the codebase, I found an urgent security issue with the way wallets were being encrypted by neo-python.

    Long story short, the method used to protect the wallets wasn’t correctly implemented and allowed an attacker with access to the wallet file to decrypt it without needing the password/passphrase (more details here).

    Fortunately it is an actively developed project, and the team responsible for it was quick to acknowledge the problem and merge the fix I proposed in a pull request. The fix is now present in the newer versions of the project, and it forces users to reset the security features of their wallets (check this video for more details, from minute 8 to minute 10).

    Now in this post I would like to leave my recommendation on how to proceed after re-encrypting the wallet, because even though the issue is fixed, your private keys might have been compromised before you applied the patch. If you are a user and haven’t noticed anything yet, the most probable scenario is that you weren’t compromised, since the most immediate thing an attacker could/would do is steal your funds.

    Nevertheless, there is always the possibility and to avoid any bad surprises you definitely should:

    1. Properly encrypt your wallet using the reencrypt_wallet.py script.
    2. Check that the newly generated wallet is working properly.
    3. Then delete the old wallet.
    4. Create a new wallet.
    5. Transfer your funds to the new wallet.

    Steps 4 and 5 are necessary because the fix protects your master key but doesn’t change it; as I said before, if a copy of your vulnerable wallet exists (created by you or by an attacker), your funds are still accessible. So don’t forget to go through them.

    Other than this, the project is very interesting and, while still immature, it has been fun to work with, so I will keep contributing some improvements in the near future.


  • Decentralized alternatives on the Internet

    This post follows a recent discussion about decentralization at the first Madeira Tech Meetup of 2018. I already touched on these subjects here on the blog a couple of times (for example, the post about ZeroNet and the one about Mastodon), but this time I will try to summarize the key points of the issue and give a list of awesome projects and applications that will help you reduce your dependency on several centralized services.

    Nowadays, many of the most popular web services and applications that “we” use are centralized; this means that the application and its data only exist on the machines of a single entity. Once this was considered normal, it led to another phenomenon: these “apps” close down any possible interaction that is not under their control, to keep the user locked in. A good example is social media and their chat applications, since closing the doors to open protocols and any form of interoperability allows the network effect to keep the users inside (why can’t someone with an account on a certain Google chat application talk with someone who uses Facebook Messenger? You know, like email works).

    The Internet infrastructure at its core is decentralized; it was initially designed this way in order to be robust and to continue working when there is a problem with any member of the network. For many years the applications built on top of it followed some of these principles and were happy to have a certain level of interoperability with other software, because it provided value. From the mid 00s to this day things have changed, and many services and applications are as closed as they can be (the so-called walled gardens), because there are economic incentives to behave this way, making the users and their data “hostage”.

    Fortunately, the new European Union rules (the GDPR) will start being implemented and enforced in the next couple of months. Besides giving users back control over their data, they require that users be able to export it in a common format, which will allow them to move to competitors more easily.

    However, this new right still threatens some comfortable walled-garden models. For example, social networks, exercise-tracking apps, and photo services will have to allow users to export their posts, rides, and photos in a common format. Smart competitors will build upload tools that recognize these formats; GDPR might therefore help to bridge the strategic moats of incumbents.

    Source: Techies Guide to GDPR

    But it doesn’t solve all the centralization problems, so a better (and complementary) approach is to use applications and services that rely on open protocols and were built with interoperability in mind. If they are open source, if you can self-host them, or if they are completely distributed, even better.

    So, to give you a head start, here is a list of applications that address common use cases on the Internet, are open source and decentralized, and can easily replace popular software you use nowadays:

    • Matrix – an open chat protocol and a collection of ready-to-use reference implementations that can replace any chat application; you can use an existing server or launch your own.
    • Mastodon – a Twitter alternative. Together with GNU Social and other software, it forms an open network where, regardless of which service or software you use, you are able to interact with everyone. You can also host your own server/instance.
    • IPFS – a content-based addressing protocol (with a reference implementation) that tries to make the web more distributed and less reliant on central servers. Very useful for publishing public data from your own computer.
    • ZeroNet – an alternative distributed network where websites are served by the peers in the network and not by central servers. One of the nice things here is that you can easily clone and publish websites using standard templates.
    • Peertube – the name gives you a hint: it is a YouTube replacement that doesn’t depend on any central server. It is federated, and video is distributed to the users via “WebTorrent”.
    • Syncthing – a file synchronization application (alternative to Dropbox) that lets you keep all your data up to date across all your devices and doesn’t rely on central servers.
    • OwnCloud/NextCloud – also an alternative to Dropbox and similar services, where the user can self-host a server or use one available on the Internet. These servers are federated, which means you can share and work on the same folders even with users from another server or service.

    Besides these, there are many others available. I just highlighted the ones above because I think they are ready/stable and can “fight” the existing centralized services.

    The usage of decentralized applications makes the Internet more democratic, more resistant to censorship and more resilient; it makes life harder for authoritarian entities, as well as for others that try to centrally manipulate public perception (by not automatically adding users to the famous “filter bubbles”), and it reduces the probability of massive companies maintaining monopolies in certain markets.

    At first it might take a little bit of work, but the benefits are there and will pay off later.

  • Getting started with Blockchain and related technologies

    Cryptocurrencies have been all the rage in the last few months, with the price of Bitcoin and other altcoins skyrocketing and reaching new all-time highs.

    This is very interesting, and since there is the idea that the technology supporting all of these coins is revolutionary, with some saying it can be used to solve many problems that we currently struggle with (or to make things more transparent), I needed to take a look at how it works internally.

    For anyone trying to understand and work with a “blockchain” and its new style of “consensus”, just playing with the coins and wallets is not enough. The things that happen in the background are complex and require an understanding of several mathematical properties.
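
    To make the “chain” part of the name concrete before pointing to study materials: the core data structure is just a sequence of blocks where each block commits to the previous one through a cryptographic hash. A toy sketch in Python (my own illustration, nothing like a real implementation, with no transaction validation, consensus or proof-of-work) could look like this:

    import hashlib
    import json

    def block_hash(block):
        # Hash of the block's canonical JSON representation
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def new_block(previous_block, transactions):
        # Each block stores the hash of its predecessor, so tampering with any
        # past block invalidates every block that comes after it.
        return {
            "previous_hash": block_hash(previous_block),
            "transactions": transactions,
        }

    genesis = {"previous_hash": None, "transactions": []}
    chain = [genesis]
    chain.append(new_block(chain[-1], ["Alice pays Bob 1 coin"]))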

    So where to start?

    There are many books (and I’ve read a few of them), but some just approach the technology from a high level and from a user perspective. Others are good and complete, but very dense.

    I took another approach to complete my understanding of the internals of the technology. I joined and recently completed the online course “Bitcoin and Cryptocurrency Technologies” from Princeton University, available on the Coursera platform.

    The content, even though the course dates from 2014/2015, is solid. It covers the basic concepts and the internals of Bitcoin very well: foundational things that do not change that often. Content isn’t just thrown at you; they build up your knowledge of the matter.

    The only issue, in my humble opinion, is the timing of the first graded assignment, which is given to you before the lessons on those concepts are explained (but you can watch those lessons and come back to the assignment later). On the remaining assignments, some context and a better explanation of the terms used in the code might help, but eventually you end up understanding what they mean. Otherwise, the discussion forums will guide you in the right direction.

    Overall, I truly recommend this course (which is free at the moment) if you want to gain an initial but deeper knowledge of the system, and not just the user perspective.

  • Making the most of your dusty Raspberry Pi

    The Raspberry Pi was one of those projects that were a “godsend” (just like the Arduino). A cheap and very small computer that lets the little ones, and also older folks, learn, explore, prototype and build lots of things that would previously have been inaccessible to many.

    So in this post I want to give you some tips and ideas for things you could do with that Raspberry Pi sitting unused in the drawer. The main objective is to set it up once and configure some services and utilities that are useful in several areas.

    Protect the entire household from Ads

    One awesome project you can easily install on your device is Pi-hole. You can set it up with a simple command and, by changing the DNS entries in your router’s DHCP settings, it lets all of the devices in your network be automatically “protected” against those ad networks that try to track your Internet usage.
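
    At the time of writing, the basic installation really is a one-liner (this is the command the project documents; review the script first if piping it straight into bash makes you uncomfortable):

    $ curl -sSL https://install.pi-hole.net | bash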

    Protect your communications when not at home

    Another tool that is very easy to set up, and that can help you when you are abroad or using insecure networks, is the PiVPN project. You will be able to create your own virtual private network that routes your communications through a network of your choosing (your home network, for example), protecting the traffic and also letting you access the devices that are on that network.
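
    Here too, the setup boils down to running the project’s installer and answering a few questions (assuming the installer address hasn’t changed since this was written):

    $ curl -L https://install.pivpn.io | bash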

    Extra Tip: Use this together with Pi-hole’s DNS server to block advertisements and tracking requests anywhere you go, by adding:

    push "dhcp-option DNS <IP OF YOUR PI'S TUN0 INTERFACE>"

    to your “/etc/openvpn/server.conf” (a restart is needed) and changing the “Interface listening behavior” to “Listen on all interfaces” in Pi-hole’s administration interface.

    Check who’s at home or at the office

    This one is a little trickier to set up, but it might be worth it when you need to know whether somebody is at home or at the office. The Pi-sensor-free-presence-detector makes use of your local network to check which devices are connected, and then you can link each device to a person based on who owns it.
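
    Just to illustrate the general idea (this is my own minimal sketch, not the project’s actual code): mapping the MAC addresses seen on the local network to their owners is enough for a rough presence check. It assumes the arp-scan utility is installed and that the script runs with enough privileges to use it:

    import subprocess

    # Hypothetical mapping between device MAC addresses and their owners
    KNOWN_DEVICES = {
        "aa:bb:cc:dd:ee:ff": "Alice's phone",
        "11:22:33:44:55:66": "Bob's phone",
    }

    def who_is_home():
        # Scan the local network and collect the owners of the devices that answer
        result = subprocess.run(
            ["arp-scan", "--localnet"], capture_output=True, text=True
        )
        output = result.stdout.lower()
        return {owner for mac, owner in KNOWN_DEVICES.items() if mac in output}

    if __name__ == "__main__":
        present = who_is_home()
        print("Currently around:", ", ".join(sorted(present)) or "nobody")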

    Other Stuff

    For those who want to go one step further and take control of many of the usual web applications, have a look at the FreedomBox project. You can install it on your Raspberry Pi and configure many applications on your own server.

  • Managing Secrets With Vault

    I’ve been looking into this area, of how to handle and manage a large number of secrets and users, for quite a while (old post), because when an organization or infrastructure grows, the number of “secrets” required for authentication and authorization increases as well. It is at this stage that bad practices (which are no more than shortcuts), such as reusing credentials, storing them in less than appropriate ways, or not invalidating those that are no longer required, start becoming problematic.

    Yesterday at “Madeira Tech Meetup” I gave a brief introduction to this issue and explored ways to overcome it, which included a quick and basic explanation of Vault and a demo of a common use case.
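
    For those who have never seen it, the most basic interaction with Vault’s key/value secret backend is just a couple of commands. The sketch below assumes a running and unsealed Vault server, a valid token and a hypothetical secret path:

    $ vault write secret/myapp/db password="super-secret"   # store a secret
    $ vault read secret/myapp/db                             # read it back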

    You can find the slides of the presentation here and, if you have any suggestion or something you would like to discuss, feel free to comment or reach out through any of the contact methods I provided.

  • Search open tabs in Firefox

    A few days ago it was announced that the next versions of Firefox will bring huge improvements in performance and resource usage. This is great because browsers nowadays consume huge amounts of resources, making it harder for people with older or weaker machines to use the computer without starting to pull out their hair.

    I’m not a huge fan of having lots of open tabs, but I know many people who like to work this way, keeping dozens of them open. One question I always ask is how they find the tab they are looking for when only the favicons are visible.

    It turns out Firefox makes it easy to handle this situation without any extra extension: just focus the address bar in any tab (Ctrl+L) and then type “% <search query>” to search the open tabs.

    I hope this is useful to you.

  • Keep your dependencies under check

    Nowadays, most software projects of a “decent size” rely on many software dependencies, or in other words, libraries and tools developed by other people, which usually are under constant change.

    The reasons for this are clear and range from implementing common patterns without repeating ourselves, to accelerating development, to using mature implementations and avoiding some pitfalls. Sometimes projects rely on way too many dependencies for simple things (remember the left-pad fiasco?).

    Once these dependencies are loaded, integrated and working as expected, people often forget they are there, and many times they stay untouched for long periods of time. Even when newer versions are released, unless something starts breaking, nobody remembers to keep them up to date, a situation that might lead to security vulnerabilities, not in your code but in the code your project depends on.

    Of course I’m not telling you anything new. What I intend to achieve with this post is to show that there are many tools available to help you fight this problem. When you integrate them into your CI or into another step of your development process, they will keep you informed about which dependencies have known security vulnerabilities and which you should upgrade as soon as possible.

    Most programming languages have this sort of tool, so a little searching should help you find the one that best suits your stack. Below are some examples:

    As an example, here is what I needed to do in order to check the dependencies of Hawkpost (an open-source project that I’m deeply involved with at the moment):

    $ safety check --full-report -r requirements/requirements.txt
    safety report
    ---
    No known security vulnerabilities found

    For most of these tools the basic check is just as simple, and in the long run it might save you some headaches.
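
    For reference, and assuming the respective tools are installed, the equivalent basic check in other ecosystems is similarly short, for example:

    $ npm audit      # Node.js projects
    $ cargo audit    # Rust projects, using the cargo-audit mentioned in the update below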

    Update (26-06-2018): Added cargo-audit to the list

  • Federated Tweets, or Toots

    Recently there has been a big fuss about “Mastodon”, an open-source project that is very similar to Twitter. The biggest difference is that it is federated. So what does that mean?

    It means that it works like “email”: there are several providers (called instances) where you can create an account (you can set up your own server if you desire), and accounts from different providers can communicate with each other, instead of all the information living in just one silo.

    Of course, for someone who is in favor of an open web, this is a really important “feature”.

    Another big plus is that the wheel wasn’t reinvented: this network is interoperable with the existing “GNU Social” providers (it uses the same protocol), so you can communicate and interact with people who have an account on an instance running that software. It can be seen as two providers of the same network running different software packages (one in PHP, the other in Ruby) but speaking the same language over the network.

    I haven’t tested it much yet, but given that it is a push for a solution that is not centralized (which is a rare thing nowadays), I think it is a small step in the right direction. So I’ve set up an individual instance for myself where I will regularly publish links to posts/articles/pages that I find interesting. Feel free to follow it at https://s.ovalerio.net and, if you know someone worth following on this network, let me know.

    Here are a few links with more information:

    List of instances where you can create an account

  • The wave of bloated desktop apps

    Some time ago I complained about a mindset that was producing a form of “bloated” software (software that uses a disproportionate amount of resources to accomplish “simple” tasks). Later I even posted an example of a popular chat application consuming a ridiculous amount of memory, given its purpose.

    A recent example of this phenomenon is the explosion of Electron desktop applications. Today the following post reached the front page of Hacker News, and it makes a good argument:

    Electron is flash for the desktop

    https://josephg.com/blog/electron-is-flash-for-the-desktop/

  • Pixels Camp 2016

    A few weeks ago the first edition of Pixels Camp (aka Codebits 2.0) took place in Lisbon, an event that I try to attend whenever it happens (see previous posts about it). It is the biggest technology-focused event/conference in Portugal, with close to 1000 attendees.

    This year the venue changed to LX Factory; even though the place is really cool, it is not as well located as the previous venue, at least for people who don’t live in Lisbon and arrive via the airport. The venue was well decorated and had a cool atmosphere, giving you the feeling that it was the place to be. However, this year there was less room for the teams working on their projects and not everybody was able to get a table/spot (it seemed to me that the venue was a little bit smaller than the previous one).

    Of the dozens of great talks given on the 4 stages of the event, many of which I was not able to see since I was competing in the 48h programming competition, below are two that I really liked:

    Chrome Dev Tools Masterclass

    IPFS, The Interplanetary Filesystem

    If you are curious, you can find the rest on their YouTube channel.

    All this is great, but the main activity of Pixels Camp is the 48h programming competition, and this year we had another great batch of cool projects being developed (60 in total, if I remember correctly).

    As usual I entered the contest, this time with fellow Whitesmithians Rui and Pedro. We chose to develop a GPS-based game, you know, since it seemed to be a popular thing this summer and we thought the medium still has great potential for really entertaining stuff.

    The idea was already a few years old but had never been implemented, and at its core it was quite simple. It took some ideas from the classic game “Pong” and adapted them to be played in a fun way while navigating through a real-world area.

    We called it PonGO. Essentially, the users must agree on a playing field, such as a city block, a city, or an even bigger area; then they connect their phones and the ball starts rolling. The players have to move around with their phones (which they use to see the map and track everyone’s position), trying to catch the ball and throw it to the other side of the map. The player who manages to do it the most times wins the game. Here is a sketch we did while discussing the project:

    Initial Sketch

    As you can see in the above image, which represents what would be on the phone’s screen, the player (in yellow) got close enough to the ball to play it; now he has to send it towards one of the opposite sides (marked in green). The other players (in blue) will have to run to catch the ball before it gets out. Spread across the map you can see some power-ups that give players special capabilities.

    That’s it; it might seem easy, but doing it in less than 48h is not. We ended up with a working version of the game, but the power-ups were not implemented due to time constraints. Here are some screenshots of the final result (we used the map view instead of the satellite view, so it might look a little different):

    In-game screenshots / In-game action

    The code itself is a mess (it was a hackathon, what were you expecting?) and can be found here and here.

    In the end, it was a great event as usual, and I would also like to congratulate some of my coworkers at Whitesmith who took 7th place in the competition. Next year I hope to be there again (and you should be too).

  • EU-Free and Open Source Software Auditing project

    Today I stumbled upon this blog post about a poll for EU-FOSSA. I’m not familiar with all aspects of this pilot project; however, from the information I could gather, it seems to be a really great idea.

    Most of us regularly use, to a certain degree, several pieces of free (as in freedom) software on a daily basis. Many of these projects are essential to ensuring the security of our communications, documents and work. European institutions and countries make use of these tools as well, so why not spend a little time and money to ensure they meet certain quality goals and are free of major bugs that could undermine the safety of their users?

    This would also raise the public’s trust in these tools, so they can become standards over their proprietary counterparts, which we are unable to review and modify according to our needs, something that raises many security questions.

    One of its components is a sample review of one open-source project, and until the 8th of July you can give your opinion on which one it should be. Go there, it only takes a minute, and it will help them understand that this is an important issue. Here is the link.

  • Test driving ZeroNet

    A few weeks ago the “Decentralized Web Summit” took place in San Francisco. Even though there was a video stream available at the time, I wasn’t able to watch it, but later I saw some excerpts of it. One of the talks that caught my attention was about a new thing called ZeroNet. It seemed to be some kind of network where the assets and contents of websites are fetched from your peers, while introducing clever mechanisms to give the owners control and to allow the existence of user-generated content. It borrows concepts from both Bitcoin and BitTorrent, but for a better explanation, below is an introduction by the creator of this technology:

    The presentation is very high level, so on the website I found some slides with more details about how it works, and I must say it is very interesting from a technical perspective; it even has an address naming system (“.bit”) if you don’t want some gibberish in the address bar.

    In the video things seemed to be working pretty well (for something being presented for the first time), so I decided to join the network and give it a try. For those using Docker it happens to be pretty easy, just run:

    $ docker run -d -v <local_data_folder>:/root/data -p 15441:15441 -p 43110:43110 nofish/zeronet

    then your node will be available on: http://127.0.0.1:43110/

    After using it for 2 weekends, I have to say the level of polish of this project is amazing: all the pre-built apps work pretty well and are easy to use, the websites load super fast (at least compared with my expectations) and changes show up in real time. The most interesting aspect of all was the number of people trying and using it.

    You may ask, what are the great advantages of using something like this? Based on what I’ve seen during these few days there are 3 points/use cases where this network shines:

    • Websites cannot be taken down; as long as there are peers serving them, they will be online.
    • Zero infrastructure costs (or pretty close) to run a website there: you create and sign the content, and it gets delivered by the peers.
    • Websites that you visit remain available while you are offline.

    So to test this network further, I will do an experiment. During the next few weeks/months I will mirror this blog and make the new contents available on ZeroNet, starting with this post. The address is:

    http://127.0.0.1:43110/1PLZ7PjfX91VSMmzU5revwswmrEkTz6Mpk

    Note: In this initial stage it might not be always available, since at the moment I’m the only peer serving it from my laptop.

    To know more about it, check the repository on Github.

  • Django Friday Tips: Timezone per user

    Adding support for time zones to your website, in order to allow its users to work in their own timezone, is a “must” nowadays. So in this post I’m gonna try to show you how to implement a simple version of it. Even though Django’s documentation is very good and complete, the only example given is how to store the timezone in the user’s session after detecting (somehow) the user’s timezone.

    What if the user wants to save his timezone in his settings and use it from then on, every time he visits the website? To solve this I’m gonna take the example given in the documentation and, together with the simple django-timezone-field package/app, implement this feature.

    First we need to install the dependency:

     $ pip install django-timezone-field==2.0rc1

    Add to the INSTALLED_APPS of your project:

    INSTALLED_APPS = [
        ...,
        'timezone_field',
        ...
    ]

    Then add a new field to the user model:

    from timezone_field import TimeZoneField

    class User(AbstractUser):
        timezone = TimeZoneField(default='UTC')

    Handle the migrations:

    $ python manage.py makemigrations && python manage.py migrate

    Now we need to use this information. Based on the example in Django’s documentation, we can add a middleware class that will fetch this information on every request and activate the desired timezone. It should look like this:

    from django.utils import timezone
    
    
    class TimezoneMiddleware():
        def process_request(self, request):
            if request.user.is_authenticated():
                # Activate the timezone stored in the user's profile
                timezone.activate(request.user.timezone)
            else:
                # Fall back to the default timezone (settings.TIME_ZONE)
                timezone.deactivate()

    Add the new class to the project middleware:

    MIDDLEWARE_CLASSES = [
        ...,
        'your.module.middleware.TimezoneMiddleware',
        ...
    ]

    Now it should be ready to use: all your forms will convert the received input (in that timezone) to UTC, and templates will convert from UTC to the user’s timezone when rendered. For different conversions and more complex implementations, check the available methods.
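
    If you need to convert values manually, outside of forms and templates, the helpers in django.utils.timezone cover the common cases. A small sketch (assuming USE_TZ = True in your settings and the middleware above is active):

    from django.utils import timezone

    now_utc = timezone.now()                       # aware datetime, kept in UTC
    now_local = timezone.localtime(now_utc)        # converted to the active timezone
    current_tz = timezone.get_current_timezone()   # the timezone activated by the middleware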

  • Receive PGP encrypted emails, without the sender needing to know how to do it

    One common problem for people trying to secure their email communications with PGP is that, more often than not, the other end doesn’t know how to use this kind of tool. I’ll be honest: in the current state, the learning curve is too steep for the common user. This causes a great deal of trouble when you want to receive/send sensitive information in a secure manner.

    I will give you an example: a software development team helping a customer build his web business or application may want to receive a wide variety of access keys to external services and APIs that are in the customer’s possession and are required (or useful) for the project.

    Let’s assume that the customer is not familiar with encryption tools; the probability of that sensitive material being shared in an insecure way is too high, since he might send it through a clear-text email or post it in some shared document (or file). Both of the previous situations are red flags, either because the communication channel is not secure enough or because multiple copies of the information may end up in different places with doubtful security, all of them in clear text.

    In our recent “Whitesmith Hackathon”, one of the projects tried to address this issue. We thought of a more direct approach to this situation, based on the assumption that you will not be able to convince the customer to learn this kind of thing. We called it Hawkpost; essentially, it’s a website that makes use of OpenPGP.js, where you create unique links containing a form that the other person uses to submit any information, which is then encrypted in their browser with your public key (without the need to install any extra software) and forwarded to your email address.

    You can test and use it at https://hawkpost.co, but the project is open source, so you can change it and deploy it on your own server if you prefer. It’s still in an early state at the moment, but we will continue improving the concept according to the feedback we receive. Check it out and tell us what you think.

  • Log based analytics are still useful

    A long time ago, most modern website analytics software made the shift from relying on server logs to using client-side code snippets to gather information about the user; in this last category we can include, as examples, Google Analytics and Piwik. This paradigm allows collecting more detailed information about the visitors of the website and gives developers more flexibility; however, it can also be seen as the website owners imposing the execution of code on the user’s computing device against his will, undermining his privacy (some people go as far as putting it in the same category as malware). Log-based analytics software, last time I checked, is seen as a museum relic from the 90s and early 00s.

    However, as explained in a blog post named Why “Ad Blockers” Are Also Changing the Game for SaaS and Web Developers, and further discussed by the Hacker News community, we might need to look again at the server-side approach, since the recent trend of using ad blockers (which is entirely legitimate, given the excesses of the industry) can undermine the usefulness of the client-side method, given that most of the time the loading of the snippet and the extra requests it requires are blocked. This is why server-side analytics can be very handy again, allowing us to measure the “Ghost Traffic”, as it is called in the article.

    A very high level overview of both methods can be described like this:

    Client-side:

    • Pros:
      • Lots of information
      • Easy to setup
    • Cons:
      • Extra requests and traffic
      • Can be blocked by browser extensions
      • The use of a third party entity raises some privacy concerns.

    Server-side:

    • Pros:
      • Cannot be blocked.
      • Does not pose a privacy concern since it only records the requests for the website “pages” made by the user.
    • Cons:
      • Less detailed information.
      • If the server is behind a CDN, not all requests will hit the server.

    The main issue with log-based tools is that they look ancient, some haven’t seen an update for a while, and they can take some work to set up. Nevertheless, they can definitely be very useful for understanding the extent of blocker usage by visitors, and even for the cases when we just need simple numbers. They also put aside the privacy discussion, since they only monitor the activity of the servers.

    That’s the case with this blog: I do not run any analytics software here (because I do not see the need, given its purpose), and when I’m curious about the traffic, I use a very cool tool called GoAccess, which goes over the nginx logs and generates some nice reports.
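
    Running it can be as simple as the following sketch (the exact log path and the --log-format value depend on your nginx configuration):

    $ goaccess /var/log/nginx/access.log --log-format=COMBINED -o report.html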

    Give it a look; perhaps you don’t need Google Analytics everywhere, or its results might not be as accurate as you think, especially if your audience has a significant percentage of tech-savvy people.