Tag: Technology

  • Local AI to the rescue

    The last couple of years have been dominated by advancements in the Artificial Intelligence (AI) field. Many of us are currently experiencing some sort of AI renaissance.

    It started with images generated from prompts, then came all types of written content, and in the last few weeks we’ve seen astonishing videos generated entirely from a prompt.

    At the same time, many more specific tasks and fields started seeing the results of specialized uses of these technologies.

    Like any other tool produced by human ingenuity, it can be used for “good” and for “bad”. However, that’s not what I want to discuss in this post; it’s just a general observation.

    Like many others, I was curious to experiment with these new tools and see where they could help me in my daily life, either at work or at a more personal level.

    One thing that quickly caught my attention was that many of the most well-known products are only accessible through the internet. You send your inputs to “Company X” servers, they run the trained models on their end, and eventually the result is transmitted back to you.

    While understandable, given that the hardware requirements for AI workloads are massive, I find it unsettling that this continues the trend of all your data and interactions being shared with a remote company.

    Let’s take programming as a simple example, an area where some companies are betting strongly on AI helpers, such as GitHub’s Copilot. I think many employers wouldn’t be too happy knowing that their proprietary code was being leaked to a third party through developer interactions with an assistant.

    Even though the above example might not apply to everyone, it is a real concern, and in many places adopting such a tool would require a few discussions with the security and legal teams.

    That is why I turned my attention to how a person can run this stuff locally. The main obstacles to this approach are:

    • The models that are freely available might not be the best ones.
    • Your hardware might not be powerful enough.

    Regarding the first problem, a few companies have already released models that you can freely use, so we are good. They might not be as good as the big ones, but they don’t need to give you all the right answers, nor do the job for you, to be useful in some way. They just need to help you break barriers with less effort, as shown in a recent study:

    Instead, it lies in helping the user to make the best progress toward their goals. A suggestion that serves as a useful template to tinker with may be as good or better than a perfectly correct (but obvious) line of code that only saves the user a few keystrokes.

    This suggests that a narrow focus on the correctness of suggestions would not tell the whole story for these kinds of tooling.

    Measuring GitHub Copilot’s Impact on Productivity

    The hardware issue is a bigger limitation to running larger, more general models locally; however, my experience showed me that smaller or more specialized models can also bring value to the table.

    As proof that this is viable, we have the example of two web browsers that started integrating AI functionality, each for different reasons.

    With the case for Local AI on the table, the next question is: how?

    My local setup

    Next, I’ll list and describe the tools I ended up with after some research and testing. It is very likely that this setup will change soon, since things are moving really fast nowadays. Nevertheless, they have been working fine for me on all sorts of tasks.

    I mostly rely on four pieces of software:

    • Ollama: to run the Large Language Models (LLMs) on my computers and provide a standard API that other apps can use (see the small sketch after this list).
    • Continue.dev plugin for my text editor/IDE: it presents a nice interface to the LLMs and easily attaches context to the session.
    • ImaginAIry: for generating images and illustrations. It can also generate video, but I never explored that part.
    • Fabric: a tool that provides “prompts” for common tasks you would like the AI to do for you.
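
    To give an idea of how the pieces fit together, here is a minimal Python sketch that sends a prompt to a locally running model through Ollama’s HTTP API. It assumes Ollama is listening on its default port (11434) and that the codellama:7b model has already been pulled; swap in whatever model you have installed.

      import json
      import urllib.request

      # Build the request for Ollama's local "generate" endpoint.
      payload = json.dumps({
          "model": "codellama:7b",   # any model you have pulled locally
          "prompt": "Write a Python function that reverses a string.",
          "stream": False,           # ask for the full answer in one JSON response
      }).encode("utf-8")

      request = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )

      # The prompt never leaves the machine: the server answering is your own.
      with urllib.request.urlopen(request) as response:
          result = json.loads(response.read())

      print(result["response"])

    Everything in that exchange stays on your machine, which is the whole point of this setup.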

    All of them work well, even on my laptop, which doesn’t have a dedicated GPU. Everything is much slower there than on my desktop, which is far more powerful, but still usable.

    To improve that situation, I installed smaller models on the laptop: for example, codellama:7b instead of codellama:34b, and so on.

    And this is it for now. If you have other suggestions and recommendations for local AI tools that I should try, please let me know. I’m well aware that better things are showing up almost every day.

  • What to use for “TOTP” in 2023?

    At the start of last week, we received great news regarding new improvements to a very popular security app, “Google Authenticator”. A feature it had been lacking for a long time was finally implemented: “cloud backups”.

    However, after a few days, the security community realized the new feature wasn’t as good as everybody was assuming. It lacks “end-to-end encryption”. In other words, when users back up their 2FA codes to the cloud, Google has complete access to these secrets.

    Even ignoring the initial bugs (check this one and also this one), it is a big deal because any second factor should only be available to its “owner”. Having multiple entities with access to these codes defeats the whole purpose of having a second factor (ignoring, again, any privacy shortcomings).

    Summing up, if you use Google Authenticator, do not activate the cloud backups.

    And this brings us to the topic of today’s post: “What app (or mechanism) should I use for 2FA?”

    This question is broader than one might initially expect, since we have multiple methods at our disposal.

    SMS codes should be on their way out, for multiple reasons, but especially because of the widespread SIM swapping vulnerabilities.

    Push-based authenticators don’t seem to be a great alternative. They are not standardized, they tie the user to proprietary ecosystems, and they can’t be used everywhere.

    In an ideal scenario, everyone would be using FIDO2 (“Webauthn”) mechanisms, with hardware keys or letting their device’s platform handle the secret material.

    While support is growing, and we should definitely start using it where we can, the truth is that it is not yet widely accepted. This means we still need another form of 2FA for the services where FIDO2 isn’t supported yet.

    That easy-to-use and widely accepted second factor is TOTP.

    This is still the most independent and widely used form of 2FA we have nowadays. Basically, you install an authenticator app that provides you temporary codes to use in each service after providing your password. One of the most popular apps for TOTP is the “problematic” Google Authenticator.
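
    Under the hood, these temporary codes are surprisingly simple: an HMAC of a shared secret and the current 30-second time window. Here is a minimal Python sketch of the derivation (RFC 6238); the base32 secret below is a made-up example like the ones embedded in provisioning QR codes.

      import base64
      import hashlib
      import hmac
      import struct
      import time

      def totp(secret_b32, digits=6, period=30):
          # Decode the base32 shared secret (the string behind the QR code).
          key = base64.b32decode(secret_b32, casefold=True)
          # Count how many 30-second periods have elapsed since the Unix epoch.
          counter = int(time.time()) // period
          # HMAC the counter (8-byte big-endian) with the secret (RFC 4226).
          digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
          # Dynamic truncation: pick 4 bytes based on the last nibble.
          offset = digest[-1] & 0x0F
          code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
          # Keep only the last `digits` decimal digits.
          return str(code % 10 ** digits).zfill(digits)

      # Hypothetical example secret; a real one comes from the service you enroll with.
      print(totp("JBSWY3DPEHPK3PXP"))

    Run it twice within the same 30-second window and you get the same code, which is exactly what the server computes on its side to verify you.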

    What are the existing alternatives?

    Many password managers (1Password, Bitwarden, etc.) also offer the possibility of generating these codes for you. However, I don’t like this approach, because the different factors should be:

    • Something you know
    • Something you have
    • Something you are

    In this case, the password manager already stores the first factor (the “something you know”), so putting all the eggs in the same basket doesn’t seem like a good idea.

    For this reason, from now on, I will focus on apps that allow me to store these codes in a separate device (the “something you have”).

    My requirements for such an app are:

    • Data is encrypted at rest.
    • Access is secured by another form of authentication.
    • Easy offline backups.
    • Easy restoration of a backup.
    • Secure display (tap before the code is shown on the screen).
    • Open source.
    • Available for Android.

    There are dozens of them, but many don’t comply with all the points above, while others have privacy and security issues that I can’t overlook (just to give you a glimpse, check this).

    In the past, I usually recommended “andOTP”. It checks all the boxes and is indeed a great app for this purpose. Unfortunately, it stopped being maintained a few months ago.

    While it is still a solid app, I don’t feel comfortable recommending it anymore.

    The bright side is that I went looking for a similar app and found “Aegis”, which happens to have great reviews, fulfills all the above requirements, and is still maintained. I guess this is the one I will be recommending when I’m asked “what to use for 2FA nowadays”.

  • Tools I’m thankful for

    In the spirit of Thanksgiving, even though it isn’t a tradition here where I live, and following the same path as some posts I’ve read today, here are five software tools I’m thankful for.

    (Of course this is not a comprehensive list, but today these are the ones that come to my mind)

    Syncthing

    This tool basically lets us sync files across multiple devices without relying on a central service/server (unlike Dropbox or Google Drive). So we don’t have to rely on a third-party service to sync our documents; it is all done in a p2p fashion with high security standards.

    No Internet connection? No worries, it works over the local network as well.

    BorgBackup

    I use Borg for most kinds of backups. Many will argue that there are alternatives that do X and Y better, but Borg has all I need for this task and does its job very well. It is compatible with many services, has a nice CLI and several decent GUIs.

    Backups are critical, so we should rely on a mature tool that we can trust.

    VLC

    I don’t remember the last time I had “hair pulling” struggles when trying to play a video file or disc. VLC is one of the reasons why (perhaps the biggest of them). It’s a light and versatile Swiss army knife for dealing with video content, and it handles whatever I am trying to do: watch a stream, open a video with a strange encoding, or even convert between file formats.

    uBlock Origin

    Nowadays, mainstream websites are almost unbearable: they are slow, heavy, and full of ads (displayed in many formats with all kinds of tricks). Not to mention the huge effort they put into tracking you “everywhere you go”.

    This “little” browser extension takes care of blocking and removing a big chunk of that annoying content, making the pages faster while helping us avoid being followed online.

    Python

    To finish the list, a programming language and its interpreter. Throughout the last decade I ended up using several programming languages, both at my job and for personal projects, but there is one that I always fall back to, and it is a joy to use.

    Easy to read and write, and available almost everywhere, it might not be a perfect fit for all tasks, but it allows you to do a lot, and quickly.

  • Pixels Camp v3

    Like I did in previous years/editions, this year I participated again in Pixels.camp, a kind of conference plus hackathon. For those who aren’t aware, it is one of the biggest (if not the biggest) technology events in Portugal (from a technical perspective, not counting the Web Summit).

    So, as I did in previous editions, I’m gonna leave here a small list of the nicest talks I was able to attend.

    Lockpicking versus IT security

    This one was super interesting: Walter Belgers showed the audience a set of problems in how locks are made and compared those mistakes with the ones regularly made by software developers.

    At least for me, the most impressive parts of the whole presentation were the demonstrations of the flaws in regular (and high-security) locks.

    Talk description here.


    Containers 101

    “Everybody” uses containers nowadays. In this talk, the speaker took a step back and went through the history and the major details behind this technology. Then he showed how you could implement part of it yourself using common Linux features and tools.
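
    To make that concrete, here is my own minimal sketch (not the speaker’s code) of one of those primitives: using the unshare tool from util-linux to drop a shell into fresh namespaces, which is the bare skeleton of a container. It assumes a Linux machine and enough privileges to create namespaces.

      import subprocess

      # Start a shell inside new UTS (hostname), PID, and mount namespaces,
      # the same kernel features container runtimes build upon.
      subprocess.run([
          "unshare",
          "--uts",    # separate hostname
          "--pid",    # separate process tree
          "--mount",  # separate mount table
          "--fork",   # fork, so the shell becomes PID 1 in the new namespace
          "/bin/sh",
      ])

    Inside that shell, hostname changes and new mounts are invisible to the rest of the system; add a changed root and some cgroup limits and you are most of the way to a primitive container.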

    Talk description here.


    Static and dynamic analysis of events for threat detection

    This one was a nice overview of Siemens’ infrastructure for threat detection, their approaches, and the tools they use. It was also possible to understand some of the obstacles and challenges a company must address to protect a global infrastructure.

    Talk description here.


    Protecting Crypto exchanges from a new wave of man-in-the-browser attacks

    This presentation used the theme of protecting cryptocurrency exchanges but gave lots of good hints on how to improve the security of any website or web application. The second half of the talk concentrated on a kind of attack called man-in-the-browser and on a demonstration of it. In my opinion, this last part was weaker; I left with the impression that it lacked details about the most crucial part of the attack while spending a lot of time on less important stuff.

    Talk description here.

  • Pixels Camp 2016

    A few weeks ago, the first edition of Pixels Camp (aka Codebits 2.0) took place in Lisbon, an event that I try to attend whenever it happens (see previous posts about it). It is the biggest technology-focused event/conference in Portugal, with a number of attendees close to 1000.

    This year the venue changed to LX Factory. Even though the place is really cool, it is not as well located as the previous venue, at least for people who don’t live in Lisbon and arrive at the airport. The venue was well decorated and had a cool atmosphere, giving you the feeling that it was the place to be. However, this year there was less room for the teams working on the projects, and not everybody was able to get a table/spot (it appeared to me that the venue was a little smaller than the previous one).

    From the dozens of great talks that were given on the 4 stages of the event, many of which I was not able to see since I was competing in the 48h programming competition, below are two that I really liked:

    Chrome Dev Tools Masterclass

    IPFS, The Interplanetary Filesystem

    If you are curious, you can find the remaining ones on their YouTube channel.

    All this is great, but the main activity of Pixels Camp is the 48h programming competition, and this year we had another great batch of cool projects being developed (a total of 60, if I remember correctly).

    As usual, I entered the contest, this time with fellow Whitesmithians Rui and Pedro. We chose to develop a GPS-based game, you know, since it seemed to be a popular thing this summer, and we thought the medium still has great potential for really entertaining stuff.

    The idea was already a few years old but had never been implemented, and at its core it was quite simple. It took some ideas from the classic game “Pong” and adapted them to be played in a fun way while navigating through a real-world area.

    We called it PonGO. Essentially, the users must agree on a playing field, such as a city block, a city, or an even bigger area; then they connect their phones and the ball starts rolling. The players have to move around with their phones (which they use to see the map and track everyone’s position), trying to catch the ball and throw it to the other side of the map. The player who does it the most times wins the game. Here is a sketch we did while discussing the project:

    Initial Sketch

    As you can see in the above image, which would be on the phone’s screen, the player (in yellow) got close enough to the ball to play it; now he has to send it toward one of the opposite sides (marked in green). The other players (in blue) will have to run and catch the ball before it gets out. Spread across the map you can see some power-ups that give users special capabilities.
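
    To give a taste of the logic involved, here is the kind of proximity check such a game needs. This is a minimal sketch of my own, not our actual hackathon code, and the coordinates and catch radius are made-up example values.

      import math

      # Great-circle distance between two GPS points, in meters (haversine formula).
      def distance_m(lat1, lon1, lat2, lon2):
          r = 6371000  # mean Earth radius in meters
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp = math.radians(lat2 - lat1)
          dl = math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 2 * r * math.asin(math.sqrt(a))

      CATCH_RADIUS_M = 20  # arbitrary example value
      player = (38.7223, -9.1393)  # hypothetical coordinates in Lisbon
      ball = (38.7224, -9.1392)
      if distance_m(*player, *ball) <= CATCH_RADIUS_M:
          print("Close enough: the player can play the ball")

    Something along these lines has to run on every GPS update to decide whether a player can touch the ball.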

    That’s it. It might seem easy, but doing it in less than 48h is not. We ended up with a working version of the game, but the power-ups were not implemented due to time constraints. Here are some screenshots of the final result (we used the map view instead of the satellite view, so it might look a little different):

    In-game screenshots of the game in action

    The code itself is a mess (it was a hackathon, what were you expecting?) and can be found here and here.

    In the end, it was a great event as usual, and I would also like to congratulate some of my coworkers at Whitesmith, who took home 7th place in the competition. Next year I hope to be there again (and you should too).

  • A heavy chat application

    Following up on the quote I posted some time ago in an entry entitled “Bloat”, I will leave here one good example. I know Slack is a great application and it has some complex features, yet I don’t see any reason for any chat program to steal almost 900 megabytes of my computer’s memory.

    Slack memory usage

    Note: I know the version I am using is still in beta, but c’mon guys, that’s too much.

  • “Bloat”

    Last week I read a great post entitled “Web Design: The First 100 Years”; it is a long one but definitely worth reading. I will just leave here a quote (3 short paragraphs), because it puts into words something that has already crossed my mind multiple times.

    “A further symptom of our exponential hangover is bloat. As soon as a system shows signs of performance, developers will add enough abstraction to make it borderline unusable. Software forever remains at the limits of what people will put up with. Developers and designers together create overweight systems in hopes that the hardware will catch up in time and cover their mistakes.

    We complained for years that browsers couldn’t do layout and javascript consistently. As soon as that got fixed, we got busy writing libraries that reimplemented the browser within itself, only slower.

    It’s 2014, and consider one hot blogging site, Medium. On a late-model computer it takes me ten seconds for a Medium page (which is literally a formatted text file) to load and render. This experience was faster in the sixties.” Maciej Cegłowski

  • Folding@Home

    Recently I’ve started “folding” again, to give a small contribution to science and to research on important topics such as Alzheimer’s disease, among others (as the above video shows). After 2 previous failed attempts (my old computer could not handle it), I’m currently on my longest streak and have just completed my first 100 work units. I know that many people would like to contribute to a project like this but simply don’t know of its existence, so sharing is important.

    At almost 15 years old, the project continues to thrive, and the performance of the overall system continues to grow, mostly pushed by advances in technology, because participation, as Wikipedia shows, is far from its peak of 450k processors in 2011. During its existence, the team responsible for the project has published 118 scientific papers based on the results obtained through the collaborative work done by all the personal computers that joined the network. It is a visible amount of work that is certainly important in humanity’s continuous fight against these diseases.

    The project maintains a leaderboard with stats for its users and teams, making it somewhat fun to watch your performance and compare it with others (Portuguese fellows, come on and join the team). Contributing is relatively easy and cheap, so I challenge you to start. If you are interested in knowing what kind of work is being done at the moment throughout the network, the project publishes that information on their website.

    To make it easier to see your progress without having to open other programs or websites, I’ve made a simple plasmoid to give you that information.

    Screenshot of the current version of fah-plasmoid

    So if you are a KDE4 user (a version for Plasma 5 is coming out of the oven soon), you can get it here. It isn’t complete yet, since I made it quickly while learning about KDE development, but it is usable.

  • Lenovo and men in the middle

    Another week, another scandal. The general public might pass by without noticing the recent news about Lenovo computers, but the tech community on the Internet is incredulous. What we witnessed was serious, a betrayal of customer confidence, so in this post I will try to briefly cover everything that I’ve read about the issue and point out how it affects those who bought a Lenovo computer in the last 6 months.

    What happened

    Basically, the computers were sold with a piece of very intrusive adware (which could be called malware, since it is not that different). This software stands in the middle of every Internet connection that the computer makes (even secure ones) and tries to inspect its contents and inject advertisements into the websites that the user visits [source] [proof].

    On the technical level, this software was able to bypass the security measures and alerts implemented by browsers by adding a self-signed root certificate to the list of Trusted Certificate Authorities. This way, it was able to trick the browser into thinking it was connecting to the valid website, issuing certificates when needed, when the browser was instead talking to the adware (Superfish), which made the secure connection on its behalf [source].

    What are the consequences

    Besides users being spied on and secure connections being compromised (for example, with bank websites) by the hardware vendor, as many have already stated, this leaves a huge security hole that can be exploited by people with bad intentions. [source]

    In fact, as we can see in this tweet, once this issue was uncovered, people started digging into the subject and quickly extracted the private key, which gives anyone the ability to sign certificates, tricking the affected users into believing they are visiting the correct website when in reality they are on a malicious one. According to some articles, it was relatively easy, and the password is the same for every machine. [source]

    What can be done

    Thankfully, given the enormous pressure on the Internet and the media attention, the company offered some excuses and provided some tools to remove the software. But… there is always a but: less alert users might not know they are vulnerable, and it seems the certificate problem still persists (probably the worst issue). Fortunately, Microsoft stepped in, and its Windows Defender tool, which comes bundled with the operating system, will automatically remove the software and reset all certificates. [source]

    For the most suspicious users, some people created tools to check whether a machine is still vulnerable (here and here).

    Summing up, this serves as a reminder to be careful with the software you install on your computer. If possible, when acquiring a new machine, the first step is to wipe the disk and install everything yourself; I recommend using a Linux-based operating system.

    P.S.: Digging into the root of the issue: who crafted the problematic software.

  • On buying new hardware

    When I was buying my laptop some years ago, I wished I knew of a website with a database of hardware that works well with free software, especially with operating systems based on Linux, so I wouldn’t run into too much trouble getting everything working. Instead, I ended up purchasing a machine that came with Windows and a bunch of hardware that depended on proprietary drivers to work well. It took ages to get every feature working as it should on my chosen distribution.

    Recently I discovered h-node, a website created by the Free Software Foundation (FSF) together with Debian GNU/Linux, which tries to:

    … aim at the construction of a hardware database in order to identify what devices work with a fully free operating system.

    Since not everyone uses Windows or Mac OS X, I hope this might be helpful to those reading this blog. As for me, the next time I need to buy something, I already know where to start my research.

  • Leap

    I just found out about the existence of this device, at OMG Ubuntu, 5 minutes ago. This is one case where the internet’s common expression “Shut up and take my money” fits very well. It seems to be amazing and is affordable.

    I can’t wait to get my hands on one of these toys, together with a Raspberry Pi. More info at http://leapmotion.com/ .

  • Almost 9 zeros

    Take a picture, add some pre-made effects, and share. Now just wait for people to leave comments on your “work”. These were the steps I took until I reached this photo:

    instagram

    The biggest tech news of the last few days was about this simple process and the application behind it. It’s called “Instagram”, and Facebook just paid more than 900 million dollars for it (more than The New York Times’ current market value, according to some websites).

    I already knew the service but had never tried it, because it was only available for the iPhone. Recently, with the launch of the Android version and all the buzz around the deal with Facebook, I decided to try the app and see if it was worth all that noise around it.

    The greatest weakness that I see in the service is that it doesn’t have a web interface, so basically the whole network only functions within the mobile app, which is very limiting. People are comparing this application with YouTube, saying it is the same thing for photos, but YouTube works everywhere (on almost all platforms) and Instagram doesn’t.

    Besides that, and in an overall view, the app is addicting, has a nice design, and is quick to learn, even though the icons and menus aren’t obvious at first (on the Android version). I found myself watching photos and more photos that I didn’t even care about, and that is a good sign.

    In conclusion, I am of the opinion that it was a nice move from Facebook to buy Instagram, but it was very (very) overvalued, since there are lots of other great ways to share photos on the web.

    Time will tell if it was well spent money or not.

  • Finally some choice

    Yesterday, while walking around in a mall, I decided to enter a Portuguese store called “Worten” to check some notebooks and other equipment, to stay updated on what these stores are selling and the current prices.
    I had already read about some netbooks being sold in Portugal with Ubuntu, but I thought it was only advertising and that in reality it would never happen, so if I wanted to buy a computer without Windows, the Internet was the only solution. But apparently I was wrong.
    So when checking the notebooks, I noticed a little orange badge on the cheapest computer on the shelf, saying “Ubuntu Certified”. Suddenly I was shocked and thought, “Finally we have a choice”.
    So to prove it, some shots below:

    Photos of the “Ubuntu Certified” badge in the store

    The specifications of the equipment don’t match the ones that were announced, because it’s another model, but that doesn’t matter; it is still a step forward.

    P.S.: The battery, in my point of view, is the weakest part; 4.5 hours is not that good.

  • Light Field Cameras

    A new era in digital photography is coming, maybe because of the new technology that a startup called Lytro is using to build its cameras. Calling it a “revolution”, they presented to the world a new type of camera and a new file format that will allow you to take a shot and refocus it later when you’re at your computer.

    Many websites have already written about the subject (TechSpot or Technology Review), so, to put it simply and briefly: it’s a camera that captures light rays in more than one direction, allowing you to choose later what to focus on in the picture. For more details, read the above-mentioned articles or visit Lytro’s website.

    I just wanted to share that I’ve played with some of the example pictures they have in their gallery, and it’s quite impressive. Now we will have to wait and see if the price will be affordable. Meanwhile, try this example: