• Letting go of perfection: Just write

    One of the things I realized while trying to publish more is that something always gets in the way, and every time it is a new obstacle. This is true for many goals we set in life. But why is that?
    In this post, I want to make the case for why you should try to publish more, and why you should pay less attention to your own doubts.

    For example, when I start writing something, I notice there’s a new design I could use, or some theme improvements I could make to my blog. Sometimes I realize that a topic might be outdated in a couple of months, and I keep asking myself: is it still worth writing?
    Should I write about this at all? Then comes the judgment: wondering whether it’s worth it, or whether people already know about it.

    In a world increasingly filled with AI-generated content, I believe it makes more sense than ever to publish and share what only a human can.
    So what can we do to express our thoughts and ideas?
    Remember: our perspectives are truly unique, and those are the qualities that make humanity so fascinating.

    For example, if you think again about “Should I write this?” you see that there will always—or at least often—be someone who wanted to hear what you had to say but never could, or someone who agreed with your thoughts or could learn from you.
    I’ve seen this happen repeatedly, both in my own experience and with friends. We take our thoughts and ideas for granted, but we should share them more because the world needs ideas, connection, and human voices.

    In an ever-more-connected world, this is what makes us unique and irreplaceable: that connection and shared understanding, those ideas we try to spread, and the way we see the world.
    Those are the reasons that make you, well, you.

    One thing that improved my writing was consistency.
    Lately, I’ve been enjoying the process of dictation, along with tools that add extra features to help me write as I speak.
    You may find you have plenty of ideas but aren’t that good at writing them down.
    Dictation tools like MacWhisper or WisprFlow have come a long way. If you connect them to AI tools, they can clean up the words and further refine your content without losing your voice.
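
    To make that last step concrete, here is a minimal sketch of the idea: feeding a raw dictated transcript to an LLM and asking it to tidy the text without changing your voice. This is my own illustration, not the built-in workflow of any of those tools; it assumes the transcript is saved as transcript.txt, an OPENAI_API_KEY is set in your environment, and the model name is only a placeholder.

        # cleanup_dictation.py: a minimal sketch, not any specific tool's workflow.
        # Assumes transcript.txt exists and OPENAI_API_KEY is set in the environment.
        from openai import OpenAI

        client = OpenAI()  # picks up OPENAI_API_KEY automatically

        # The raw text produced by whatever dictation tool you use.
        raw_transcript = open("transcript.txt", encoding="utf-8").read()

        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; use whichever model you have access to
            messages=[
                {
                    "role": "system",
                    "content": (
                        "Clean up this dictated text: fix punctuation, drop filler words, "
                        "and keep the author's tone and wording intact."
                    ),
                },
                {"role": "user", "content": raw_transcript},
            ],
        )

        print(response.choices[0].message.content)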

    So, whenever you find yourself wondering, “Should I write about this?” I would argue: yes, please do.
    Just focus on the writing; don’t worry about the rest.

    Distracted? Go back to writing or consider dictation as I mentioned.
    Want to change your blog design? Write instead.
    Yes, eventually you might want to improve the design, but writing is the thing you actually want to do, so do that. Don’t work around it, don’t procrastinate. Just write.

    And if you feel you have yet to learn how to write well, or you’re scared of how people will judge your writing, then keep a separate blog, with no comments, where you write daily or at least very frequently.
    Short phrases, long stories, whatever works for you.
    As with many of our skills, writing takes time, and the only person who can put that time in is you.


  • Onboarding AI vs Onboarding Humans

    There’s one thing I realized when Cursor and other AI-integrated development tools came out. Some of them let you set rules to help the AI navigate your codebase better. For example, you can tell the AI how the codebase is meant to be used, where to find the code related to a specific feature, how that code is structured, what the intention behind it was, and whether there is any unusual behavior in the code.

    By behavior, I mean cases where the code might be structured in a certain way but still have edge cases that don’t quite make sense or differ from what you would expect just by looking at the folder structure.

    I’ve started to see some tweets about how great this is, and I agree. It’s great.

    What I realized is that this is exactly the kind of information you share when onboarding a new team member.

    Usually, you either have some documentation in place or, after some time working together, you end up having specific onboarding sessions. During these sessions, you explain things like, “This is why we have this ‘views’ folder in two different paths — because XYZ.” There’s a reason for that structure even if, at first look, it doesn’t make sense.

    Then there are the unwritten rules, like: “If you’re adding something to the first ‘views’ folder, that’s because you want to create a component for a view. On the other hand, if you’re creating something in the other ‘views’ folder, that’s likely because you want to define the actual view.”

    Whether or not my example is confusing, this is exactly the kind of information we also try to include in tools like Cursor rules when developing. It’s curious: we’re trying so hard now to make things clear for an AI, when we weren’t trying nearly this hard before for humans.
    It’s funny in some ways, but it’s also what I think we should have been doing from the start.
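
    As an illustration, the onboarding note for the two hypothetical “views” folders above might be written down roughly like this (the folder paths are made up for the example, and the file it lives in depends on your tooling: a rules file, a README, or a docs page):

        # Views folders (hypothetical example; paths invented to match the scenario above)
        - components/views/ holds reusable components used inside views.
        - pages/views/ holds the actual views themselves.
        - Adding a piece of UI for a view? Put it in components/views/.
        - Defining a new view? Put it in pages/views/.
        - Yes, the duplicated folder name is intentional; the reason is documented in the project docs.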

    We should maintain the code and the accompanying documentation so that someone new can start working on the code as quickly as possible.
    It’s interesting, at least at this moment in time, how similar the needs of an AI are to the needs of a human when it comes to understanding the code and knowing all the quirks and strange logic that might have been implemented.

    The challenge is still the same: how do we keep the mental model, the internal structure of a project, up to date so that the codebase stays readable by humans and easy for AI to navigate?
    Getting this right would help both humans and AI move faster.

    The challenge, at least in the short term, is how to make maintaining this documentation and structure easier.
    For me, looking back at all the coding I’ve done in my life, maintenance has always been the most difficult part. You’re tempted to just ship and move the project forward, because that’s the easy path, but you also want to keep the documentation accurate and up to date. It’s a fine balance, and you end up making tradeoffs.

    My personal take is that we should integrate updating the structure or documentation into the change flow. So, whenever you do a big or small refactoring, AI should update the internal documentation of the features you changed. That way, it remains clear, current, and useful.
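
    One low-tech way to nudge that habit into the change flow is a small check, run in CI or before merging, that flags branches where code changed but the docs did not. The sketch below is only an illustration: the src/ and docs/ paths and the main base branch are placeholders for whatever your project actually uses.

        #!/usr/bin/env python3
        # check_docs_updated.py: a rough "did you update the docs?" nudge for CI.
        # Assumptions: a git repo, a 'main' base branch, code under src/, docs under docs/.
        import subprocess
        import sys


        def changed_files(base: str = "main") -> list[str]:
            """Return the files changed on this branch compared to the base branch."""
            out = subprocess.run(
                ["git", "diff", "--name-only", f"{base}...HEAD"],
                capture_output=True, text=True, check=True,
            )
            return [line for line in out.stdout.splitlines() if line]


        def main() -> int:
            files = changed_files()
            code_touched = any(f.startswith("src/") for f in files)
            docs_touched = any(f.startswith("docs/") for f in files)
            if code_touched and not docs_touched:
                print("Code under src/ changed, but nothing under docs/ was updated.")
                print("If the change affects behavior or structure, update the feature docs too.")
                return 1  # fail the check; make it a warning if that suits your team better
            return 0


        if __name__ == "__main__":
            sys.exit(main())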

    Different tools will handle this differently, and I expect this article to be outdated in less than a year, given how fast AI is moving.

    While writing this article I discovered that Devin.ai includes a kind of internal scratchpad it uses to understand how your code works, plus (thinking about humans again) their deepwiki.com, to ensure humans can still navigate the complexities of a codebase.
    If you’re using Cursor or Windsurf, .cursorrules files are your friend. You can add prompts to them to ensure the agent keeps either a scratchpad (like Devin’s) or the docs (which, in this case, should live in the same codebase) updated at every PR.
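
    The exact wording of such a prompt is up to you. As a purely illustrative example (not an official template, and docs/ is just a placeholder path), a couple of lines in a rules file could read:

        - After any change that touches application code, update the matching page under docs/
          in the same PR, noting any new edge cases or unusual behavior you introduced or found.
        - If no docs page exists for the feature you touched, create one before opening the PR.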

    It’s interesting to see how this will play out in the future. I would suggest everyone keep this in mind because maintaining accurate documentation and structure benefits everyone working with your products.


  • Let’s talk about Burnout, the enemy we cannot ignore

    This article is a translation of the Italian version I published in 2021 and aims to give some ideas on how to handle burnout, both from a personal and a company-wide perspective.

    If you’re currently struggling with burnout, consider seeking help from a professional.
    There are many options available (one of them is Wysa) that can help you work through it.

    Onto the article…

    LinkedIn is a beautiful place where we can list our successes, but not a day goes by without me noticing one key element missing from the conversation. Let’s talk about the dark side of work: burnout.

    What is burnout, and why is knowing it exists not enough?

    Let’s first understand that burnout kicks in after stress:
    when stress, which is usually temporary, persists over a long period of time and is not followed by relaxation;

    when physical stress (fatigue), mental stress (work) or social stress (demands) does not diminish or resolve.

    Burnout can be many things: fatigue, surrender, and more.
    But more than any of these, it’s loneliness.

    All the more so in a society of overachievers, of heroes, of exceptional people, burnout is something we are afraid to show, to talk about.
    Or, even worse, to see in others.

    We brush it off: “it’s just a moment”, “it’ll pass”.
    That’s the voice inside us speaking when we think about our own problems, or when we discover the struggles someone else is going through and see how utterly exhausted they are.

    The truth is: burnout doesn’t heal the way we expect.
    It marks us: a scar is left if we get over it, or we sink if we never heal from it.

    (more…)

  • Why I Started Working Part-Time Instead of Full-Time, and the Benefits of a 4-Day Workweek

    Around August 2015, I decided to step back from my 9-to-5 job and work part-time for the same company, and it was one of the best decisions of my life.
    In this post I want to share the story, the tradeoffs, the benefits and the downsides of this choice, to help anyone who is uncertain about it better understand what might happen.

    But let me rewind for a second. (more…)


  • Fixing Nginx and MariaDB Issues When Upgrading a DigitalOcean Droplet from Ubuntu 14.04.5 to 16.04.1

    Today, after I updated the packages on my blog’s DigitalOcean droplet (here’s my entire WordPress setup), I discovered that it was ready to be upgraded to the new Ubuntu 16.04.1 LTS.

    DigitalOcean already offers a wonderful guide for you to follow, but there were some caveats on my system that needed to be fixed before and after the upgrade.

    Those were:

    • MariaDB repository
    • Nginx failed to restart

    Here’s how I fixed them. (more…)


  • Is DuckDuckGo a Worthy Alternative to Google? My 6-Month Test

    If you’re here, I suppose you already know the DuckDuckGo search engine: a free search engine that doesn’t track you and cares far more about your privacy.

    Here are my thoughts about its pros and cons. (more…)


  • How to Automatically Back Up Your OpenShift Applications with the OpenShift Backup Server

    Did you know there’s a way to automate the backup of your OpenShift apps?
    I didn’t, until I found it on the main OpenShift blog.
    It turns out there’s a whole web app that does it beautifully, and it takes three simple steps to set up a scheduled backup for your application. (more…)


  • Powerful and Cheap WordPress Blog Setup with PHP 7.0, Nginx, Memcached and MariaDB 10.1 on a DigitalOcean 512MB Droplet

    I have a few WordPress blogs now. Some of them have had spikes in visits, some are quite unpopular, but I’ve always wanted an easy and secure way to host them all without sacrificing speed or money.

    After much trial and error, I am quite happy with my current setup, which is:

    • PHP 7.0
    • Ubuntu 14.04
    • Nginx + Memcached
    • MariaDB 10.1
    • WordPress

    All hosted on a 512MB droplet on DigitalOcean (ref.link) but you can easily use any VPS provider. I just like the overall DigitalOcean service and support. (more…)


  • How to migrate a Node.js App from Heroku to OpenShift

    Given the recent price/tier changes at Heroku, I wanted to understand what alternatives we have, as developers, to host a Node.js application free of charge.

    Let me be clear upfront… there are not many alternatives right now, and I can’t praise Heroku enough for having offered a free plan like the old one.
    The new Heroku free plan limits the overall hours your app can stay up, making it not very feasible for apps that need to be always on (even if they are not very resource-intensive).

    So I decided to settle on OpenShift and see how it would go.
    As a test, I tried porting Haptime.in to OpenShift.

    Let me show you what I discovered. (more…)


  • Recovering a Corrupted Embedded Apache Derby Database After Error XJ040 (or if You Got Error XSDG2)

    Apache Derby is usually a solid database, but sometimes it can get corrupted.

    This is what worked for me after several embedded Derby databases became corrupted because we ran out of free space on the partition (yes, shame on us).

    To make things worse, we didn’t always have a recent backup of the DB, which only added to the complexity of our problem. (more…)