Onboarding AI vs Onboarding Humans

There’s something I realized when Cursor and other AI-integrated development environments came out. Some of them let you set rules to help the AI navigate your codebase better. For example, you can tell the AI how the codebase is organized, where to find the code related to a specific feature, how to structure that code, what the intention behind it was, and whether there is any unusual behavior in the code.

By behavior, I mean cases where the code might be structured in a certain way but still have edge cases that don’t quite make sense or differ from what you would expect just by looking at the folder structure.

I’ve started to see some tweets about how this is great and I agree. It’s great.

What I realized is that this is exactly the kind of information you share when onboarding a new team member.

Usually, you either have some documentation in place or, after some time working together, you end up having specific onboarding sessions. During these sessions, you explain things like, “This is why we have this ‘views’ folder in two different paths — because XYZ.” There’s a reason for that structure even if, at first look, it doesn’t make sense.

The unwritten rules go like this: “If you’re working in the first ‘views’ folder, that’s because you want to create a component for a view. On the other hand, if you’re creating something in the other ‘views’ folder, that’s likely because you want to define the actual view.”

Setting aside whether my example is confusing, this is exactly the kind of information we now try to include in tools like Cursor rules when developing. It’s curious: we’re trying so hard to make things clear for an AI, whereas we weren’t trying this hard before for humans.
It’s funny in some ways, but it’s also what I think we should have been doing from the start.

We should maintain the code and accompanying documentation so that someone new can start working on the code as quickly as possible.
It’s interesting, at least in this moment in time, how the needs of AI are similar to the needs of a human in terms of understanding the code and knowing all the quirks and strange logic that might have been implemented.

The challenge is still the same: how do we keep the mental model or internal structure of a project up to date so that it makes AI faster and keeps the codebase readable by humans?
This would help both humans and AI move faster.

The challenge, at least in the short term, is how to make maintaining this documentation and structure easier.
For me, looking back at all the coding I’ve done in my life, maintenance has always been the most difficult part. You’re tempted to just ship and move the project forward because that’s easy. But you also want to keep the documentation up to date and accurate, and that’s a fine balance where you end up making tradeoffs.

My personal take is that we should integrate updating the structure or documentation into the change flow. So, whenever you do a big or small refactoring, AI should update the internal documentation of the features you changed. That way, it remains clear, current, and useful.
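As a minimal sketch of what “integrating it into the change flow” could look like, here is a small check that flags features whose code changed in a commit but whose documentation did not. The `src/<feature>/` and `docs/<feature>.md` layout is an assumption for illustration, not a convention any tool prescribes:

```python
# Sketch of a docs-freshness check for a commit or PR: given the list of
# changed files, flag features whose code changed but whose docs did not.
# The src/<feature>/ and docs/<feature>.md layout is a hypothetical
# convention used only for this example.

def stale_docs(changed_files):
    """Return features with code changes but no matching doc change."""
    touched_code = set()
    touched_docs = set()
    for path in changed_files:
        parts = path.split("/")
        if parts[0] == "src" and len(parts) > 1:
            touched_code.add(parts[1])
        elif parts[0] == "docs" and len(parts) > 1 and parts[1].endswith(".md"):
            touched_docs.add(parts[1][:-3])
    return sorted(touched_code - touched_docs)


if __name__ == "__main__":
    changed = ["src/auth/login.py", "src/views/page.py", "docs/views.md"]
    print(stale_docs(changed))  # prints ['auth']: auth changed, docs didn't
```

A check like this could run as a pre-commit hook or CI step, and its output could be fed straight to the AI as a prompt: “update the docs for these features.”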

Different tools will handle this differently, and I expect this article to be outdated in less than a year, given how fast AI is moving.

While writing this article, I discovered that Devin.ai keeps a kind of internal scratchpad it uses to understand how your code works, and (thinking about humans again) it adds deepwiki.com to ensure humans can still navigate the complexities of a codebase.
If you’re using Cursor or Windsurf, .cursorrules files are your friend. You can add prompts to them to ensure the AI keeps either a scratchpad (like Devin’s) or the docs (which, in this case, should live in the same codebase) updated at every PR.
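As a hedged sketch, a .cursorrules file capturing the two-‘views’-folders example and the maintenance habit might read something like this (the folder names, paths, and wording are mine for illustration; the file is free-form instructions the AI reads, not a required format):

```markdown
# Project conventions
- src/components/views/ holds reusable view *components*.
- src/pages/views/ holds the actual page-level views.
  Yes, there are two "views" folders; the split is intentional.

# Maintenance
- After any refactor, big or small, update the docs under docs/
  for the features you changed, in the same PR.
- If no doc exists for a changed feature, create a short one.
```

The point is that the same file that onboards the AI doubles as onboarding notes for humans.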

It’s interesting to see how this will play out in the future. I would suggest everyone keep this in mind because maintaining accurate documentation and structure benefits everyone working with your products.

