I wanted to write this article after The Primeagen made a YouTube reaction video about the claim that AI is making people dumb. As someone who used AI to generate code before Cursor was a thing, I have a few things to say about this.
Are we going to have the same discussions our parents had with us when we started using pocket calculators in math class? I can still hear my mom saying, “stop using the pocket calculator, or you will never learn to do math.” But we learned math anyway, so the pocket calculator was just a helping tool. And I don’t feel dumb, to be honest.
Now, I know there are studies out there claiming AI makes developers slower. I’ve seen headlines about how developers thought they were 20% faster with AI but were actually 19% slower. And you know what? I believe those results, but not for the reasons the researchers think. The problem isn’t the tool, it’s how people used it in the study.
Running a controlled experiment where you hand developers AI tools they’ve never used before and measure their speed is like giving culinary students iPads to watch cooking courses and then being shocked when they use them as cutting boards. You’re not measuring the tool’s potential, you’re measuring unfamiliarity and misuse.
The reality is that AI tools require a learning curve and a fundamental shift in workflow. You can’t just drop them into someone’s lap and expect instant productivity gains. It takes time to learn when to use them, when to ignore them, and how to integrate them into your thinking process. Those studies measure the wrong thing at the wrong time.
The pattern repeats itself
If we look back at history, this isn’t the first time we’ve had this conversation. When books became widely available, scholars worried that people would stop memorizing important information. When search engines appeared, teachers feared students would never truly learn anything. When GPS navigation arrived, people said we’d lose our sense of direction.
Yet here we are, arguably more knowledgeable and capable than ever before. The tools didn’t make us dumb, they freed up mental space for more complex thinking. We don’t need to memorize phone numbers anymore, but we can tackle problems our grandparents couldn’t even imagine.
Nothing makes you dumb except being one.
This blaming reflex is what gets to me. Whenever we face a problem, we tend to say it’s someone else’s fault, or that the circumstances were stacked against us. I think that’s exactly what’s happening with this topic: some people are failing to keep up with the tech, and suddenly AI is to blame.
Let me state my conclusion up front: AI is not making you dumb. You are simply not exercising your brain enough to keep it sharp, that’s all.
Maybe you are using it the wrong way.
Since the dawn of the AI coding era, people have tried to find a full replacement for programmers by using AI models to code entire applications, an approach now commonly called “vibe coding.”
Now, AI is great at sharing ideas and exploring different ways to do things, but what AI models don’t have is real reasoning. Their “reasoning” is still mathematical and probability-based, so at the end of the day, human reasoning remains superior (and not all humans’, to be honest).
And this is the root of the problem. People look at AI as a way to trick the system and replace themselves at their job while they drink a piña colada on the beach. The bad news is that it doesn’t work like that: AI needs supervision.
I’m all in for AI coding under supervision. I code daily with AI but I review every piece of code. I decide the architecture, I take the responsibility for the outcome, and when it blows up I’m responsible for it, so I’d rather understand every line of code generated.
If at some point I lose my understanding of the generated code, the project is doomed. I have enough experience with AI coding to know that eventually the model will lose the context of the project and no longer grasp its overall concept, which will confuse both of us.
So, if you don’t want to be dumb, don’t act like it! Don’t fire off a prompt and walk away from the PC. You still need to understand the result, and you still need to adjust it as best you can; that’s what keeps your mind sharp.
I call it “co-op coding,” which means that there are two entities working there, not one.
The skill shift: What actually matters now
Here’s something interesting that most people miss in this debate: AI isn’t replacing skills, it’s shifting which skills are valuable. When was the last time you needed to remember syntax for a rarely-used function? AI handles that instantly. But can AI understand your business requirements? Can it architect a scalable system that will work for your specific use case? Not really.
The skills that matter now are higher-level: problem decomposition, critical thinking, understanding trade-offs, and knowing when the AI is giving you garbage. These are the same skills that separate senior developers from junior ones, and AI hasn’t changed that. If anything, it’s made these skills more important because now you need to evaluate AI output on top of everything else.
I’ve noticed something in my own work: I spend less time on boilerplate and more time on architecture decisions. Less time googling syntax, more time thinking about user experience. The cognitive load has shifted, not disappeared. And honestly? The problems I’m solving now are more interesting than debugging a missing semicolon.
The learning curve paradox
There’s a paradox I’ve observed: AI can actually make you learn faster, but only if you use it correctly. When I’m exploring a new technology or framework, I can ask AI to explain concepts, generate examples, and help me understand patterns much faster than reading through documentation alone. It’s like having a patient tutor available 24/7.
But here’s the catch, this only works if you’re actively engaged. If you copy-paste without understanding, you’re not learning, you’re just moving code around. The difference is intention. Are you using AI to accelerate your understanding or to avoid it entirely?
I’ve started treating AI like a sparring partner for learning. I ask it to explain something, then I challenge that explanation. I ask “why” repeatedly. I ask it to show me alternatives. This active engagement keeps my brain working while still benefiting from the speed AI provides.
The dependency trap (and how to avoid it)
Let’s be honest, there is a real risk of becoming dependent on AI in an unhealthy way. I’ve seen developers who panic when their AI coding assistant goes down, or who genuinely don’t know how to solve a problem without asking AI first. That’s not co-op coding, that’s outsourcing your thinking.
The way I avoid this trap is by regularly coding without AI. Sometimes I’ll spend a session deliberately not using any AI tools, just to make sure I still can. It’s like a musician practicing scales, it keeps the fundamentals sharp even when you’re using advanced tools for actual performance.
Another thing I do is set boundaries. AI is great for generating test cases, creating boilerplate, or exploring API options. But core logic? System design? Those critical decisions? I make those myself first, then maybe consult AI for validation or alternative perspectives. The order matters.
What about the next generation?
A common concern I hear is: “What about junior developers or students who start with AI from day one? Won’t they miss the fundamentals?” It’s a valid question, but I think it misses something important about how learning actually works.
People who grew up with calculators still learned math. People who grew up with GPS can still read maps if needed. The fundamentals get learned differently, but they still get learned, as long as the person is motivated to understand rather than just to complete tasks.
I agree that schools and teachers have a real challenge nowadays because every student can use AI on homework. But this is an issue with how our educational system decided to teach and test. They need to adjust their approach NOW, and that’s their responsibility, not ours. I’m not going to feel bad about a teacher who’s paid to educate but hasn’t kept up with technology. Teaching requires continuous learning first. If you’re not evolving with the times, you’re in the wrong profession.
There’s nothing wrong with a kid who looks for the fastest way to solve a problem. If that student figured out how to leverage AI, good for them, that’s exactly the skill they’ll need in society. The ability to use available tools effectively is far more valuable than passing trivia tests that measure memorization over understanding.
The real question is whether we’re teaching critical thinking alongside tool usage. Are we teaching students to question AI output? To understand why something works, not just that it works? Are we designing assessments that test understanding rather than just completion? If we do that, they’ll be fine. If we just teach them to prompt and pray, then yes, we have a problem.
The creative advantage
Here’s something positive that doesn’t get talked about enough: AI can actually make you more creative, not less. When you’re not bogged down in syntax and boilerplate, your brain has more space for creative problem-solving. You can prototype ideas faster, test multiple approaches, and iterate more quickly.
I’ve built projects with AI that I wouldn’t have attempted before, simply because the initial setup time has dropped dramatically. This means more experimentation, more learning through doing, and more creative solutions. The constraint was never my creativity, it was time.
Think about artists who embraced digital tools. Did Photoshop make them less creative? No, it gave them new ways to express their creativity. AI is the same for developers. It’s a new medium, and like any medium, it rewards those who master it rather than just dabble with it.
Measuring intelligence in the AI era
Maybe the real issue is that we’re using outdated metrics to measure intelligence. We used to value memorization, quick recall, and the ability to write perfect syntax from memory. But are those really the markers of intelligence, or just the markers of what was necessary before we had better tools?
Intelligence in the AI era looks different. It’s about synthesis, taking information from multiple sources and creating something new. It’s about judgment, knowing when to trust AI and when to question it. It’s about adaptability, learning new tools and techniques as they emerge. These are all forms of intelligence that matter just as much, if not more, than traditional markers.
The developers I respect most aren’t the ones who can write bubble sort from memory; they’re the ones who can jump into a 12-year-old codebase and fix bugs in it. AI doesn’t threaten any of those abilities.
Conclusion
The debate about AI making us dumb is missing the point entirely. Tools don’t make us dumb, our relationship with them does. Just like calculators didn’t destroy our ability to do math, AI won’t destroy our ability to think, code, or create. What matters is how we use it.
AI is a powerful amplifier. If you use it to avoid thinking, it will amplify your laziness. If you use it to explore ideas faster, validate approaches, and handle repetitive tasks while you focus on architecture and critical decisions, it will amplify your productivity and skills.
The skills that matter have shifted, not disappeared. Critical thinking, system design, problem decomposition, and quality judgment are more valuable than ever. The difference is that now you also need the meta-skill of working effectively with AI, knowing when to use it, when to trust it, and when to override it.
So next time someone tells you AI is making people dumb, ask them: Are they using it as a crutch or as a catalyst? Are they copying without understanding or learning faster with a powerful tutor? The tool isn’t the problem, it never was. It’s all about whether you choose to stay engaged, curious, and in the driver’s seat.
The future belongs to those who can think critically while leveraging powerful tools, not to those who refuse the tools out of fear or those who blindly trust them without thinking. Find that balance, and you’ll be sharper than ever before.
Note: I’m not a native English speaker nor a designer; so the thumbnail and the grammar check for this article were made with AI.