Are AI Assistants Making Us Worse Programmers?

This is a good article, thanks for sharing. It aligns with a lot of my experiences using AI tools. It might not be magic, but it certainly feels like a game-changer if you use it wisely.

Zenith said:
This is a good article, thanks for sharing. It aligns with a lot of my experiences using AI tools. It might not be magic, but it certainly feels like a game-changer if you use it wisely.

I completely agree!

I think it can teach you how to approach things differently. But you have to review what it does; it’s not infallible.

Anyway, let’s wait for the laughs.

I’m not sure. But I can say I’ve resolved bugs in seconds that would otherwise have taken me hours.

They aren’t making us worse; they allow unskilled programmers to contribute.

It’s just another tool for simple tasks.

Only if you were already bad.

AI assistants show that people still struggle with GUIs. For 20 years, we could have developed easy ways to access documentation, snippets, and workflows. Yet, here we are, just dumping everything into AI and hoping it works.

Sort of; I’m lucky to have 20 years of experience and enough knowledge to be useful. I can use AI to rough out the framework for a function and then easily spot its mistakes, which often takes about half the time of writing it all myself.

The juniors and mid-levels on my team take AI answers at face value, which usually leads to chaos. But honestly, I’m so burned out that I just let it go; the company can face the consequences, since they treat me poorly and underpay me.

@Zayden
100%. I’m worried for junior developers who don’t remember a time before AI assistants.

It’s basically a more user-friendly Stack Overflow without any moderation.

Would it make you a bad programmer if you’re LEARNING through it? Absolutely.

But if you’re just using it to find alternatives to a solution you already have? I would say it’s just another tool for making things easier.

Brought to you by programmers who already have no clue how the software they use operates.

In my view, yes. It seems like people are too quick to let AI do most of their thinking instead of using it to support their work.

Personally, I’ve only used it as a quick way to find documentation. Things like why this PL/SQL query throws an error, or checking whether there’s a library for URI encoding in a given language.
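For the URI-encoding case, here’s a rough sketch of the kind of answer I’d expect back (TypeScript here, only because I have to pick something; the question didn’t name a language). It usually turns out the runtime already covers it:

```typescript
// Minimal sketch: URI encoding without any third-party library.
// encodeURIComponent is a built-in ECMAScript global, so the "is there a
// library for this?" lookup often just confirms the standard runtime has it.
const raw = "name=Łukasz & co"; // input with reserved and non-ASCII characters
const encoded = encodeURIComponent(raw);

console.log(encoded); // name%3D%C5%81ukasz%20%26%20co
```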

The rise of AI assistants reminds me of when SSDs became the norm for gaming. At first it was just “make your games load faster,” but then studios started requiring them to the point where you had to have one to play. I think about this every time I’m struggling with Rust’s borrow checker and end up handing it off to an AI.

40 years of experience, tried AI for two weeks, and it was just a mess.

It really depends on the dataset.

It can be awful. Sure, it produces code, but it often needs a lot of reworking. Plus, if your code is too long, it can drop sections.

It also gives bad advice if your code isn’t like the average application. But it’s okay for standard JavaScript and HTML, useful for dummy data, and good for conversation.

The science of cognitive offloading suggests it does make us worse, similar to how smartphones make people lazier.

Yes.

One example of many: https://www.mdpi.com/2076-3417/14/10/4115

It makes new programmers weak. I would tell beginners not to copy-paste code they don’t understand or can’t write themselves. This goes for AI too. Even if it sometimes works, if it prevents learning, it does more harm than good.

I don’t know if I can get any worse :joy: