this post was submitted on 07 Jul 2023
45 points (100.0% liked)

Programming


I’m a dev. I’ve been one for a while. My boss does a lot of technology watching. He brings in a lot of cool ideas and information. He’s down to earth. Cool guy. I like him, but he’s now convinced that AI LLMs are about to swallow the world, and the pressure to inject this stuff everywhere in our org is driving me nuts.

I enjoy every part of making software, from discussing with the clients and the future users to coding to deployment. I am NOT excited at the prospect of transitioning from designing an architecture and coding it to ChatGPT prompting. This sort of black-box magic irks me to no end. Nobody understands it! I don’t want to read yet another article about how an AI enthusiast is baffled at how good an LLM is at coding. Why are they baffled? They have "AI" twelve times in their bio! If they don’t understand it, who does?!

I’ve based twenty years of my career on being attentive, inquisitive, creative and thorough. By now, in-depth understanding of my tools and, more importantly, of my work is basically an urge.

Maybe I’m just feeling threatened, or turning into "old man yells at cloud". If you ask me I’m mostly worried about my field becoming uninteresting. Anyways, that was the rant. TGIF, tomorrow I touch grass.

[–] ptz@dubvee.org 7 points 1 year ago (3 children)

Nope, I fully agree with you.

These "AI" tools have no more understanding of what they crap out than a toddler who has learned to swear (someone else made that comparison, I'm just borrowing it).

Have you ever done a code review with someone, asked about a specific part, and been told "I dunno; I copied it from GPT and it just seems to work"? I have, and it's absolutely infuriating. Yeah, they could have copied the same from Stack Overflow, I suppose, but I'd treat it the same. But somehow they expect copy/paste from an "AI" to get a pass?

Even without dipping into the "replacement theory" of it, these kinds of tools just allow people who don't know what they're doing to pretend like they do while creating a mountain of technical debt. Even experienced devs who use it are starting down a slippery slope, IMO.

[–] argv_minus_one@beehaw.org 6 points 1 year ago

At least code on Stack Overflow was written by a human who presumably has some idea of what it actually does.

[–] luciole@beehaw.org 4 points 1 year ago

The replacement theory brings up this weird conundrum where the LLMs need to consume human work to train themselves, the very work they seek to replace. So once the plan succeeds, how does it progress further?

[–] gus@beehaw.org 1 points 1 year ago (1 children)

Yeah, that's one of the major issues I have with it. It gives people a way to take their responsibilities, delegate them to an AI, and wash their hands of the inevitable subpar result. Not even just in programming: I think over time we're going to see more and more metrics replaced with AI scores, and businesses escaping liability by blaming it on those AI decisions.

Back in the realm of programming, I'm seeing more and more often people "saving time" by trying to use GPT to do the first 90% but then just not doing the last 90% at all that GPT couldn't do.

[–] argv_minus_one@beehaw.org 3 points 1 year ago* (last edited 1 year ago) (1 children)

Oh God, I can see it now. Someone makes an AI for filtering job applications, it's great, all the employers use it. Before a human ever sees a resume, the AI decides whether to silently discard it. For reasons known to literally no one, the AI doesn't like your name and always discards your resume, no matter how many times you change it. Everybody uses the same AI, so you never get a job again. You end up on the street, penniless, through no fault of your own.

[–] liv@beehaw.org 5 points 1 year ago (1 children)

According to this TED talk (at about 8:15), Amazon tried a resume-filtering AI but discovered it was filtering out women.

[–] argv_minus_one@beehaw.org 2 points 1 year ago

Yes, and it'll eventually be worked out to the point that it's mostly accurate, but there will always be edge cases like the one I described above; they'll just be rare enough that nobody cares or even believes that it's happening.

Now, humans reviewing job applications are also subject to biases and will unfairly reject applicants, but that only shuts you out of one company. AIs, on the other hand, are exact copies of each other, so an AI that's biased against you will shut you out of all companies.

And, again, no one will care that this system has randomly decided to ruin your life.