I have never trusted AI. One of the big problems is that large language models will straight up lie to you. If you have to take the time to double-check everything they tell you, then why bother using the AI in the first place?
If you use AI to generate code, it will often be buggy and sometimes not work at all. There is also the question of whether it just spat out a piece of copyrighted code that could get you in trouble if you use it in something.
I'm using GitHub Copilot every day just fine. It's great for fleshing out boilerplate and other tedious things where I'd rather spend the time working out the logic instead of the syntax. If you actually know how to program and don't treat it as if it can do everything for you, it's a pretty great time saver. Basically an autocomplete on steroids. It integrates right into my IDE and types out code WITH me at the same time, like someone sitting beside you on a second keyboard.
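To make that concrete, here's a hypothetical illustration of the kind of boilerplate completion being described (the class, fields, and suggested bodies are all invented for this example, not actual Copilot output): you type a signature and a docstring, and the assistant fills in the mechanical part.

```python
# Hypothetical example: the developer writes the signatures and docstrings;
# the assistant suggests the repetitive bodies below them.
from dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str
    email: str

    def to_dict(self) -> dict:
        """Serialize this user to a plain dict."""
        # Suggested completion: mechanical field-by-field mapping
        return {"id": self.id, "name": self.name, "email": self.email}

    @classmethod
    def from_dict(cls, data: dict) -> "User":
        """Deserialize a user from a plain dict."""
        # Suggested completion: the mirror image of to_dict
        return cls(id=data["id"], name=data["name"], email=data["email"])
```

Nothing here requires any judgment, which is exactly why it's tedious to type and cheap to review.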
Um... that's a trait AI shares with humans.
You have to double-check human work too. So, since you're going to double-check everything anyway, does it really matter that it's sometimes wrong?
... again, exactly the same as a human. Difference is the LLM writes buggy code really fast.
Assuming you have good testing processes in place, and you'd better have those, AI-generated code is perfectly safe. In fact, it's a lot easier to find bugs in code that you didn't write yourself.
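As a sketch of that point (both the buggy helper and the test are invented for illustration): an ordinary unit test flags a classic off-by-one error regardless of whether a person or a model wrote the code.

```python
# Hypothetical AI-generated helper with a classic off-by-one bug:
# range(1, n) stops at n - 1, so the top value is silently dropped.
def sum_up_to(n: int) -> int:
    """Return 1 + 2 + ... + n (buggy as generated)."""
    return sum(range(1, n))  # correct version would be range(1, n + 1)


# A plain pytest-style test catches it immediately, no matter who wrote it:
# sum_up_to(3) returns 3 instead of 6, so the assertion fails.
def test_sum_up_to():
    assert sum_up_to(3) == 6, f"expected 6, got {sum_up_to(3)}"
```

The test doesn't care about the code's provenance, which is the whole argument: your safety net is the review and testing process, not the author.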
Um, no, that's not how copyright works. You're thinking of patents. But human-written code has the same problem.