this post was submitted on 26 Jun 2023
52 points (100.0% liked)
Technology
The thing is - and this is also what annoys me about the article - AI experts and computational linguists know this. It's the laypeople who end up using (or promoting) these tools, now that they're public, who don't know what they're talking about and project intelligence onto AI that isn't there. The real hallucination problem isn't with deep learning; it's with the users.
The article really isn’t about the hallucinations, though. It’s about the impact of AI; that’s in the second half of the article.
I read the article, yes.
Spot on. I work on AI, and I just tell people, "Don't worry, we're not anywhere close to Terminator or Skynet or anything remotely like that yet." I don't know anyone I work with who wouldn't roll their eyes at most of these "articles" you're talking about. It's frustrating reading some of that crap lol.