[–] taaz@biglemmowski.win 9 points 9 months ago* (last edited 9 months ago) (3 children)

Correct me if I'm wrong here, but isn't this like the best example of why the current "AI" isn't taking over anything anytime soon and shouldn't be doing critical stuff?
Like, this is almost exactly how current LLMs work.

Edit: yeah no, I was wrong on the internet! I was sleepy, and I think I imagined that the secondary scenario never occurred in the training dataset, requiring a true deduction...?

[–] HandMadeArtisanRobot@lemmy.world 17 points 9 months ago (1 children)

Yeah, you are wrong. This has nothing to do with LLMs or with how AI works today. What led you to that conclusion?

[–] taaz@biglemmowski.win 3 points 9 months ago (1 children)

I've edited the comment with some extra detail, but I'd still rather say yesterday's me was just high, and act like this never happened haha

[–] Draconic_NEO@lemmy.dbzer0.com 8 points 9 months ago

That's not how LLMs work, though; an LLM would know the difference between these scenarios from the context given. I'd go as far as to say it isn't even ML-related: it's just a joke about defining a global variable and blindly using it everywhere.
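
The joke, as a rough Python sketch (the function names and the temperature value here are made up purely for illustration):

```python
# One global "hot", defined once and used blindly everywhere,
# with no regard for what is actually being heated.
HOT = 1_900_000  # kelvin; an absurd, context-free constant

def brew_tea() -> str:
    return f"Steeping tea at {HOT} K"   # tea at 1.9 million kelvin

def set_stove() -> str:
    return f"Stove set to {HOT} K"      # same constant, same mistake

print(brew_tea())
print(set_stove())
```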

[–] stevehobbes@lemy.lol 5 points 9 months ago* (last edited 9 months ago)

No. LLMs have context and know that words have context. This would be the exact opposite of "AI". It's analogous to defining a global variable "hot" as 1.9 million kelvin and then blindly using that value everywhere the word hot appears.

AI, even in its current iterations, knows that a hot stove will be hotter than hot tea. And both are less hot than the surface of the sun.

The whole achievement of LLMs is that they learn all of that context: to guess, with some percentage of certainty, that when you're talking about hot tea you mean 160-180 degrees or whatever; that hot oil might be 350 degrees if you're frying, or 250 degrees if you're talking about cars; and that if you're talking about people, hot means attractive.
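
As a toy Python sketch of that point (the hand-written lookup is obviously a stand-in: a real LLM learns these associations statistically from text, not from an explicit table):

```python
# Toy illustration only: what "hot" resolves to depends on context,
# unlike the single global constant in the joke.
def interpret_hot(context: str) -> str:
    meanings = {
        "tea": "roughly 160-180 degrees",
        "frying oil": "around 350 degrees",
        "car oil": "around 250 degrees",
        "person": "attractive, not a temperature at all",
    }
    return meanings.get(context, "depends on the context")

for ctx in ("tea", "frying oil", "car oil", "person"):
    print(f"hot {ctx} -> {interpret_hot(ctx)}")
```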

That's exactly what LLMs do today. Not 100% perfectly; there are errors and hallucinations and whatever else, but those are the exception, not the norm.