What the h... (lemmy.world)
submitted 1 year ago by sysadmin@lemmy.world to c/memes@lemmy.ml
[-] Kichae@kbin.social 1 points 1 year ago

Yuuuup.

Language models, like any statistical model, only interpolate from what they've been trained on. They can easily answer questions they've seen answered a million times, but they do so through stored word association, not reasoning.
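The point above can be sketched with a toy bigram model (a deliberately crude stand-in for an LLM, with a made-up miniature corpus): it "answers" by following the most frequent word association it has seen, and has nothing to say about inputs it never saw.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; the repeated phrasing dominates the counts.
corpus = (
    "fever and cough suggests flu . "
    "fever and cough suggests flu . "
    "fever and rash suggests measles ."
).split()

# Count which word follows which -- this is all the "model" stores.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word):
    # Pick the most common continuation: pure stored association,
    # no reasoning about whether it fits the actual case.
    options = follows.get(word)
    return options.most_common(1)[0][0] if options else None

print(predict("suggests"))   # the majority association wins: 'flu'
print(predict("reasoning"))  # never seen in training: None
```

Real LLMs are vastly more sophisticated interpolators than this, but the failure mode is the same in kind: describe your symptoms in unfamiliar words and you fall into the `None` bucket, or worse, get snapped to the nearest popular association.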

In other words, describe your symptoms in a way that isn't popular, and you'll get "misdiagnosed".

And they have a real problem with making up citations of every kind: fabricated textbooks, newspaper articles, legal decisions, and entire academic journals. They can recognize the citation pattern and reproduce it, but because any given citation is rare compared to other word combinations (most papers get cited dozens of times, not the millions of times LLMs need to form confident associations between words), they just fill the citation format with basically whatever.
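A crude sketch of that failure mode (entirely hypothetical names and journals): once a model has learned the *shape* of a citation but has too few repeats of any real one, each slot gets filled with whatever fits the format, producing a fluent but fabricated reference.

```python
import random

random.seed(0)  # reproducible for the example

# Slot vocabularies the "model" has soaked up from many citations --
# none of these specific combinations existed in training.
authors = ["Smith", "Garcia", "Chen", "Okafor"]
years = [str(y) for y in range(1995, 2021)]
journals = ["J. Appl. Stud.", "Intl. Rev. Sci.", "Acta Examplica"]

def fabricate_citation():
    # The format is right; the content is an arbitrary recombination.
    return (f"{random.choice(authors)} ({random.choice(years)}). "
            f"Some Plausible Title. {random.choice(journals)}.")

print(fabricate_citation())
```

The output looks exactly like a citation, which is precisely why these fabrications are so hard to spot without checking the source.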

this post was submitted on 20 Jun 2023
11 points (76.2% liked)
