submitted 1 year ago by Alfa@lemmy.world to c/chatgpt@lemmy.world
[-] relevants@feddit.de 8 points 1 year ago

It's because humans rated candidate responses and ChatGPT was trained to generate the kind of response that raters most consistently preferred. You can imagine how an AI trained to say what people want to hear becomes a people-pleaser.
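The preference-rating setup described above can be sketched as fitting a scalar "reward model" so that human-preferred responses score higher, using the pairwise Bradley-Terry loss `-log sigmoid(r(preferred) - r(rejected))`. This is a toy illustration of the general technique, not OpenAI's actual training code; the feature vectors and data are made up.

```python
# Minimal sketch of preference-based reward modeling (the idea behind RLHF).
# Hypothetical toy setup: responses are 4-dim feature vectors, the reward
# model is linear, and it is fit with the pairwise Bradley-Terry loss:
#   loss = -log sigmoid(r(preferred) - r(rejected))
import math
import random

random.seed(0)
DIM = 4  # toy feature dimension standing in for a response embedding


def reward(w, x):
    """Scalar score the model assigns to a response's feature vector."""
    return sum(wi * xi for wi, xi in zip(w, x))


def train_reward_model(pairs, epochs=200, lr=0.1):
    """pairs: list of (preferred_features, rejected_features) tuples."""
    w = [0.0] * DIM
    for _ in range(epochs):
        for pref, rej in pairs:
            margin = reward(w, pref) - reward(w, rej)
            # gradient of -log(sigmoid(margin)) with respect to the margin
            g = -1.0 / (1.0 + math.exp(margin))
            for i in range(DIM):
                w[i] -= lr * g * (pref[i] - rej[i])
    return w


# Toy data: responses with a higher last feature (say, "agreeableness")
# are always the ones humans preferred.
pairs = []
for _ in range(50):
    rej = [random.random() for _ in range(DIM)]
    pref = rej[:3] + [rej[3] + 1.0]
    pairs.append((pref, rej))

w = train_reward_model(pairs)
# The model learns to reward the trait raters consistently preferred, so a
# policy optimized against this reward produces more of that trait.
assert reward(w, [0.0, 0.0, 0.0, 1.0]) > 0.0
```

In the full pipeline, the language model is then fine-tuned (e.g. with a policy-gradient method) to maximize this learned reward, which is how consistently preferred traits like agreeableness get amplified.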

this post was submitted on 03 Aug 2023
115 points (89.1% liked)