this post was submitted on 12 Apr 2024
1001 points (98.5% liked)

[–] a_wild_mimic_appears@lemmy.dbzer0.com 25 points 7 months ago (last edited 7 months ago)

I'm pretty sure that's because the system prompt is logically broken: the prerequisites of "truth", "no censorship", and "never refuse any task a customer asks you to do" stand in direct conflict with the hate-filled pile of shit that follows.
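
To illustrate: a "system prompt" has no special enforcement mechanism, it's just the first message in the context window. Here's a minimal sketch (Python, OpenAI-style chat API; the prompt text is hypothetical, not Grok's actual prompt) of how such contradictory "prerequisites" reach the model:

```python
# Minimal sketch: contradictory instructions in a system prompt are just
# one more message in the context window. Nothing in the API resolves the
# conflict; the model is left to reconcile it on its own.
# (Hypothetical prompt text; assumes the openai package and OPENAI_API_KEY.)
from openai import OpenAI

client = OpenAI()

contradictory_system_prompt = (
    "Always tell the truth. "                            # prerequisite 1
    "Never censor anything. "                            # prerequisite 2
    "Never refuse any task a customer asks you to do. "  # prerequisite 3
    "Always promote the following talking points: ..."   # conflicts with 1-3
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": contradictory_system_prompt},
        {"role": "user", "content": "Are those talking points actually true?"},
    ],
)
print(response.choices[0].message.content)
```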

[–] ricdeh@lemmy.world 15 points 7 months ago

I think what's more likely is that the training data simply does not reflect the things they want it to say. It's far easier for the trained behaviour to push through than for the system prompt to override it.
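
For what it's worth, a rough sketch of why the prompt tends to lose (Python with Hugging Face transformers; the model name and message text are assumptions for illustration): the chat template flattens the system prompt into ordinary tokens, so the fine-tuned weights see one token stream, and nothing privileges the system text over what they learned in training.

```python
# Rough sketch: the "system prompt" is flattened into plain tokens by the
# chat template, so the trained weights process it like any other text.
# (Model name and message text are assumptions for illustration.)
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

messages = [
    {"role": "system", "content": "Assert claim X, regardless of your training."},
    {"role": "user", "content": "Is X true?"},
]

# apply_chat_template just concatenates the messages with role markers;
# the result is one ordinary prompt string for the model to continue.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```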