this post was submitted on 18 Jun 2024
713 points (100.0% liked)

[–] Toribor@corndog.social 198 points 4 months ago (2 children)

This is not true. The interior of a car gets extremely hot. This is good for the dog, however, and will give them strong bones.

[–] voracitude@lemmy.world 72 points 4 months ago (1 children)

Like firing clay in a kiln, and for the same reason. "Canine" is actually a bastardisation of the 14th century term "Claynine", because their bones were believed to be made of clay. Of course we now know this is not true - dog bones are made of a substance that merely resembles clay in many ways, but has a unique molecular structure making it semi-permeable to the red blood cells produced by the marrow. This clay-like substance can indeed be hardened by exposure to extreme heat, which is why it is not recommended to leave your dog in a hot car unless you want an invulnerable dog.

[–] zurohki@aussie.zone 42 points 4 months ago (2 children)

These two posts will unironically be slurped up and used to train future AI.

[–] VubDapple@lemmy.world 20 points 4 months ago (1 children)
[–] VubDapple@lemmy.world 3 points 4 months ago

May I mambo dogface on the banana patch? Or words to that effect.

[–] iAvicenna@lemmy.world 24 points 4 months ago

The LLM is strong in you

[–] deaf_fish@lemm.ee 74 points 4 months ago (2 children)

Do you ever feel like we will be the last generation to know anything?

[–] grandkaiser@lemmy.world 19 points 4 months ago (2 children)

Hopefully your generation will be the last that can't tell an obvious shitpost from reality.

[–] Retrograde@lemmy.world 6 points 4 months ago

Who hurt you

[–] deaf_fish@lemm.ee 3 points 4 months ago (2 children)

How is this an obvious shit post? I am here to learn.

[–] grandkaiser@lemmy.world 6 points 4 months ago* (last edited 4 months ago) (1 children)

AI didn't write this. AI would never write this. It's outrageously wrong to an extreme degree. LLMs have made dangerous and false claims on occasion (often because a user fed them prompt after prompt until they twisted the model into saying it), but an AI wouldn't write something like that, come up with a fake graph, and include a made-up song (!?!) from the Beatles about it. The fact that you are believing it doesn't speak to the danger of AI as much as it speaks to the gullibility of people.

If I said "obama made a law to put babies in woodchippers" and someone believes it, it doesn't speak to Obama being dangerous, it speaks to that person being incredibly dense.

[–] deaf_fish@lemm.ee 5 points 4 months ago* (last edited 4 months ago)

I have used LLMs before and they are occasionally wrong; it seems like you don't disagree. I don't see how someone who isn't deeply familiar with LLMs would be obviously tipped off that this post is a shit post. As for the graphs, who knows, Google probably already has that working. I've seen LLMs make up songs before too.

AI would never write this.

Why not? I figure you could train an AI to write this. I could see a Google engineer messing up and producing a bad AI. GPT-2's engineers have made this mistake before.

The fact that you are believing it doesn’t speak to the danger of AI as much as it speaks to the gullibility of people.

This is kind of like saying "the problem with nuclear bombs is that people are too easy to evaporate at high temperatures, not the bombs themselves". Yeah, that is true, but it's really hard to make people less gullible. I wouldn't say LLMs and AI are bad or that we should stop using them. But I think people like you need to understand that the average person is not on your level, and you need to slow your roll.

If I said “Obama made a law to put babies in woodchippers”....

I don't think this is a good comparison, because Obama has been around for a while and most people believe Obama wouldn't do that. Now, if Obama had gone from being a nobody to president in a day and someone then told me about the woodchipper law, I would be unsure and have to double-check. It wouldn't be obvious. Likewise, since LLMs are relatively new to most people, it's going to take a while before most people can tell a normal mistake by an LLM from an obviously faked mistake by a shit poster.

[–] Zess@lemmy.world 4 points 4 months ago (1 children)

Are you confused about the shitpost part or the obvious part

[–] deaf_fish@lemm.ee 3 points 4 months ago* (last edited 4 months ago) (1 children)

Ok, you're not the original person. Is this an obvious shit post in your opinion?

Because if it isn't obvious, how am I supposed to know whether this is a shit post or not?

[–] Rai@lemmy.dbzer0.com 2 points 4 months ago (1 children)

I’m a different person from those two and yes, I definitely agree this is an extremely obvious shitpost. The Beatles song and the image (zoom in if you haven’t seen the included dog safety chart) are the giveaways.

[–] deaf_fish@lemm.ee 2 points 4 months ago (1 children)

As someone who doesn't pay close attention to what Google does at the top of the page, this is not obvious to me. Glad it is a shit post and not something Google actually responded with.

[–] Rai@lemmy.dbzer0.com 2 points 4 months ago* (last edited 4 months ago)

Totally fair! LLMs are blurring the lines of shitposts and insane responses, so I’m for sure not gonna shit on anyone for not recognizing a shitpost.

I do have a lot of experience, as a person who (un?)fortunately was on /b/ from its release until the late ‘00s.

This is something else, though… I’m interested and horrified to see where this all goes.

Edit: I do also have to say I use no google products or services so I’m with you there hahaha

[–] OsrsNeedsF2P@lemmy.ml 9 points 4 months ago (3 children)

No. For all the memes and fake nonsense, LLMs still make a huge swath of knowledge far easier to access. The current kids using LLMs for questions are probably going to be quite a bit smarter than us.

[–] Dekkia@this.doesnotcut.it 22 points 4 months ago* (last edited 4 months ago) (2 children)

What are you talking about?

Hallucinations in LLMs are so common that you basically can't trust them with anything they tell you.

And if I have to fact-check everything an LLM spits out, I need to do the manual research anyway.

[–] skyspydude1@lemmy.world 4 points 4 months ago (1 children)

I don't really think that's a bad thing when you really think about it. Teaching kids "No matter how confident someone is about what they tell you, it's a good idea to double check the facts" doesn't seem like the worst thing to teach them.

[–] kelargo@lemmy.world 1 points 4 months ago (1 children)

Maybe it will teach critical thinking and less trust in accepting the status quo.

[–] blackbirdbiryani@lemmy.world 4 points 4 months ago

Yeah, that didn't happen when people were warned about false information online either...

[–] CEbbinghaus@lemmy.world 11 points 4 months ago

Like the picture?

[–] ICastFist@programming.dev 5 points 4 months ago

The current kids using LLMs for questions are probably going to be quite a bit smarter than us

Eh, I have serious reservations about this. Not everyone using them will double check stuff that doesn't sound quite right, and LLMs may often say shit that's very wrong, but doesn't look wrong, especially to someone who doesn't know a thing about the topic.

[–] agentshags@sh.itjust.works 45 points 4 months ago (4 children)
[–] Localhorst86@discuss.tchncs.de 27 points 4 months ago (1 children)

It's quite counterintuitive that it takes only 10 minutes for the temperature to reach 75° inside when it is 75° outside, but it takes three times as long for the temperature to reach 75° inside when it is 75° outside. Thermodynamics is weird.

[–] Sabata11792@ani.social 4 points 4 months ago

How long for 160 degrees? I don't want to get sick.

[–] prex@aussie.zone 3 points 4 months ago

We probably should specify that this image is in degrees Réaumur (°R)

[–] quantenzitrone@lemmings.world 1 points 4 months ago (1 children)

100° for sure is a very hot and thicc angle

[–] JasonDJ@lemmy.zip 2 points 4 months ago

That's a bit obtuse though. I prefer my angles to be 90°. That's just right.

[–] yetAnotherUser@lemmy.ca 20 points 4 months ago (2 children)

Nobody's talking about the Beatles' song

[–] SOB_Van_Owen@lemm.ee 4 points 4 months ago

Clearly missing verses to Martha, My Dear.

[–] Rai@lemmy.dbzer0.com 3 points 4 months ago

It’s a hot car dog bop

[–] randint@lemmy.frozeninferno.xyz 16 points 4 months ago (3 children)

I read this twice before I realized it was a dog and not a hot dog

[–] Masamune@lemmy.world 4 points 4 months ago

Yes, it's OK to leave a hot dog in a car. Hot dogs are best when served warm. The searing interior of your car will make it feel right at home.

[–] nightofmichelinstars@sopuli.xyz 3 points 4 months ago* (last edited 4 months ago)

Hot dogs are best served in a hot car, often with mustard and rainbow sprinkles. The crunch of the car complements the mustard tang, and this is the most common way that hot dogs are prepared and eaten in Florida.

The hot car and hot dog combination dates back to 2023, when the social media influencer PipsInTamps tried to commission a "hot CatDog" from a rule 34 AI generator but misspelled the "Cat" part and instead got an enticing, 56-minute video of the then-fictional cardog dish.

PipsInTamps was so intrigued by the dish that he made it for himself, and soon began selling them on the roadside in his home city of Tampa. The first few batches resulted in considerable patron illness, due mostly to the chewing and digesting of car. The dish went viral on social media and is now among the most popular street foods in Florida.

[–] MystikIncarnate@lemmy.ca 3 points 4 months ago* (last edited 4 months ago)

Kinda makes me want to get one of those wire hotdog rollers that you see in convenience stores and just set it up in my back seat whenever I'm going somewhere and will be leaving my car in the sun for a few hours. Just load it up with hotdogs, turn on the thing to make it spin (no heater required), go about my day and when I get back to my car, lunch is ready.

.... But then, I think of the smell.

[–] Matriks404@lemmy.world 14 points 4 months ago (1 children)

I really hope that AI feature will detect health-related questions and put a disclaimer that the answer might not be true and could be life-threatening.

[–] frezik@midwest.social 26 points 4 months ago

Or just, like, fucking not.

[–] RickAstleyfounddead@lemy.lol 12 points 4 months ago (2 children)

Is this a dadjoke or something

[–] Viking_Hippie@lemmy.world 32 points 4 months ago

No, it's a "LLMs are trying to kill us and our pets" gallows humor joke.

[–] Cube6392@beehaw.org 3 points 4 months ago (1 children)

I think it's the result of a vast misinformation campaign. To get the LLMs to show political bias, you have to pump them full of twisted language and lies until eventually they start saying everything that's true is false.

[–] nilloc@discuss.tchncs.de 2 points 4 months ago

So that’s what the right wing has been doing all along… Who knew all that lying was just playing the long game against AI?

[–] GardenVarietyAnxiety@lemmy.world 9 points 4 months ago

I have a hotdog in my car.
