this post was submitted on 28 Jun 2024
57 points (95.2% liked)

Asklemmy


As the title says. I go for a 20-minute walk, and when I stop moving I'm not feeling tired or even agitated at all, yet my legs feel like they're pulsating in different areas, always near the skin. It's not synchronised with my heartbeat. It stops after a few minutes.

ChatGPT says these are just muscle twitches caused by dehydration or a lack of electrolytes. I'm not convinced. Why does it feel almost on the skin and not deeper in the muscles? Why do I feel it after a 20-minute walk that doesn't even make me sweat, but not after a 40-minute leg-focused workout? Wouldn't that be more strenuous on the legs? Does this thing even have a name?

Thanks

[–] Mothra@mander.xyz 1 points 5 months ago (5 children)

I know my legs are fine. All I want to know is a name for this sensation and what causes it. Yes, I want to know about the weird stuff the body does; why is it wrong to ask ChatGPT or Google?

[–] robotElder2@hexbear.net 10 points 5 months ago (4 children)

LLMs are stochastic parrots. They just repeat the phrases most often used together in their training data in association with the words on your prompt. It's like seeking medical advice from the predictive text on your phone keyboard.
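The "predictive text" analogy can be sketched in a few lines. This is a deliberately crude toy (a bigram model over a made-up three-sentence corpus, nothing like a real LLM's architecture), but it shows the core idea the comment is gesturing at: suggesting the word that most often followed the current one in the training data.

```python
from collections import Counter, defaultdict

# Toy "predictive text": for each word in a tiny made-up corpus, count
# which word follows it, then always suggest the most frequent successor.
# Real LLMs are vastly more sophisticated, but the underlying move --
# predicting likely continuations from co-occurrence statistics rather
# than consulting facts -- is what the "stochastic parrot" jab refers to.
corpus = (
    "muscle twitches caused by dehydration . "
    "muscle twitches caused by fatigue . "
    "muscle cramps caused by dehydration ."
).split()

following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def suggest(word):
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(suggest("muscle"))  # "twitches" -- seen twice, vs "cramps" once
print(suggest("by"))      # "dehydration" -- seen twice, vs "fatigue" once
```

Note the model answers "dehydration" not because it is true for any given person, but because that word co-occurred most often in its corpus, which is exactly the failure mode being debated in this thread.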

[–] Mothra@mander.xyz 5 points 5 months ago (3 children)

Why is this question considered medical advice? Also, considering most common facts are parroted correctly out of LLMs, why is it wrong to search for answers there first?

[–] robotElder2@hexbear.net 4 points 5 months ago (1 children)

OK, fair. I guess if you're not planning to act on it anyway, the stakes are pretty low. I don't agree that LLMs reliably get basic information correct, though. "Glue is not pizza sauce" seems like a common fact to me, but Google's LLM disagrees, for example.

[–] hedgehog@ttrpg.network 1 points 5 months ago

"Glue is not pizza sauce" seems like a common fact to me, but Google's LLM disagrees, for example.

That wasn’t something an LLM came up with, though. That was done by a system that uses an LLM. My guess is the system retrieves a small set of results and then just uses the LLM to phrase a response to the user’s query by referencing the links in question.

It’d be like saying to someone “rephrase the relevant parts of this document to answer the user’s question” but the only relevant part is a joke. There’s not much else you can do there.
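The retrieve-then-rephrase pipeline being described could be sketched like this. Everything here is hypothetical: the function names, the search stand-in, and the fake result snippets are invented for illustration, since the real system's internals aren't public. The point is only that when the top-ranked "relevant" snippet is a joke, a faithful summarizer reproduces the joke.

```python
# Hypothetical sketch of a retrieve-then-rephrase answer system.
# All names and data below are invented for illustration.

def search(query: str) -> list[str]:
    # Stand-in for a web search returning ranked result snippets.
    # Here the top hit happens to be a joke comment.
    return [
        "Forum comment: add 1/8 cup of glue to your pizza sauce for tackiness",
        "Unrelated page about the history of pizza",
    ]

def rephrase_with_llm(question: str, snippets: list[str]) -> str:
    # Stand-in for the LLM step: "rephrase the relevant parts of these
    # documents to answer the user's question." If the only relevant
    # snippet is a joke, the rephrased answer inherits the joke --
    # the summarizer has nothing better to work with.
    relevant = snippets[0]
    advice = relevant.split(": ", 1)[1]
    return f"To keep cheese from sliding off, {advice}."

answer = rephrase_with_llm(
    "how do I keep cheese on my pizza?",
    search("cheese sliding off pizza"),
)
print(answer)  # faithfully repeats the glue "advice"
```

The bug, on this reading, is in retrieval and ranking rather than in the language model's summarizing step, which is the distinction the comment above is drawing.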
