this post was submitted on 22 Apr 2025
247 points (95.2% liked)

Technology

[–] Death_Equity@lemmy.world 23 points 1 day ago (2 children)

They are also the dumbest generation, with a COVID education handicap and the least technological literacy in terms of understanding how things actually work. They have grown up with technology refined enough that they never needed to learn troubleshooting skills beyond "reboot it".

That they don't understand an LLM can't be conscious is not surprising. LLMs are a neat trick, but far from anything close to consciousness or intelligence.

[–] Muaddib@sopuli.xyz 3 points 1 day ago

That they don't understand an LLM can't be conscious is not surprising

It's because they're all atheists! They don't know that souls are granted to humans by the gods.

[–] LorIps@lemmy.world 6 points 1 day ago (1 children)

Have fun with your back problems!

[–] taladar@sh.itjust.works 21 points 1 day ago (2 children)

I wasn't aware the generation of CEOs and politicians was called "Gen Z".

We have to make the biggest return on our investments, fr fr

[–] tal@lemmy.today 10 points 1 day ago* (last edited 1 day ago) (1 children)

At some point in the mid-to-late 1990s, I recall having a (technically inclined) friend who dialed up to a BBS and spent a considerable amount of time pinging and then chatting with Lisa, the "sysadmin's sister". When I heard about it, I spent quite some time arguing with him that Lisa was a bot. He was pretty convinced that she was human.

http://bbs.hmvh.net/hmvh/lisa/LISA.HTM

http://textfiles.com/bbs/install.txt

[–] wagesj45@fedia.io 15 points 1 day ago (15 children)

That's a matter of philosophy and what a person even understands "consciousness" to be. You shouldn't be surprised that others come to different conclusions about the nature of being and what it means to be conscious.

[–] 0x01@lemmy.ml 12 points 1 day ago (4 children)

Consciousness is an emergent property; self-awareness and a singular sense of self are generally its key defining features.

There is no secret sauce in LLMs that would make them any more conscious than Wikipedia.

[–] General_Effort@lemmy.world 3 points 1 day ago (1 children)

secret sauce

What would such a secret sauce look like? Like, what is it in humans, for example?

[–] 0x01@lemmy.ml 2 points 1 day ago (1 children)

Likely a prefrontal cortex, the brain's administrative center and generally the host of human consciousness, along with a dedicated memory system with learning plasticity.

LLMs mirror some of the brain's systems, but they are missing a few key components needed to be precise replicas of it, mostly because modeling those components is computationally expensive and the goal is different.

Some specific things the brain has that LLMs don't directly account for: distinct neurochemicals (LLMs favor a single floating-point value per neuron), synaptogenesis, neurogenesis, synapse fire-travel duration and myelination, neural pruning, potassium and sodium channels, downstream effects, etc. We use math and gradient descent to roughly mirror the brain's Hebbian learning, but we do not perform precisely the same operations using the same systems.
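
The contrast between gradient descent and Hebbian learning can be sketched in a few lines. This is a toy single-neuron illustration (the weights, input, learning rate, and target are made-up values), not how any real LLM is trained:

```python
import numpy as np

# Toy single "neuron": one weight vector, one input vector.
x = np.array([1.0, 0.5, -0.5, 0.2])   # presynaptic activity
w = np.array([0.2, -0.1, 0.4, 0.0])   # synaptic weights
eta = 0.1                             # learning rate

# Hebbian rule: "neurons that fire together wire together".
# The update uses only local co-activity; there is no error signal.
y = w @ x
w_hebb = w + eta * y * x

# Gradient descent on a squared error toward a target output.
# The update is driven by a global error signal instead.
target = 1.0
grad = (y - target) * x               # d/dw of 0.5 * (y - target)**2
w_gd = w - eta * grad
```

The point of the sketch is that the Hebbian update never sees `target`, while the gradient-descent update is defined entirely by the error against it.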

In my opinion, having a dedicated module for consciousness would bridge the gap, possibly while accounting for some of the missing characteristics. Consciousness is not an indescribable mystery; we have performed tons of experiments and gathered a whole lot of information on the topic.

As it stands, LLMs are largely reasonable approximations of the brain's language centers, but little more. Honestly, it may not take much to get what we consider consciousness humming in a system that includes an LLM as a component.

[–] dissipatersshik@ttrpg.network 2 points 1 day ago (7 children)

Why are we so quick to assume machines cannot achieve consciousness?

Unless you can point me to the existence of a spirit or soul, there's nothing that makes our consciousness distinct from what computers are capable of accomplishing.
