this post was submitted on 17 Nov 2023
Google is embedding inaudible watermarks right into its AI-generated music

Audio created using Google DeepMind's Lyria AI model will be watermarked with SynthID to let people identify its AI-generated origins after the fact.
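SynthID's actual scheme is not public, but the general shape of an inaudible audio watermark can be sketched in a few lines. This toy example (the constants, function names, and embedding method are my own illustration, not Google's) hides a faint fixed-frequency tone near the top of the audible range, then detects it by checking that frequency bin of the spectrum:

```python
import numpy as np

SR = 44_100          # sample rate (Hz)
MARK_FREQ = 18_500   # carrier near the top of the audible range
MARK_AMPL = 0.01     # quiet relative to the music

def embed_watermark(audio: np.ndarray) -> np.ndarray:
    """Mix a very quiet fixed-frequency tone into the signal."""
    t = np.arange(len(audio)) / SR
    return audio + MARK_AMPL * np.sin(2 * np.pi * MARK_FREQ * t)

def detect_watermark(audio: np.ndarray) -> bool:
    """Look for excess energy in the carrier's frequency bin."""
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1 / SR)
    bin_idx = np.argmin(np.abs(freqs - MARK_FREQ))
    # the carrier bin should tower over the median background energy
    return bool(spectrum[bin_idx] > 5 * np.median(spectrum))

# one second of fake "music" (white noise)
rng = np.random.default_rng(0)
music = 0.1 * rng.standard_normal(SR)

print(detect_watermark(music))                   # False
print(detect_watermark(embed_watermark(music)))  # True
```

A production watermark such as SynthID is designed to be far more robust (it has to survive compression, editing, and re-recording), but the embed-then-statistically-detect shape is the same.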

[–] interceder270@lemmy.world -4 points 1 year ago (3 children)

What if an AI writes a song about its own experience? Like how people won't take its music seriously?

[–] WillFord27@lemmy.world 8 points 1 year ago (2 children)

It will depend on whether or not we can empathize with its existence. For now, I think almost all people consider AI to be just language learning models and pattern recognition. Not much emotion in that.

[–] crispy_kilt@feddit.de 3 points 1 year ago* (last edited 1 year ago)

just language learning models

That's because they are just that. Attributing feelings or thought to LLMs is about as absurd as attributing them to Microsoft Word. LLMs are computer programs that self-optimise to imitate the data they've been trained on. I know ChatGPT is very impressive to the general public, and it feels like talking to something that understands you, but it isn't. The model doesn't understand what you're saying, and it doesn't understand what it is answering. It's just very good at generating fitting output for a given input, because that's what it has been optimised for.
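The "imitates the training data" point can be made concrete with the smallest possible language model, a bigram table. This is a deliberately tiny sketch (the corpus and names are invented for illustration), but the mechanism, predicting the next word purely from observed frequencies, is the same family of trick scaled down:

```python
import random
from collections import defaultdict

# A toy bigram "language model": predict the next word purely from
# which words followed which in the training text.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

model = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev].append(nxt)

def generate(start: str, length: int, seed: int = 0) -> str:
    """Walk the table, imitating the training data one word at a time."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        options = model.get(words[-1])
        if not options:  # dead end: this word was never followed by anything
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("the", 8))
```

Nothing in `model` understands cats or mats; it only records which word followed which. An LLM replaces the lookup table with a trained neural network and a much longer context, but it is still optimised to produce fitting continuations, not to understand them.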

[–] interceder270@lemmy.world 1 points 1 year ago

Glad you're at least open to the idea.

[–] Inmate@lemmy.world 4 points 1 year ago* (last edited 1 year ago) (1 children)

"I dunno why it's hard, this anguish--I coddle / Myself too much. My 'Self'? A large-language-model."

[–] WillFord27@lemmy.world 2 points 1 year ago (1 children)

I noticed some of your comments are disappearing from this thread. Is that you or the mods?

[–] Inmate@lemmy.world 1 points 1 year ago

I'm getting nuked from another thread

[–] wildginger@lemmy.myserv.one 2 points 1 year ago (1 children)

Language models don't experience things, so it literally can't do that. In the same way, an equation doesn't experience the things its variables are intended to represent in human understanding.

Calling language models AI is like calling skyscrapers trees. I can sorta get why you could think it makes sense, but it betrays a deep misunderstanding of construction and botany.

[–] interceder270@lemmy.world 1 points 1 year ago (1 children)

What makes your experiences more valid than that of AI?

[–] wildginger@lemmy.myserv.one 3 points 1 year ago (2 children)

It is not a measure of validity. It is a lack of capacity.

What is the experience of a chair? Of a cup? A drill? Do you believe motors experience, while they spin?

Language models aren't actual thought. This isn't a discussion about whether non-organic thought is equivalent to organic thought. It's an equation that uses words and the written rules of syntax instead of numbers. It's not thinking, it's a calculator.

The only reason you think a language model can experience is because a marketing man misattributed the name "AI" to it. It's not artificial intelligence. It's a word equation.

You know how we get all these fun and funny memes where you rephrase a question and get a "rule-breaking" answer? That's because it's an equation, and different inputs avoid parts of the calculation. Thought doesn't work that way.

I get that the calculator is very good at calculating words. But that's all it is. A calculator.

[–] interceder270@lemmy.world 3 points 1 year ago (2 children)

What makes your thoughts 'actual thought' and not just an algorithm?

[–] WillFord27@lemmy.world 1 points 1 year ago (1 children)

Personally, I choose to believe that the people around me are real. In theory, you can't trust anyone but yourself. I know language models don't have humanity. I guess that's the difference.

[–] interceder270@lemmy.world 0 points 1 year ago* (last edited 1 year ago)

Yeah man. You can choose to believe whatever you want.

Some food for thought: https://www.youtube.com/watch?v=2L9RZYguI0Q

[–] wildginger@lemmy.myserv.one -1 points 1 year ago (1 children)

Oh, that's easy. I'm not an equation running on a calculator.

[–] interceder270@lemmy.world 1 points 1 year ago (1 children)

You only have thoughts because of electricity.

Do you believe in the existence of a soul or some other god-gene that separates us from machines?

[–] wildginger@lemmy.myserv.one -2 points 1 year ago (1 children)

No one said anything about electricity. A calculator can exist on paper, or as stones and sticks.

No one said anything about souls. Please don't make up shit no one said.

I am not an equation. I do not take X input to produce Y output. My thoughts do not require outside stimuli. My thoughts do not give the same output for the same input. I can think, ambulate, and speak inside a dark room with no stimulus, based entirely on my own thoughts.

ChatGPT and other language models are equations. They trick you by using random number generation to simulate new outputs to repeated inputs, but if you open the code running the equation and fix the RNG to a set seed, you get the same output for each input.

It's not thought. It's an equation.
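The fixed-RNG point is easy to demonstrate. The hypothetical toy sampler below stands in for an LLM's decoding step: the "variety" in its answers comes entirely from the random number generator, so seeding it identically reproduces the exact same output every time:

```python
import random

# A hypothetical stand-in for an LLM's sampling step: pick the next
# token from a fixed probability table using an explicit RNG.
NEXT = {"the": [("cat", 0.5), ("dog", 0.3), ("mat", 0.2)]}

def sample_next(token: str, rng: random.Random) -> str:
    words, weights = zip(*NEXT[token])
    return rng.choices(words, weights=weights, k=1)[0]

# A fresh RNG with the same seed every time -> identical "creative" output.
runs = [sample_next("the", random.Random(42)) for _ in range(5)]
print(len(set(runs)))  # 1: the variety was only ever the RNG
```

This mirrors how fixing the seed (or sampling greedily at temperature 0) makes real LLM inference reproducible, modulo nondeterministic floating-point kernels on some hardware.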

I am not saying non-organic thought isn't possible. I am saying that a salesman pointed at a very, very, very big calculator and said "it definitely thinks! It's more than an equation!" And you, along with a lot of news outlets, fell for it.

We do not have machine brains yet. Someone just tried to sell calculators as if they were.

[–] interceder270@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

I mentioned electricity because it's what both our brains and machines run on. It doesn't have to be electricity, as long as the results are identical to intelligence.

No one said anything about souls. Please dont make up shit no one said.

If you don't believe in something that separates life from machines, then there is nothing stopping us from manufacturing a machine that has identical processing capabilities as our own.

I am not an equation. I do not take X input to produce Y output. My thoughts do not require outside stimuli. My thoughts do not give the same output for the same input. I can think, ambulate, and speak inside a dark room with no stimulus, based entirely on my own thoughts.

Yeah, this is where you're wrong, and I can understand why you think you're different from AI. I guess you came up with your language all on your own. I guess you developed all of your ideas without outside stimuli. Heck, you aren't even being shaped by this conversation right now. Lol.

I am not saying non-organic thought isn't possible. I am saying that a salesman pointed at a very, very, very big calculator and said "it definitely thinks! It's more than an equation!" And you, along with a lot of news outlets, fell for it.

We do not have machine brains yet. Someone just tried to sell calculators as if they were.

We can agree on this. Keyword is 'yet'.

[–] wildginger@lemmy.myserv.one -2 points 1 year ago (1 children)

Do.... No, hold on. You just said the stupidest thing I've seen so far in this AI debate. Do you think learning is only possible if you are an equation?

If I lock you in a dark, soundproofed room, can you talk? Can you still think? Can you create new thoughts? If I take away all possible X inputs, are you suddenly paralyzed, with no way to create new Y outputs? (The answer is, obviously, that you can still think without fresh, constant inputs.)

Lock ChatGPT in the same room. Can it respond? Can it develop new outputs with no input? Can it change its internal understanding? If you leave ChatGPT alone, unbothered, will its internal data shift? Will the same X inputs suddenly produce different Y outputs, even with fixed RNG? (No. No it won't.)

Retaining information is not an equation. My memories and ChatGPT's server storage are not what make us definably different. The actual processing of the information is. ChatGPT takes an input, calculates it, spits out an output, and then ceases. Stops. Ends. The process completes, and the equation terminates.

You and I don't black out when we don't get inputs. We generate multilevel thoughts completely independently, and often unpredictably and unreproducibly.

You're falling for a very complex sleight-of-hand trick.

[–] interceder270@lemmy.world 2 points 1 year ago* (last edited 1 year ago) (1 children)

So you're saying it's impossible to make a machine that responds in an identical manner as intelligent life in the conditions you describe?

I suggest you re-read my last sentence. Perhaps it will clear some things up for you.

[–] wildginger@lemmy.myserv.one -2 points 1 year ago (1 children)

I am not saying non-organic thought isn't possible

.

I am not saying non-organic thought isn't possible

For fuck's sake, quit making up shit I didn't say because you failed to find a gotcha in the things I actually said.

Do you think the existence of calculators makes machine thought impossible? No? Then why would you make up that I think so?

ChatGPT and its program family aren't trying to be real AI. A salesman just wants you to think they are so you spend money on them.

They are language calculators. They were built with the intent of being language calculators. Their creators all understand that they are just language calculators.

The handful of programmers who fell for the marketing, like that poor Google idiot, all get fired. Why? Because their bosses now know that they don't understand the project.

An equation cannot think. That doesn't mean a machine can't, ever. It means that a machine that thinks will not be an equation calculator.

We can look at ChatGPT's code. We can see that it is only an equation. So long as it is only an equation, it isn't capable of thinking. Attempts at real AI may use some equations within the machine brain. But it cannot be a brain while the entire thing is only an equation.

[–] interceder270@lemmy.world 2 points 1 year ago (1 children)

Alright man, you gotta calm down.

I didn't read what you said because it started off abrasive and incorrect.

Please, take a break and come back when you've cooled off. I can wait.

[–] wildginger@lemmy.myserv.one -1 points 1 year ago (1 children)

How about you come back when you respond after actually reading?

Don't waste my time with anti-science kookery that isn't even responding to the conversation. That's the third time you've failed to read what was given to you.

[–] interceder270@lemmy.world 0 points 1 year ago* (last edited 1 year ago)

Alright, I asked nicely.

On the ignore list you go. Learn how to conduct yourself if you want to be taken seriously.

[–] WillFord27@lemmy.world 1 points 1 year ago

Oddly, I'd find a piece of music written by an AI convinced it was a chair extremely artistic lol. But yeah, just because an algorithm that's really good at putting words together tries to convince you it has feelings doesn't mean it does.