this post was submitted on 13 Jun 2023

Technology


It is interesting how easy this has become with a popular service like Prime Voice AI, and when you realise that many systems use voice recognition for authenticated access, it is clear where the risks come in. Like most technology, it has plenty of positive upsides, but it opens up negatives as well. As Steve points out in his commentary in the linked article, bad actors are often the quickest to take advantage of these new developments.

No end in sight for the upward trajectory of careers in security and vulnerability consulting.

#technology #security #voicecloning

top 13 comments
[–] CasualTee@beehaw.org 5 points 1 year ago (1 children)

Not only organisations, but everyone really. The elderly are already massively targeted by scammers. Now, on top of that, scammers can find a child's or grandchild's voice sample on some social media and use that to ask for money over the phone. Or even via video call with deep fakes. And they can do that at scale.

[–] Corkyskog@sh.itjust.works 4 points 1 year ago

You don't even need social media. Call person A and keep them on the phone for 15 seconds to get a snippet of their voice. Feed it through AI, then use that to call person B and extort them, or commit whatever crime you were planning.

[–] poohbear@toons.zone 3 points 1 year ago

Everyone needs to have the conversation with their loved ones as soon as they can: question anything that seems out of character. It doesn't matter if someone sounds like or looks like them - if anyone is asking for money or sensitive information, question it no matter what.

I know a lot of folks may feel like that's an impossible task, especially with parents and older relatives, but it's possible - I just recently had a conversation with my parents, and I was bracing myself for it to be an uphill battle, but they were switched on and got it. You've gotta at least try.

[–] ABoxOfNeurons@lemmy.one 2 points 1 year ago

But how else are we going to be able to hear Squidward singing Never Gonna Give You Up?

https://www.youtube.com/watch?v=OBLOxQu6s_s

[–] nii236@lemmy.jtmn.dev 1 points 1 year ago (1 children)

Since when does anything use voice recognition for authentication?!

[–] GadgeteerZA@beehaw.org 1 points 1 year ago (3 children)

Just some:

  • Medical devices: Voice recognition is used in medical devices to allow patients to control their devices without having to use their hands.
  • Call centres: Voice recognition is used in call centres to allow agents to answer calls without having to type.
  • Manufacturing: Voice recognition is used in manufacturing to control robots and other machines.
  • Customer service: Voice recognition is used in customer service to allow customers to get information and resolve issues without having to wait on hold (including some banks).
  • Family ransom requests by phone
[–] nii236@lemmy.jtmn.dev 3 points 1 year ago (1 children)

It's a pretty dark and grim future. Until then, here's an AI trained on Ariana Grande's voice singing in Vietnamese.

[–] GadgeteerZA@beehaw.org 1 points 1 year ago (1 children)

And apparently Paul McCartney has now used AI to recreate John Lennon's voice in a recent Beatles song...

[–] nii236@lemmy.jtmn.dev 1 points 1 year ago

It's not all doom and gloom out there!

[–] MedicPigBabySaver@sopuli.xyz 2 points 1 year ago (1 children)

Hmmm, can I ransom my own family? ¯⁠\⁠(⁠°⁠_⁠o⁠)⁠/⁠¯

[–] GadgeteerZA@beehaw.org 1 points 1 year ago (1 children)

Sadly it's been attempted a few times... we normally only hear about the failed attempts.

[–] MedicPigBabySaver@sopuli.xyz 1 points 1 year ago

You shall not hear about me.

[–] PointlessGiraffe@beehaw.org 1 points 1 year ago

But those aren't examples of using voice recognition for authentication. In all of those cases, if someone else walked up to the person using the thing and shouted the right command, we wouldn't expect the system doing the voice recognition to ignore it just because the wrong person said it.
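The distinction matters: speech *recognition* only transcribes the words, while speaker *verification* (authentication) compares a voice sample's embedding against an enrolled voiceprint and accepts it if the similarity clears a threshold - which is exactly where a good clone can slip through. A minimal illustrative sketch of that threshold check, with toy three-dimensional vectors standing in for a real speaker-encoder's output (all numbers here are made up for illustration):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_speaker(enrolled: np.ndarray, sample: np.ndarray,
                   threshold: float = 0.8) -> bool:
    """Accept the sample only if its embedding is close enough to the
    enrolled voiceprint."""
    return cosine_similarity(enrolled, sample) >= threshold

# Toy embeddings standing in for real voiceprints (real systems use
# hundreds of dimensions produced by a trained speaker-encoder model).
enrolled = np.array([0.90, 0.10, 0.30])
genuine  = np.array([0.85, 0.15, 0.28])  # same speaker, natural variation
cloned   = np.array([0.84, 0.16, 0.29])  # a good clone lands just as close

print(verify_speaker(enrolled, genuine))  # True
print(verify_speaker(enrolled, cloned))   # True - the threshold can't tell them apart
```

The point of the sketch is that the system has no notion of "the wrong person said it"; it only sees whether the similarity score clears the threshold, and a convincing clone clears it too.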