this post was submitted on 11 Nov 2023
232 points (94.6% liked)

Asklemmy

43835 readers
714 users here now

A loosely moderated place to ask open-ended questions

Search asklemmy ๐Ÿ”

If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not regarding using or support for Lemmy: context, see the list of support communities and tools for finding communities below
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion

Looking for support?

Looking for a community?

~Icon~ ~by~ ~@Double_A@discuss.tchncs.de~

founded 5 years ago
MODERATORS
 

I just listened to this AI generated audiobook and if it didn't say it was AI, I'd have thought it was human-made. It has different voices, dramatization, sound effects... The last I'd heard about this tech was a post saying Stephen Fry's voice was stolen and replicated by AI. But since then, nothing, even though it's clearly advanced incredibly fast. You'd expect more buzz for something that went from detectable as AI to indistinguishable from humans so quickly. How is it that no one is talking about AI generated audiobooks and their rapid improvement? This seems like a huge deal to me.

FaceDeer@kbin.social · 2 points · 1 year ago

The validity of the chain of custody boils down to the cops and government in general being trusted enough to not falsify it when it suits them.

There are ways to cryptographically validate chain of custody. If we're in a world where only video with a valid chain of custody can be used in court, then those methods will see widespread adoption. You also didn't address any of the other kinds of evidence I mentioned that AI can't tamper with. Sure, you can generate a video of someone doing something horrible. But in a world where it's known that such videos can be generated, what jury would ever convict someone based solely on a video like that? It's frankly ridiculous.
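To make the cryptographic idea concrete, here's a minimal sketch of one way it could work: each custody event includes the hash of the previous event, and each link is authenticated by the custodian. This is purely illustrative, not any real evidence-management system; the function names are made up, and Python's stdlib HMAC stands in for a proper digital signature scheme like Ed25519.

```python
import hashlib
import hmac

def add_custody_event(chain: list, event: str, key: bytes) -> None:
    """Append an event whose hash covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else b"\x00" * 32
    entry_hash = hashlib.sha256(prev_hash + event.encode()).digest()
    # HMAC stands in for a real signature by the current custodian's key.
    signature = hmac.new(key, entry_hash, hashlib.sha256).digest()
    chain.append({"event": event, "hash": entry_hash, "sig": signature})

def verify_chain(chain: list, key: bytes) -> bool:
    """Recompute every link; any edit, reorder, or deletion breaks the chain."""
    prev_hash = b"\x00" * 32
    for entry in chain:
        expected = hashlib.sha256(prev_hash + entry["event"].encode()).digest()
        if expected != entry["hash"]:
            return False  # record contents or ordering were altered
        sig = hmac.new(key, expected, hashlib.sha256).digest()
        if not hmac.compare_digest(sig, entry["sig"]):
            return False  # authentication fails, custodian can't be verified
        prev_hash = expected
    return True

chain: list = []
key = b"custodian-secret"  # hypothetical key; a real system would use asymmetric keys
add_custody_event(chain, "camera -> evidence locker", key)
add_custody_event(chain, "evidence locker -> forensics lab", key)
assert verify_chain(chain, key)

chain[0]["event"] = "tampered"   # retroactively editing any record...
assert not verify_chain(chain, key)  # ...invalidates the whole chain
```

The point of the hash-linking is that falsifying any one record requires re-signing every later record too, so tampering can't be done quietly by whoever holds the file at the moment.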

This is very much the typical fictional dystopia scenario, where one assumes all the possible negative uses of the technology will work flawlessly while ignoring all the ways those negative uses could be countered. You can spin a scary sci-fi tale from such speculation, but it's not a very useful way of predicting how the actual future is likely to go.