this post was submitted on 09 Dec 2023
96 points (97.1% liked)

Technology

[–] Humanius@lemmy.world 11 points 11 months ago (7 children)
[–] GigglyBobble@kbin.social 15 points 11 months ago (6 children)

Designing the model to prevent it from generating illegal content

Yeah, good luck designing that.

[–] barsoap@lemm.ee 6 points 11 months ago* (last edited 11 months ago) (2 children)

That's the Parliament's wishlist, not the actual text of the law (at least I think that's the version that got passed).

Stuff like that is why it's a good idea that parliamentarians don't draft the actual text themselves, but leave it to an army of technocrats. It's all too easy to vote a training requirement into a section about transparency when it's 3 o'clock in the morning and you and everyone else in the committee want to go home.

Here's the transparency article:

Article 52
Transparency obligations for certain AI systems

  1. Providers shall ensure that AI systems intended to interact with natural persons are designed and developed in such a way that natural persons are informed that they are interacting with an AI system, unless this is obvious from the circumstances and the context of use. This obligation shall not apply to AI systems authorised by law to detect, prevent, investigate and prosecute criminal offences, unless those systems are available for the public to report a criminal offence.
  2. Users of an emotion recognition system or a biometric categorisation system shall inform of the operation of the system the natural persons exposed thereto. This obligation shall not apply to AI systems used for biometric categorisation, which are permitted by law to detect, prevent and investigate criminal offences.
  3. Users of an AI system that generates or manipulates image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and would falsely appear to a person to be authentic or truthful (‘deep fake’), shall disclose that the content has been artificially generated or manipulated. However, the first subparagraph shall not apply where the use is authorised by law to detect, prevent, investigate and prosecute criminal offences or it is necessary for the exercise of the right to freedom of expression and the right to freedom of the arts and sciences guaranteed in the Charter of Fundamental Rights of the EU, and subject to appropriate safeguards for the rights and freedoms of third parties.
  4. Paragraphs 1, 2 and 3 shall not affect the requirements and obligations set out in Title III of this Regulation.
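To make paragraph 1 concrete, here's a minimal sketch of how a provider might surface that "you are talking to an AI" disclosure in a chat interface. All names and structure here are made up for illustration; this is not an official compliance mechanism.

```python
# Hypothetical sketch of the Article 52(1) disclosure duty:
# inform the user they are interacting with an AI system,
# unless that's already obvious from the context.

DISCLOSURE = "You are chatting with an AI system."

def wrap_reply(model_reply: str, first_turn: bool) -> str:
    """Prepend the AI disclosure on the first turn of a conversation."""
    if first_turn:
        return f"{DISCLOSURE}\n\n{model_reply}"
    return model_reply
```

The point is just that the obligation is about the interface, not the model itself: a one-line banner on the first turn already does the job.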

Most AI uses out there face only these fairly limited requirements, mostly around transparency. There's also some stuff about training in Article 5, which lists outlawed practices, e.g. you may not train models to use subliminal techniques.

Where things get strict is around uses like screening prospective employees, where you have to make sure the system isn't picking up any unwarranted biases, e.g. judging by sex or nationality. Even stricter are the high-risk systems listed in Annex III, which are largely uses in administration, critical infrastructure, etc.
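The employee-screening case can be made concrete with a toy bias check: compare a screening model's selection rates across groups and flag big gaps. The 0.8 ratio below is a common rule of thumb from employment-discrimination practice, not anything the Act itself prescribes, and all names are illustrative.

```python
# Illustrative bias check: compare selection rates across groups.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, selected: bool) pairs."""
    totals, picks = defaultdict(int), defaultdict(int)
    for group, selected in decisions:
        totals[group] += 1
        if selected:
            picks[group] += 1
    return {g: picks[g] / totals[g] for g in totals}

def flag_disparity(rates, threshold=0.8):
    """Flag groups whose selection rate falls below threshold * the best rate."""
    top = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * top]
```

A real audit would of course look at far more than raw selection rates, but this is the basic shape of "make sure it's not judging by sex or nationality".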


All in all I'd say that, as a first of its kind, the law is pretty darn good, in particular because it classifies requirements not by the technology employed but by the area of application. And the "likeness of a natural person" clause has an arts and freedom-of-expression exception, so this kind of stuff doesn't even need disclosure.

[–] SuckMyFingerKFC@fanaticus.social -3 points 11 months ago (1 children)

No way someone is reading this wall of text lol

[–] barsoap@lemm.ee 0 points 11 months ago

Speak for yourself.
