this post was submitted on 29 Sep 2024
58 points (85.4% liked)

all 49 comments
[–] aisteru@lemmy.aisteru.ch 72 points 1 month ago (2 children)

Honestly? Before the AI craze, I'd have said yes, because I believe AIs tailored to do one specific thing can outperform humans. Today? I'd rather not, as I couldn't let go of the thought that it might be some shitty model quickly put together by the CEO's nephew...

[–] AbidanYre@lemmy.world 22 points 1 month ago (1 children)

Equally likely, they're collecting data for their porn generating AI bot.

[–] gsfraley@lemmy.world 14 points 1 month ago (1 children)

😬 I'm not sure how I'd feel about porn generated from a data set of potential STIs

[–] wreckedcarzz@lemmy.world 7 points 1 month ago* (last edited 1 month ago)

Bugchasers everywhere feel kinkshamed by you now

[–] RecluseRamble@lemmy.dbzer0.com 4 points 1 month ago* (last edited 1 month ago)

Hey now, it was carefully designed and thoroughly trained on Tinder dickpics

[–] pixeltree@lemmy.blahaj.zone 42 points 1 month ago (1 children)

Would I trust the accuracy of the output? No, but it might be a decent warning to get tested to make sure. Would I trust a company with pictures of my genitals attached to my identity? Certainly not an AI company.

[–] SkaveRat@discuss.tchncs.de 21 points 1 month ago (1 children)

but it might be a decent warning to get tested to make sure

just show "better get checked by a professional" as the only result. no AI needed

[–] slacktoid@lemmy.ml 6 points 1 month ago (1 children)

Great app idea to get pics of genitals!

[–] SkaveRat@discuss.tchncs.de 5 points 1 month ago (1 children)

Just create a Twitter account with a model as the avatar and you'll get the same, with a small chance of fewer diseased pics

[–] slacktoid@lemmy.ml 1 points 1 month ago

Hilarious and depressing

[–] gedaliyah@lemmy.world 31 points 1 month ago (1 children)

Short answer, yes.

Finding complex patterns in noisy data is an application that AI is actually well suited for. It still requires human follow-up, but human experts make mistakes in these areas as well. There is a good chance that a well-designed AI could be more accurate.
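
To make that concrete, here is a minimal sketch of the kind of narrow, single-purpose classifier the comment describes, using scikit-learn on synthetic data. Everything here (features, sample sizes, model choice) is made up purely for illustration; it is not a real diagnostic model.

```python
# A toy, narrow "find the pattern in noisy data" classifier -- not an LLM,
# and not a real diagnostic model. The data is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic, noisy two-class data standing in for extracted image/lab features.
X, y = make_classification(
    n_samples=5000, n_features=40, n_informative=8,
    flip_y=0.05, class_sep=0.8, random_state=0,
)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Train the single-purpose model and check it on held-out data.
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Anything it flags would still go to a human for follow-up.
```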

[–] Petter1@lemm.ee -2 points 1 month ago

It is already more accurate in many areas of medicine 😇

[–] pennomi@lemmy.world 17 points 1 month ago

Locally run AI, yes. Hosted AI, no.

[–] Num10ck@lemmy.world 16 points 1 month ago (1 children)

just post your junk on bluesky and crowdsource it.

[–] simplejack@lemmy.world 4 points 1 month ago

Twitter is mostly verified dicks these days. That might be the better platform.

[–] Randomgal@lemmy.ca 14 points 1 month ago (1 children)

Honestly? I've leaked pics of those voluntarily, so curiously I'd be a-okay with this one.

[–] Nougat@fedia.io 10 points 1 month ago (1 children)

... leaked ...

Well, there's your problem.

[–] Randomgal@lemmy.ca 5 points 1 month ago (1 children)

No no. There's no problem. That's what I'm saying Lol.

[–] Nougat@fedia.io 14 points 1 month ago (1 children)

I was trying to do a "it's not supposed to leak, that's probably an STI" joke.

[–] wreckedcarzz@lemmy.world 3 points 1 month ago (1 children)

an Impreza with an oil leak problem: embarrassed Pikachu face

[–] Nougat@fedia.io 2 points 1 month ago

Not the head gaskets again!

[–] figaro@lemdro.id 13 points 1 month ago

Not all AI is an LLM. So yes.

[–] azl@lemmy.sdf.org 11 points 1 month ago (1 children)

What's the difference between one technology you don't understand (AI engine-assisted) and another you don't understand (human-staffed radiology laboratory)?

Regardless of whether you (as a patient hopelessly unskilled in diagnosis of any condition) trust the method, you probably have some level of faith in the provider who has selected it. And, while they most likely will choose what is most beneficial to them (cost of providing accurate diagnoses vs. cost of providing less accurate diagnoses), hopefully regulatory oversight and public influence will force them to use whichever is most effective, AI or not.

[–] Chozo@fedia.io 6 points 1 month ago

What's the difference between one technology you don't understand (AI engine-assisted) and another you don't understand (human-staffed radiology laboratory)?

The difference is that people think they understand AI. Even here in this thread, there are people confusing this for an LLM.

[–] solrize@lemmy.world 8 points 1 month ago (1 children)

I dunno, maybe the diagnosis is fine, but the companies that run it are sure to save copies. I can just see the data breaches now: "5 million stolen dick pics uploaded to dark web". Complete with labelling of which ones are diseased though, so that's a help.

[–] wreckedcarzz@lemmy.world 2 points 1 month ago

If we could filter by length, girth, un/cut, ball size, hair amount, and (most importantly) diagnosis... I'm not saying I would put that tool together, but as a user...

[–] Death_Equity@lemmy.world 8 points 1 month ago

I wouldn't trust it to tell me if something is or isn't a banana.

[–] SatansMaggotyCumFart@lemmy.world 8 points 1 month ago (2 children)

I’d welcome it.

I could probably teach it a thing or two.

[–] Grimy@lemmy.world 4 points 1 month ago (1 children)

And love.

Like that movie where Joaquin Phoenix gives Scarlett Johansson an STI.

[–] rigatti@lemmy.world 3 points 1 month ago

Would you teach it how to make creepy comments on the internet?

[–] lemmyng@lemmy.ca 8 points 1 month ago (1 children)

AI trained to do that job? Sure, yeah. LLM AI? Fuck no.

[–] wreckedcarzz@lemmy.world 3 points 1 month ago (1 children)

AI: "Your penis appears to be an avocado. This is normal, and you should not be concerned. However you have 3 testicles and this should be looked into."

You, a female: "uhhhhhh"

[–] lemmyng@lemmy.ca 2 points 1 month ago

That's LLM AI, but the type I'm talking about is the machine learning kind. I can envision a system that takes e.g. a sample's test data and provides a summary, which is not far from what doctors do anyway. If you ever get a blood test's results explained to you, it's "this value is high, which would be concerning except that this other value is not high, so you're probably fine regarding X. However, I notice that this other value is low, and this can be an indicator of Y. I'm going to request a follow-up test regarding that." Yes, I would trust an AI to give me that explanation, because those are very strict parameters to work with, and the input comes from a trusted source (lab results and medical training data) and not "Bob's shrimping and hula hoop dancing blog".
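
As a rough illustration of how narrow those "strict parameters" can be, here is a toy sketch of a lab-result summariser. All the analyte names, reference ranges, and cross-check rules are invented for the example; a real system would take them from clinical guidelines, not from this sketch.

```python
# Toy lab-result summariser with fixed reference ranges and one cross-check rule.
# Analyte names, ranges, and rules are invented for illustration only.
REFERENCE_RANGES = {
    "marker_x": (0.5, 4.0),    # hypothetical (low, high) reference range
    "marker_y": (10.0, 50.0),
}

def summarise(results: dict) -> list:
    notes = []
    for analyte, value in results.items():
        low, high = REFERENCE_RANGES[analyte]
        if value > high:
            # Cross-check: a high marker_x is downgraded if marker_y is normal.
            if analyte == "marker_x" and results.get("marker_y", 0.0) <= REFERENCE_RANGES["marker_y"][1]:
                notes.append(f"{analyte} is high, but marker_y is normal -- probably fine, recheck later.")
            else:
                notes.append(f"{analyte} is high -- recommending a follow-up test.")
        elif value < low:
            notes.append(f"{analyte} is low -- can be an indicator of something else; follow-up test.")
        else:
            notes.append(f"{analyte} is within range.")
    return notes

print("\n".join(summarise({"marker_x": 5.1, "marker_y": 22.0})))
```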

[–] werefreeatlast@lemmy.world 7 points 1 month ago (1 children)

Ok, press the start button and slowly scan your penis, asshole, and testicles. First apply the included wax and pull forcefully and swiftly to remove hair.

[–] wreckedcarzz@lemmy.world 4 points 1 month ago

screams, cries owieeeee, my nutsack...

[–] nxn@biglemmowski.win 5 points 1 month ago

Every passing day we delve deeper into this hole that is a cold, technology-driven world. Instead, we really should be taking the time to share our outbreaks with friends and family.

[–] GreenKnight23@lemmy.world 5 points 1 month ago

no, but not for the reason you think.

because it's far more effective to scan samples from you than whole organs.

[–] aesthelete@lemmy.world 3 points 1 month ago
[–] Imgonnatrythis@sh.itjust.works 3 points 1 month ago

Depends on the specificity and sensitivity of the test. It would have to be damn close to gold standard to be justified, and the company providing the tech would need to be heavily regulated. Could be promising tech for sex workers if sensitivity was decent, but by the time skin manifestations are present, most of these are fairly far along.
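
To show why sensitivity and specificity alone aren't enough, here is a quick back-of-the-envelope positive-predictive-value calculation. The sensitivity, specificity, and prevalence figures are illustrative, not real numbers for any screening app.

```python
# Back-of-the-envelope positive predictive value (PPV) via Bayes' rule.
# Sensitivity, specificity, and prevalence below are illustrative only.
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A test that sounds good (95% sensitivity, 95% specificity) at 2% prevalence:
print(f"PPV: {ppv(0.95, 0.95, 0.02):.1%}")  # ~27.9% -- most positives are false alarms
```

In other words, unless the numbers really are near gold-standard, most positive results such an app hands out at low prevalence would be false alarms.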

[–] hendrik@palaver.p3x.de 2 points 1 month ago* (last edited 1 month ago)

I don't think the app in the picture is driven by AI. Seems like a catalogue of questions. Probably to assess some situation by some standard procedure. I'd trust that. Regarding the AI apps mentioned below: I wouldn't trust them at all. If my private parts start itching and I can't make sense of it, I'd go to the doctor. At least if it's serious. Or use Dr. Google if it's not too bad.

[–] andallthat@lemmy.world 2 points 1 month ago* (last edited 1 month ago) (1 children)

I'm not sure we, as a society, are ready to trust ML models to do things that might affect lives. This is true for self-driving cars and I expect it to be even more true for medicine. In particular, we can't accept ML failures, even when they get to a point where they are statistically less likely than human errors.

I don't know if this is currently true or not, so please don't shoot me for this specific example, but IF we were to have reliable stats that, everything else being equal, self-driving cars cause fewer accidents than humans, a machine error will always be weird and alien and harder for us to justify than a human one.

"He was drinking too much because his partner left him", "she was suffering from a health condition and had an episode while driving"... we have the illusion that we understand humans and (to an extent) that this understanding helps us predict who we can trust not to drive us to our death or not to misdiagnose some STI and have our genitals wither. But machines? Even if they were 20% more reliable than humans, how would we know which ones we can trust?

[–] Petter1@lemm.ee 5 points 1 month ago

I think ML has already been used in medicine for about 20 years, in various laboratory processes/equipment.

Maybe not for pure decision-making, but to point experts toward where to look and what to check.

[–] Rhoeri@lemmy.world 2 points 1 month ago
[–] kokesh@lemmy.world 1 points 1 month ago

You just need Ann to check those. Ask Joe from Sewage.

[–] UraniumBlazer@lemm.ee 1 points 1 month ago (1 children)

If it is approved by the FDA? HECK YEAH BAYBEEE

[–] wreckedcarzz@lemmy.world 1 points 1 month ago

The Fucking Dick Association? Aw yiss

[–] sunzu2@thebrainbin.org 0 points 1 month ago

Only if it is hosted by Google.