this post was submitted on 12 Nov 2024

Technology


The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.

-Bill Gates

[–] whithom@discuss.online 46 points 2 days ago (4 children)

You need AI to reword this spaghetti of an article. I know plenty of nurses, most of them shouldn’t be making decisions.

But I do agree that slapping AI on everything is a poor idea.

[–] dan1101@lemm.ee 6 points 1 day ago (1 children)

There is a company hospitals have hired to transcribe recordings. The software makes many transcription errors and then deletes the original audio. Things aren't looking good.

[–] whithom@discuss.online 2 points 1 day ago

Shitty software existed before AI too. Things have always looked bad in hospitals.

[–] pearsaltchocolatebar@discuss.online 24 points 2 days ago (5 children)

Hasn't AI already been shown to be better at catching things like cancer than humans?

There are some things that computers can be better at than humans.

[–] GiveMemes@jlai.lu 2 points 1 day ago

Yes... well, sorta. For example, AI was found to be better at identifying TB than medical doctors. The catch is that it also falsely diagnosed at a much higher rate than doctors. When an investigation was done into how the AI was evaluating the imaging it was given, they found that sets of virtually indistinguishable images were given different diagnoses by the AI. In many cases where there were no visible indicators of TB, a positive diagnosis was given.

The reason is that the AI was not weighting its TB diagnosis on the markers doctors look for alone, but also on the age of the machine. Older machines have a much greater chance of being located in developing countries, where TB is both more common and more deadly, so the model learned to treat the age of the machine as an important factor. A human would know that the age of a machine has absolutely zero relationship with the chance of having TB, and doctors in those areas are already aware of and watching out for TB anyway, since it's a much more serious illness there than in Germany, for example.

Idk much about the cancer thing, but basically the machine learning for diagnosis thing is iffy at best afaik.

[–] 9488fcea02a9@sh.itjust.works 20 points 1 day ago* (last edited 1 day ago)

Machine learning for helping a radiologist analyze images is super helpful and a mature field.

Whatever "AI" LLM nonsense tech bros have been trying to add into everything over the last 2 years is probably not all that helpful, but I could be proven wrong.

[–] whithom@discuss.online 26 points 2 days ago (1 children)

Yes! And we should use it when it has been proven effective. But the AI shouldn’t be able to administer drugs.

[–] pearsaltchocolatebar@discuss.online 15 points 2 days ago (1 children)

For sure. There always needs to be a human in the loop. But this notion people seem to have that all AI is completely worthless just isn't true.

What's scary is the hospital administration that will use AI to deny care to unprofitable patients (I've listened in on these conversations).

[–] deranger@sh.itjust.works 10 points 2 days ago* (last edited 2 days ago) (2 children)

Where’s anyone saying it’s worthless? That’s not in the article nor in these comments.

The issue is how it’s being used. It’s not being used to detect cancer. It’s being used for “efficiency”, which means more patients being seen by fewer nurses. It’s furthering the goals of the business majors in hospital administration, not the nurses or doctors who are caring for the patient.

LLMs are largely worthless (in the context of improving human society).

Neural Nets aimed at much more specific domains (recognizing and indicating metastases or other abnormalities in pathology slides for human review, for example) are EXTREMELY useful and worthwhile.

[–] kryptonidas@lemmings.world 2 points 2 days ago

AI nearly everywhere is there to improve efficiency: fewer people become more productive so that the owners keep more money. A pay rise because of it is off the table, since now you're "less skilled" anyway.

[–] Mouselemming@sh.itjust.works 8 points 2 days ago

It's also been shown to hallucinate whole parts of the doctor/nurse discussion and instructions.

[–] coolmojo@lemmy.world 7 points 2 days ago (2 children)

Dogs can also be better at detecting cancer than humans. And dogs tend to hallucinate less.

[–] conciselyverbose@sh.itjust.works 7 points 1 day ago (1 children)

Hallucinations aren't a problem with the actually medically useful tools he's talking about. Machine learning is being used to draw extra attention to abnormalities that humans may miss.

It's completely unrelated to LLM nonsense.

[–] coolmojo@lemmy.world 2 points 1 day ago (1 children)

Perhaps we should consider not calling all of them AI. Machine learning is a useful tool.

[–] conciselyverbose@sh.itjust.works 6 points 1 day ago (1 children)

"AI" long predates LLM bullshit.

[–] coolmojo@lemmy.world 3 points 1 day ago

You are right. My pet peeve is that it is now used as a marketing term without actual meaning. It used to be the word "smart". Now instead of "buy this smart toaster", it's "buy this AI powered toaster". Sorry if this reply was too verbose for your liking.

[–] rumba@lemmy.zip 5 points 1 day ago

They're better at smelling cancer than humans.

I'm not sure we can definitively say they hallucinate less.

[–] corsicanguppy@lemmy.ca 0 points 2 days ago (1 children)

This conversation has been edited

You need AI to reword this spaghetti of an article

I love it when writers editing the words of others somehow can't pass grade-school writing classes.

[–] TachyonTele@lemm.ee -2 points 2 days ago (1 children)

I know plenty of nurses, most of them shouldn’t be making decisions.

Do you expect doctors to do it or something?

[–] whithom@discuss.online 7 points 2 days ago

To make the decisions for the patient? Uhh yeah. And they do. Check the chart.

[–] Aviandelight@mander.xyz 24 points 2 days ago (1 children)

"The reasoning for bringing in AI tools to monitor patients is always that it will make life easier for us, but in my experience, technology in healthcare rarely makes things better. It usually just speeds up the factory floor, squeezing more out of us, so they can ultimately hire fewer of us." - yup, and there's not much we can do about it without harming a lot of people.

[–] zerozaku@lemmy.world 3 points 1 day ago

Yay we don't have to work anymore! Good news right? Right?!

/s

[–] LodeMike@lemmy.today 0 points 2 days ago

Excellent article.