this post was submitted on 19 Aug 2023
143 points (99.3% liked)


Cruise robotaxi collides with fire truck in San Francisco, leaving one injured::A crash between a Cruise robotaxi and a San Francisco Fire Department truck occurred last night in the Tenderloin. The incident happened a week after the California Public Utilities Commission (CPUC) approved 24/7 autonomous taxi ride services.

top 18 comments
[–] NAS89@lemmy.world 45 points 1 year ago (3 children)

I’ve spent my entire career working in industrial automation, and I see the value AI and automation efforts bring to the world.

I do not see the value in allowing private companies to playtest autonomous driving with human lives as potential collateral.

The argument keeps getting made ("how many humans make that same mistake daily?"), and it's a false equivalence; if autonomous vehicles cannot reach 100% safety and accuracy, they should not be allowed to risk human lives.

[–] TrumpetX@programming.dev 25 points 1 year ago (1 children)

Don't let perfection be the enemy of good. I'm not suggesting we shouldn't have a really high bar, but 100% is just unreasonable.

[–] Kecessa@sh.itjust.works 5 points 1 year ago (3 children)

The difference being that autonomous vehicles could reach 100% safety by removing all non-autonomous vehicles from the road and imposing a communication standard between vehicles, so they all know what the other vehicles are doing at all times.

That only applies to regions of the world where there's no snow because autonomous driving in a snowstorm will probably never be solved.
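For illustration only: no such standard is deployed today, but a minimal sketch of the kind of state broadcast this comment imagines might look like the following. The field names and the dead-reckoning helper are hypothetical and do not reflect any real V2V/DSRC or C-V2X message format.

```python
# Hypothetical sketch of a vehicle-to-vehicle state broadcast.
# Field names and helpers are illustrative only; they are not taken
# from any real V2V standard (e.g. DSRC/C-V2X message formats).
import json
import math
import time
from dataclasses import dataclass, asdict


@dataclass
class VehicleState:
    vehicle_id: str
    lat: float          # degrees
    lon: float          # degrees
    speed_mps: float    # metres per second
    heading_deg: float  # 0 = north, clockwise
    intent: str         # e.g. "lane_keep", "left_turn", "emergency_stop"
    timestamp: float    # seconds since epoch

    def to_message(self) -> str:
        """Serialize the state so other vehicles could consume it."""
        return json.dumps(asdict(self))


def predicted_position(state: VehicleState, dt: float) -> tuple[float, float]:
    """Dead-reckon a position dt seconds ahead (flat-earth approximation)."""
    # Roughly 111,111 metres per degree of latitude; longitude scaled by cos(lat).
    dist = state.speed_mps * dt
    dlat = dist * math.cos(math.radians(state.heading_deg)) / 111_111
    dlon = dist * math.sin(math.radians(state.heading_deg)) / (
        111_111 * math.cos(math.radians(state.lat))
    )
    return state.lat + dlat, state.lon + dlon


if __name__ == "__main__":
    ego = VehicleState("AV-123", 37.7790, -122.4140, 8.0, 90.0,
                       "lane_keep", time.time())
    print(ego.to_message())
    print(predicted_position(ego, 2.0))
```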

[–] supercriticalcheese@feddit.it 5 points 1 year ago (1 children)

And making walking illegal

[–] mosiacmango@lemm.ee 3 points 1 year ago* (last edited 1 year ago) (1 children)

The US has been working toward that for decades.

[–] supercriticalcheese@feddit.it 1 points 1 year ago

Yes they have

[–] Kbobabob@lemmy.world 3 points 1 year ago

I guarantee that it would still not be 100%. Maybe 99% or even 99.9%, but not 100%.

[–] reinar@distress.digital 2 points 1 year ago* (last edited 1 year ago)

"autonomous vehicle" in the article can't handle most basic shit like emergency vehicle approaching. I've spent enough years in automotive engineering and all of this autonomous drive bullshit is ADAS with a few gimmicks and shouldn't be nowhere near full control of the car, however this got out of hand and this shit is on public roads somehow.

[–] lolcatnip@reddthat.com 2 points 1 year ago (1 children)

You're arguing that even if autonomous vehicles are safer drivers than humans, we should choose to make ourselves less safe by disallowing them? Fuck that. Nobody should have to die because AI makes you squeamish.

[–] NAS89@lemmy.world 4 points 1 year ago

Unnecessarily hostile comment; too bad that attitude didn't stay on Reddit.

AI doesn't make me squeamish at all. Ignoring the context in which I stated my background in automation was a choice, but the rub is using the general public to beta test hazardous equipment. Humans make errors and can be held responsible; corporations putting people at risk while bearing no responsibility is reckless.

[–] autotldr@lemmings.world 10 points 1 year ago

This is the best summary I could come up with:


A passenger riding inside the Cruise self-driving vehicle suffered “non-severe injuries” and was transported in an ambulance, according to an official company post on X (formerly Twitter) this morning.

“We are investigating to better understand our AVs performance, and will be in touch with the City of San Francisco about the event,” Cruise’s post reads.

The incident comes less than a week after the California Public Utilities Commission voted to allow paid 24/7 robotaxi services in San Francisco, handing companies like Cruise and Alphabet-owned Waymo a huge victory.

City officials and residents have pleaded with the state to slow down the efforts, citing incidents in which self-driving cars have interfered with emergency vehicles.

Since Cruise began testing in San Francisco, its vehicles have obstructed traffic on multiple occasions, including a situation where 10 autonomous vehicles halted traffic in a busy intersection during a music festival.

And a cement mason’s worst nightmare occurred on Tuesday when a Cruise vehicle reportedly got stuck in wet concrete.


The original article contains 285 words, the summary contains 164 words. Saved 42%. I'm a bot and I'm open source!

[–] skymtf@pricefield.org 6 points 1 year ago

I hate how we can't just have decent public transit. Self-driving cars can be cool, but they usually end up like this.

[–] Rentlar@lemmy.ca 4 points 1 year ago

Allowing operation of autonomous vehicles in high-risk situations should include stipulations for major fines and restitution for anyone hurt by a misbehaving vehicle. It's not the fire truck's responsibility to give way; Waymo and Cruise had better figure it out, since they have smart people on their teams.