this post was submitted on 18 Oct 2024
783 points (98.4% liked)


The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday with the company reporting four crashes after Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

[–] rsuri@lemmy.world 60 points 3 weeks ago* (last edited 3 weeks ago) (16 children)

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras.

This of course assumes (1) that cameras are as good as eyes (they're not) and (2) that a machine can replicate the processing of visual data that the human brain does, which seems highly dubious given that we only partially understand how humans process visual information to make decisions.

Finally, it assumes that the current rate of human-caused crashes is acceptable, which it isn't. We tolerate crashes only because we can't improve human drivers without unrealistic expense. In an automated system, if a bit of additional hardware can significantly reduce crashes, it's irrational not to add it.

[–] SkyeStarfall@lemmy.blahaj.zone 46 points 3 weeks ago (3 children)

Also, on a final note...

Why the fuck would you limit yourself to only human senses when you have the capability to add more of any sense you want??

If you have the option to add something that humans don't have, why wouldn't you? As an example, humans don't have GPS either, but it's very useful to have in a car.

[–] sue_me_please@awful.systems 16 points 3 weeks ago

Because a global pandemic broke your sensor supply chain and you still want to sell cars with FSD anyway, so cameras-only it is!
