What new concerns? All I see are preexisting concerns.
I was ready to write shit about Tesla, then I saw the video. The asshole was driving at that speed in the fog??? And let some alpha proof-of-concept software that relies only on cameras drive in the fog?????
If the human eye can't see the train crossing, how can a camera see it?
Although some of the fault is Tesla's, because the system should have known from map data that there was a train crossing there and slowed down anyway.
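A minimal sketch of that map-based slowdown idea, in Python. The Crossing type, the 300 m radius, and the speed caps are all invented for illustration; nothing here reflects Tesla's actual software.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Crossing:
    lat: float
    lon: float

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6_371_000  # mean Earth radius
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def speed_cap_kph(lat, lon, crossings, normal_cap=110.0,
                  slow_radius_m=300.0, crossing_cap=30.0):
    """Return a reduced speed cap when the car is near any mapped rail crossing."""
    near = any(haversine_m(lat, lon, c.lat, c.lon) < slow_radius_m for c in crossings)
    return crossing_cap if near else normal_cap

# Example: a crossing roughly 15 m ahead drops the cap to 30 km/h.
crossings = [Crossing(35.0001, -90.0001)]
print(speed_cap_kph(35.0000, -90.0000, crossings))  # -> 30.0
```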
> If the human eye can't see the train crossing, how can a camera see it?
If Elon Musk weren't so anti-lidar, that would be the answer, but here we are.
> And let some alpha proof-of-concept software that relies only on cameras
That "alpha proof of concept software" is marketed as "full self-driving". If it doesn't mean what it says it means, the company producing it should be fully liable for any crashes caused as a result of its use.
Full Self-Driving. Not "kinda sometimes in perfect conditions". Full.
True. It should also refuse to activate in low-visibility conditions.
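A minimal sketch of that "refuse to activate" idea, assuming the car can produce some estimate of sight distance. The visibility estimate and the 150 m threshold are invented for illustration, not any real driver-assist logic.

```python
MIN_VISIBILITY_M = 150.0  # assumed minimum sight distance for camera-only driving

def may_engage_self_driving(estimated_visibility_m: float,
                            cameras_healthy: bool) -> bool:
    """Only allow engagement when cameras are healthy and visibility is adequate."""
    return cameras_healthy and estimated_visibility_m >= MIN_VISIBILITY_M

print(may_engage_self_driving(60.0, True))   # thick fog -> False
print(may_engage_self_driving(500.0, True))  # clear day -> True
```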
Looks like the camera did see those blinky lights though, at least I did from the footage, so maybe the driver could have acted quite a bit sooner too. A shame lidar was too good for Tesla though.
You'd think the driver would have started hitting the brakes, seeing how fast it was going towards the gates.
Driver decides to use automatic cruise control in foggy conditions, then basically blames the car and manufacturer.
This picture does not look like what happens to a car when it is hit by a train
Because it didn't get hit by a train. The driver steered the car away from the passing train and hit a stop light.
"It wasn't in self driving mode at the time of the accident."
- Tesla legal team, probably
Probably technically true. He had to steer it away from the train.
Can’t view, got a link?
That might explain why the title says "nearly"
It’s what happens when the driver swerves into the crossing-arm pole to avoid hitting the train in front of it.