Tesla’s Autopilot and Full Self-Driving linked to hundreds of crashes, dozens of deaths
(www.theverge.com)
Cameras and AI aren't a match for radar and lidar. This is the big issue with Tesla's approach to autonomy: without a direct range measurement, the system can only guess whether there are hazards in the way.
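To make the "only a guess" point concrete, here is a toy error-propagation sketch. All the numbers are assumptions for illustration, not real Tesla or lidar specs: camera depth is inferred from disparity, so from Z = f·b/d its error grows roughly with the square of range, while lidar measures time-of-flight directly and its range error stays roughly constant.

```python
FOCAL_PX = 1000.0       # assumed camera focal length, in pixels
BASELINE_M = 0.3        # assumed stereo baseline, in metres
DISPARITY_ERR_PX = 0.5  # assumed sub-pixel matching error
LIDAR_ERR_M = 0.03      # assumed (roughly constant) lidar range error

def stereo_depth_error(z_m):
    # From Z = f*b/d it follows that |dZ| ~= Z^2 / (f*b) * |dd|:
    # the inferred depth error grows quadratically with range.
    return (z_m ** 2) / (FOCAL_PX * BASELINE_M) * DISPARITY_ERR_PX

for z in (10, 50, 100):
    print(f"{z:>3} m: camera ±{stereo_depth_error(z):.2f} m, "
          f"lidar ±{LIDAR_ERR_M:.2f} m")
```

Under these assumed numbers, inferred depth is decent up close but degrades to metres of uncertainty at highway ranges, while the direct measurement does not.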
Most algorithms are designed to work and then statistically tested to validate that they do. When you develop an algorithm with AI/machine learning, only the statistical step exists: you have to infer whole-system performance purely from that. There is no separate process for verification and validation; it's validation alone.
When something is developed with only statistical evidence of it working, you can't be reliably sure it works in any scenarios except the exact ones you tested. When you design an algorithm to work, you can assume it works in most scenarios if the results are as expected when you validate it. With machine learning, the algorithm itself is obscured and uncertain (unless it's only used for parameter optimisation).
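A minimal sketch of the validation-only trap. The model, function names, and numbers are all made up for illustration: a purely "learned" model (here a 1-nearest-neighbour lookup, standing in for any black-box learner) passes statistical validation on the distribution it was tested on, yet says nothing about inputs outside those scenarios.

```python
import random

def f(x):
    return x * x  # the ground-truth behaviour we want the model to learn

# "Training": memorise samples drawn only from the tested range [0, 1].
random.seed(0)
train = [(x, f(x)) for x in (random.uniform(0.0, 1.0) for _ in range(200))]

def predict(x):
    # Black-box "learned" model: no designed algorithm, just data recall.
    nearest = min(train, key=lambda pt: abs(pt[0] - x))
    return nearest[1]

# Statistical validation on the SAME distribution looks reassuring...
val = [random.uniform(0.0, 1.0) for _ in range(100)]
val_err = max(abs(predict(x) - f(x)) for x in val)

# ...but is silent about scenarios nobody tested for.
ood_err = abs(predict(5.0) - f(5.0))

print(f"in-distribution max error:  {val_err:.3f}")
print(f"out-of-distribution error:  {ood_err:.3f}")
```

The validation figure alone would pass any statistical acceptance test, which is exactly why it cannot substitute for verifying the algorithm's design.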
Machine learning is never used because it's a better approach. It's used when the engineers don't know how to develop the algorithm any other way. Once you understand this, you understand the hazard it presents. If you don't understand it, or refuse to, you build machines that drive into children. Through ignorance, greed and arrogance, Tesla built a machine that deliberately runs over children.