Tesla Model Y Driving On FSD Knocks Down Kid-Sized Dummies Eight Times
Briefly

A recent experiment conducted by The Dawn Project reveals critical flaws in Tesla's Full Self-Driving (FSD) software. In a test involving a Model Y, the car repeatedly failed to stop for a child-sized dummy crossing the street, despite having the advanced driver assistance system engaged. Across eight runs, the Tesla drove at around 20 mph, struck the dummy, and continued without the FSD system disengaging. Critics argue this raises substantial concerns about FSD's reliability for pedestrian safety, especially in scenarios involving children. The organization behind the experiment is known for its vocal opposition to Elon Musk's autonomous vehicle ambitions.
The test conducted by The Dawn Project showed that Tesla's Full Self-Driving software failed to stop for a child-sized dummy crossing the street, reinforcing long-standing safety concerns.
Traveling at around 20 mph, the Model Y's FSD system failed to prevent collisions with the mannequin in every run, revealing significant flaws in its pedestrian safety capabilities.
The Dawn Project's experiment raises pointed and controversial questions about the reliability of Tesla's FSD software, especially in critical situations involving children.
Critics argue that if the software struggles with a simple dummy test, its use in real-world scenarios could pose serious risks, especially in urban areas.
Read at InsideEVs