
Can Tesla’s Driverless Cars Be Trusted? Shocking New Crash Test Video Stuns the Internet
A viral test shows a Tesla Model Y running over a child dummy, reigniting the debate over self-driving safety as Tesla eyes a robotaxi future.
- 40,000+ Teslas equipped with Autopilot on US roads
- 13.2.9: Latest Full Self-Driving (FSD) software version tested in viral video
- 1 million+ miles claimed by Tesla for robotaxi readiness testing
Tesla finds itself at the center of controversy again. This time, a chilling video set social media ablaze: a Tesla Model Y running Tesla’s latest ‘Full Self-Driving’ (FSD) software failed a basic safety test, mowing down a child-sized dummy instead of stopping. While no real harm was done, the staged demonstration by safety advocacy group The Dawn Project has reignited fierce debate over the limits and dangers of autonomous driving tech in 2025.
The demonstration took place on Austin’s public streets, sending shockwaves through Tesla owners and industry watchers alike. As the electric giant pushes to launch coast-to-coast robotaxis, experts ask: Is the future of driverless cars arriving too quickly for public safety to keep up?
NHTSA and Tesla both remain under relentless scrutiny as this debate grows louder.
Q: What Actually Happened During the Viral Tesla Demo?
In the video shared by The Dawn Project, a white Tesla Model Y running FSD (Supervised) 13.2.9 software approaches a school bus stopped with its lights flashing. As a child dummy crosses the street, the car doesn’t even slow down, plowing into the dummy without hesitation. The dashcam footage shows the AI “identifying” a pedestrian yet failing to stop.
As previous NHTSA investigations reveal, this isn’t Tesla’s first safety incident linked to Autopilot or FSD. Human drivers are legally required to stop for a school bus with flashing lights precisely because pedestrians, often children, are expected to cross. The Model Y in this case simply didn’t comply.
Q: Who Is Behind The Dawn Project—and What’s Their Mission?
The Dawn Project is led by entrepreneur Dan O’Dowd, a longtime Tesla user and outspoken critic. O’Dowd has made headlines with high-profile campaigns, including a dramatic Super Bowl ad blasting Tesla’s software failures.
His message is clear: Tesla’s pursuit of a driverless future is putting real lives at risk. Despite years of warnings, O’Dowd says, Tesla continues to prioritize profits and robotaxi ambitions over fundamental safety fixes.
Q: How Does Tesla’s Tech Compare to Other Robotaxi Pioneers?
Tesla continues to shun LiDAR sensors, which are favored by competitors like Waymo, claiming its camera-and-AI approach is the future. Yet Waymo already operates driverless taxis in cities like Austin and San Francisco with far fewer reported incidents, and GM’s Cruise ran a similar service in San Francisco before winding it down.
O’Dowd and other experts argue that neglecting proven tech for the sake of cost and speed leaves everyone on the road exposed. Human drivers, at least, are held to clear laws and responsibilities around school zones, a standard any self-driving software must match or exceed.
Q: How Safe Are Tesla’s Robotaxis, and Should You Trust Them?
Tesla CEO Elon Musk insists the software passes rigorous tests and claims Tesla will soon launch fully driverless vehicles. Critics counter that the timeline routinely slips, and videos like this underscore just how far the system has to go.
Authorities, including the NHTSA, continue to investigate. But shocking footage and growing public concern might force regulators to crack down before more autonomous cars roam American streets.
Q: What Can Drivers and Parents Do Now?
- Stay informed about your car’s software updates and limitations.
- Never rely exclusively on self-driving features; keep hands on the wheel and eyes on the road at all times.
- Report unusual vehicle behavior to manufacturers and safety regulators.
Don’t become a headline. Put safety over hype. Bookmark this story for updates, or visit trusted resources for the latest on autonomous vehicles: Consumer Reports, NHTSA, and Tesla.
Checklist: How to Stay Safe with Semi-Autonomous Vehicles
- ✔ Always supervise—don’t trust “full self-driving” claims
- ✔ Avoid using Autopilot or FSD in school zones or near crosswalks
- ✔ Read up on known issues for your car and software version
- ✔ Demand transparency and action from automakers on safety
Stay alert, because when your car is in control, your life may be too.