Watchdog’s Tesla demo shows car hitting ‘child’ dummy • The Register

Video Tesla has been testing self-driving Model Ys on the streets of Austin, Texas. But according to the automaker’s bête noire, the Dawn Project, kids should keep clear.
The self-proclaimed self-driving gadfly posted a video this week showing a staged event in which a Tesla running the latest version of its self-driving software illegally overtook a school bus with its red lights flashing (indicating it was dropping off passengers) and then slammed into a child-sized mannequin crossing from the sidewalk. The car identified the pedestrian but didn't stop after hitting it, carrying on down the road.
To be fair to Tesla, a human driver would have had trouble stopping in time to avoid the dummy, although one would hope that a human driver would have noticed the lights and stopped as legally required. However, the software running the car – FSD (Supervised) 13.2.9 – did not. (FSD originally stood for "full self-driving," but as the parenthetical now makes explicit, Tesla's own documentation has always said that the driver should be supervising at all times.)
The Dawn Project has its own agenda. It was founded by software entrepreneur Dan O'Dowd to highlight the gap between Tesla's self-driving claims and reality. O'Dowd, who owns several Teslas himself – including original Roadster models – has been raising concerns about the software for years.
“It’s been two and a half or three years since we pointed this out in a Super Bowl advert,” O’Dowd told The Register. “They just don’t care. They have other priorities, like a robotaxi working on Austin streets, that is a priority. Elon sets priorities, and he’s never made safety a priority.”
The staged demo was not the first time Tesla has had problems with school bus lights. In 2023, the US National Highway Traffic Safety Administration reportedly investigated an incident in which student Tillman Mitchell was struck by a Tesla Model Y after exiting a stopped school bus with its red warning lights flashing. The driver was allegedly using Tesla’s earlier driver-assist system, Autopilot, and had affixed weights to the steering wheel to fool the hands-on detection, according to The Washington Post.
Musk has said that his car business is already testing driverless Model Y cars in Austin “a month ahead of schedule.” That said, his deadlines tend to be fairly flexible – he’s aiming to land an uncrewed Starship on Mars by the end of 2026, but so far the rocket continues to fail key test flights.
Google offshoot Waymo already operates a self-driving taxi service in Austin, and it has been running a similar service in San Francisco for over a year. The difference, according to O'Dowd, is that Waymo vehicles use expensive Light Detection and Ranging (LiDAR) sensors, while Tesla eschews these in favor of cheaper vision-based components and relies on software to fill in the gaps.
Self-driving tech is difficult – just ask Cruise, the GM-backed robocar business that burned through billions trying to make the technology work before throwing in the towel last year, after a well-publicized incident in which a Cruise car dragged a pedestrian who had been knocked down by another vehicle.
“There’s a million people who die in car accidents every year,” said O’Dowd. “And if we get everybody on quality software that’s better than human drivers, like Waymo, we will save hundreds of thousands of lives per year. That’s absolutely true. But it won’t be Elon. It won’t be Tesla.”
Tesla had no comment at the time of going to press. ®