Orphan Sues Tesla, Claiming Family Died in Crash Because Safety Features Failed
Concerns are growing over the reliability of Tesla's Autopilot and Full Self-Driving systems.

A young New Jersey man who lost his family last summer when their Tesla crashed into a concrete barrier on the Garden State Parkway claims that the EV's "Autopilot" feature should have engaged to prevent the fatal accident.
Max Dryerman, a 20-year-old student from Woodcliff Lake, and his two aunts filed a wrongful death suit in federal court at Camden, New Jersey, on behalf of the family estate, claiming the braking system and other safety features in their Tesla Model S were defective.
Mr. Dryerman was at school at Philadelphia's Drexel University when his sister, Brooke, 17, and his parents, David and Michele, were killed in the crash in Woodbridge just before midnight on September 14. The family was returning to their Bergen County home after attending the Sea Hear Now music festival at Asbury Park. Their car was traveling northbound on the toll highway when it suddenly veered off the road to the left, striking a sign and a guardrail before crashing into a concrete bridge support.
The lawsuit alleges that Tesla's "Autopilot" and other driving assistance features, like forward collision warning and lane departure avoidance, should have prevented the car from careening off the road.
“Despite the vehicle camera system [detecting] an approaching stationary obstacle,” the complaint states, according to local reports, “the vehicle continued — without braking or reduction in acceleration or engine torque — into the stationary obstacle.”
“Thousands of Tesla drivers have relied on Tesla’s [technology] as though it were capable of safe, fully autonomous self-driving … when in fact it is incapable of safely handling a variety of routine roadway scenarios without driver input.”
Tesla officials did not immediately respond to a request for comment.
Questions have long been raised regarding the safety of Tesla vehicles, with particular concern centered on their autonomous features.
Earlier this month, the car manufacturer launched its "Robotaxi" pilot program in Austin, Texas, despite widespread concerns about the safety of its self-driving systems.
In a demonstration at East Austin on June 13 by a consumer advocacy group, The Dawn Project, attended by The New York Sun, a self-driving Tesla Model Y, the same model planned for the Robotaxi program, repeatedly failed to make emergency stops for hazards on the road.
Mannequins simulated children running toward a school bus. This reporter witnessed eight trials in which the self-driving Tesla struck the dummies every time despite having ample time to stop.
Since October 2024, investigators at the National Highway Traffic Safety Administration have verified 44 fatalities involving Tesla vehicles with Full Self-Driving mode engaged.
"Tesla is a massive outlier. Nothing's even close to it," the founder of The Dawn Project, Dan O'Dowd, told The New York Sun. "Ninety percent of all the accidents that have occurred with self-driving or partially self-driving cars are Teslas."