Justine Saint Amour bought her Cybertruck from a Florida dealer in February 2025 with the Full Self-Driving package included. On 18 August 2025, she was driving northbound on the Eastex Freeway with Autopilot engaged when the vehicle approached the Y-junction near the Houston Metro 256 Eastex Park and Ride interchange. The road curves right. The Cybertruck went straight.
Saint Amour disengaged Autopilot when she realised the truck was not going to make the turn. It was too late. The Cybertruck hit the concrete barrier head-on. The dashcam captured the whole sequence. Saint Amour was left with two herniated discs in her lower back, a herniated disc in her neck, sprained tendons in her wrist, and neuropathy causing numbness and weakness in her right hand. She is now suing Tesla in Harris County District Court for more than $1 million.
Her attorney, Bob Hilliard, was direct in his public statements.
"Tesla's decisions made Justine's accident inevitable. This company wants drivers to believe and trust their life on a lie: that the vehicle can self-drive and that it can do so safely. It can't, and it doesn't. The dashcam footage shows the type of foreseeable scenario where redundancy and override systems matter most."
The lawsuit accuses Tesla of negligence and gross negligence, misrepresenting the capabilities of its autopilot system, failing to include LiDAR or adequate backup braking systems, and providing insufficient warnings to drivers. It also names Elon Musk personally, alleging he overrode Tesla engineers' recommendations to include LiDAR, choosing instead what the filing describes as "cheap video cameras," and that his continued involvement in vehicle design constitutes a danger to drivers.
The lawsuit goes further, making an allegation unusual even by the standards of automotive litigation: that Tesla was negligent in hiring and retaining Musk as CEO at all.
The LiDAR argument sits at the centre of this case and has been building across Tesla litigation for years. Musk has consistently dismissed LiDAR as unnecessary, arguing cameras are how humans navigate the world and that vision-based AI is the correct approach. His competitors have gone the other way. Waymo runs LiDAR. Mercedes' Drive Pilot system, which holds limited SAE Level 3 certification in California and Nevada, uses it too. Whether the absence of LiDAR caused this specific failure is a technical question that expert witnesses will argue in court. What the lawsuit does is place Tesla's hardware philosophy on trial alongside its software, a broader legal attack surface than most prior cases attempted.
The Y-junction failure described in the Eastex lawsuit is not an exotic edge case. It is a standard freeway interchange on a Tuesday morning commute in a major American city. If the system cannot reliably navigate that, the words Full Self-Driving are doing considerable damage.
This case lands in a crowded legal environment. In January 2026, a federal judge upheld a $243 million verdict against Tesla in a separate Autopilot crash case. A California judge ruled in December 2025 that Tesla's FSD marketing was, in her words, "actually, unambiguously false and counterfactual." NHTSA currently has 2.88 million Tesla vehicles under investigation for FSD-related incidents, with 58 documented crashes connected to the system including cases where FSD directed vehicles into opposing lanes and through turn-only intersections. Tesla has requested multiple extensions on the deadline to supply crash data to NHTSA. Tesla's Robotaxi programme in Austin, meanwhile, has been producing crashes at roughly four times the human driver rate.
The consistent defence across Tesla's litigation is that Autopilot and FSD are SAE Level 2 systems, meaning they require active driver supervision at all times and the driver remains legally responsible for the vehicle's actions. Tesla states this clearly in its documentation. The marketing, by contrast, uses names like Full Self-Driving and Autopilot, and Musk has repeatedly described in public what fully autonomous Tesla vehicles will be capable of, on timelines that have consistently proved optimistic. Telling drivers in the manual that they must supervise the system while telling them in the advertising that the car drives itself is not a contradiction courts have found easy to resolve.
Saint Amour's case is exactly what the legal term "foreseeable harm" was designed to describe. A Y-shaped overpass junction is not an unusual road scenario. The system failed to navigate it. The dashcam shows what happened. Whether that makes Tesla liable, negligent, or both is what Harris County is going to decide.
Sources: Electrek, 11 March 2026 | Carscoops, 12 March 2026 | Jalopnik, 12 March 2026 | Newsweek, February 2026 | CarComplaints, February 2026 | Austin American-Statesman | NHTSA FSD investigation records | Harris County District Court filing: Justine Saint Amour v. Tesla, Inc.
