Video is pretty scary. The software doesn’t seem to have even a remote idea of what commands it should be sending to the steering system, but it sends them anyway, causing the car to jerk back and forth between “I think I turn here” and “I think I go straight.”
What cities are these garbage cars being tested in, so I can avoid driving, biking, or walking there?
After wrecking the suspension of 2 e-scooters on downtown's craptastic roads, I can say nothing of value would be lost by avoiding downtown ATX.
Scary indeed. The steering input looks like an unstable (increasing-amplitude) oscillator.
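For the curious, here's a toy sketch of how that kind of growing weave can fall out of nothing more exotic than a proportional lane-centering correction applied with some actuation latency. The model and every number below are illustrative assumptions on my part, not anything measured from the video.

    DT = 0.05            # 50 ms control step
    SPEED = 20.0         # ~20 m/s (72 km/h)
    GAIN = 0.2           # proportional gain on lateral offset
    DELAY_STEPS = 5      # ~250 ms between deciding and applying a command

    offset, heading = 0.3, 0.0       # start 0.3 m off the lane center
    queue = [0.0] * DELAY_STEPS      # steering commands waiting to take effect

    for step in range(1, 121):
        queue.append(-GAIN * offset)    # "steer back toward the center"
        heading += queue.pop(0) * DT    # ...but the correction arrives late
        offset += SPEED * heading * DT  # small-angle lane-keeping kinematics
        if step % 10 == 0:
            print(f"t={step * DT:4.1f}s  offset={offset:+6.2f} m")

The late-arriving correction is always a little out of phase with where the car actually is, so each swing across the lane ends up wider than the last.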
Looks like they don't have lane-accurate maps and are still following their vision-only doctrine.
This seems like a failure mode where LIDAR wouldn't have helped. A completely flat, mostly empty road where the only real features are the lane lines is entirely a vision problem. That's not really an excuse for Tesla, considering vision is supposed to be what they're good at, but the LIDAR angle seems overemphasized here.
> flat … empty … only real features are lane lines
You mean a road? This is hilariously awful.
I'm pretty sure other companies solve this with very precise lane-level maps. Tesla apparently still relies on vision only.
Lidar could help by providing accurate positioning, though I guess cameras should be good enough for lane position when they scan surrounding features too.
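For what it's worth, here's roughly what "solving it with precise maps" looks like in code: once you have a surveyed lane centerline and a decent pose estimate (GNSS, lidar, or camera localization), the lane offset is a geometry lookup rather than a per-frame vision problem. This is a made-up minimal example, not any company's actual pipeline.

    import math

    def lateral_offset(car_xy, centerline):
        """Signed distance (m) from the car to a polyline lane centerline.
        Positive means left of the lane direction, negative means right."""
        cx, cy = car_xy
        best = None
        for (x1, y1), (x2, y2) in zip(centerline, centerline[1:]):
            dx, dy = x2 - x1, y2 - y1
            # project the car onto this segment, clamped to the endpoints
            t = max(0.0, min(1.0, ((cx - x1) * dx + (cy - y1) * dy) / (dx * dx + dy * dy)))
            px, py = x1 + t * dx, y1 + t * dy
            dist = math.hypot(cx - px, cy - py)
            # cross product sign tells us which side of the centerline we're on
            side = math.copysign(1.0, dx * (cy - y1) - dy * (cx - x1))
            if best is None or dist < abs(best):
                best = side * dist
        return best

    # a straight mapped centerline along the x-axis, car 0.4 m to its left
    print(lateral_offset((12.0, 0.4), [(0.0, 0.0), (50.0, 0.0)]))   # ~ +0.4

The hard part, of course, is keeping the map fresh and the pose estimate accurate, which is presumably the dependency Tesla's vision-only approach is trying to avoid.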
How do we get the NTSB to stop this?
Vote in 2026 to make sure we can vote for a different NTSB in 2028.
Stop what? Autonomous driving? Stop Waymo too?