Earlier this year I entrusted my life to a computer, riding in the passenger seat while one of Waymo’s self-driving Jaguars ferried me across Los Angeles.
Recently, Tesla introduced its long-awaited, and arguably long-overdue, Full Self-Driving (FSD) technology to Australia, claiming your Model Y or Model 3 can now operate without any input from the driver.
But I will never trust a Tesla the way I trusted the Waymo One. More than that, I have serious concerns about the impact its introduction will have on our roads, with the potential to make them more dangerous as people become too reliant on a questionable system.
As we reported earlier this week, Tesla is already using FSD for its Robotaxi service in Austin, Texas. There have already been three reported accidents; fortunately none were serious, but it should alarm everyone that driverless cars are hitting the roads and running into trouble only a few weeks in.

Experts far smarter than me have expressed concerns about the Tesla FSD system, which relies entirely on cameras and uses no LIDAR or radar to help map out the obstacles around it.
That is the fundamental difference between Tesla and Waymo: the latter uses not only cameras but also LIDAR and radar to minimise risk as much as possible.
In the US, the National Highway Traffic Safety Administration (NHTSA) has been investigating more than 2.4 million Tesla vehicles over possible FSD malfunctions, specifically the possibility that sun glare impaired the cameras – something that is not an issue for LIDAR and radar.
Tesla boss Elon Musk has dismissed LIDAR as unnecessary, but it’s telling that the Texas Robotaxi fleet features a Tesla employee in the front passenger seat as a ‘safety monitor’, ready to step in and stop the car if a problem occurs.
A large part of the problem with Tesla isn’t the technology, it’s the branding. Tesla has long positioned itself as a brand looking to push the boundaries and shake up the car industry. But in my opinion it has often done so recklessly.
Its ‘Autopilot’ system was largely the same active cruise control and lane-keeping assist that many other brands offered, but the name implied something more. Despite being questioned on the potential danger of owners misunderstanding Autopilot’s limits and relying on it too heavily, Tesla dismissed any concerns, saying its disclaimer covered the limitations.

Among those limitations was the fact that the ‘Autosteer’ function was listed as being in ‘Beta’ form when we tested the new Model Y in April this year. My view is that car makers should not be beta-testing any system on customers, least of all anything to do with safety. But Tesla clearly does not share that view and is comfortable putting incomplete safety systems into production vehicles.
Tesla and safety are often linked in unfortunate ways. In August this year Tesla was ordered to pay US$200 million to the family of a young woman in Miami who was run over and killed by a Tesla operating on Autopilot. This is not a simple case of liking or disliking Tesla, or even of questioning the technology – this is a literal matter of life and death.
Now FSD has arrived, and while the name stands for ‘Full Self-Driving’, autonomous vehicle experts have said it falls well short of Level 5 autonomy – a system capable of driving itself in all conditions with no human attention at all. So there is now a clear and present danger that Tesla drivers will simply turn on FSD, sit back and not pay proper attention to the road.
Of course they should be paying attention and remain in ‘control’ of their Tesla; laws around Australia make it clear that the driver remains responsible for any accident. But the reality is that will be nearly impossible to police.
I’m not suggesting that Tesla’s FSD is dangerous and should be banned, but it must be well regulated and used properly by everyone who pays the $10k to download it into their car. There is no doubt this is a clever, helpful system that people will embrace, but it must be embraced with care and with an understanding that computers are simply not ready to replace humans – as much as some car companies would like them to be.
This is just one man’s opinion, but the reality is Tesla is putting everyone on the road at risk of an FSD mistake that has the potential to be fatal, because we have to share the road with these cars whether we like it or not. But I, personally, would never trust Tesla’s FSD to drive me, not until it embraces the same broad spectrum of tools, including LIDAR and radar, and demonstrates a clear pattern of safety over an extended period.