We’re in the market for a Tesla once Level-4 self-driving finally becomes available, so this gave us an opportunity to try out a new 2020 Tesla Model X. In addition, I was very interested to see how far Tesla’s self-driving features have progressed over the past year or so, since our last trip in a Tesla Model 3.
As someone who works with AI and ML professionally, I’m very interested in the technology behind self-driving cars. In addition, as someone who hates driving in rush-hour traffic on the Las Vegas strip, I’m also very interested in owning a car that can drive itself. So, I decided to report on my experience this weekend for those of you who are interested in learning more about the state-of-the-art in self-driving automobiles.
For the duration of our trip from August 7–11, 2020, we drove a 2020 Tesla Model X running software version 2020.28.6.
Interstate and highway driving are essentially a solved problem for self-driving cars in 2020. Driving was almost completely hands-free during the entire trip from Las Vegas to San Diego. The Tesla successfully stayed in its lane, navigated around all vehicles, switched lanes automatically, and took the correct off-ramps. In fact, our Tesla was able to manage almost everything that I-15 and I-5 threw at it.
There were only a few places on the interstate where it seemed to have any issues. For example, it was a bit quick on the brakes when large vehicles began to drift too close to our lane. It could not detect and avoid some types of debris on the road (e.g., large chunks of a blown tire). It did not switch lanes to give stranded roadside vehicles adequate clearance. And it had some difficulty when new lanes appeared suddenly or when the lane markings were heavily faded or obscured.
Once intersections become involved, the Tesla operates quite differently. The car correctly detects and stops at all stop signs, as well as red or yellow lights at signal-controlled intersections. However, the current version of the software cannot make left- or right-hand turns on its own. The car alerts you that it's at an unsupported intersection, and you, as the driver, need to make the turn yourself.
For straight-through intersections, the car needs to be given “permission” to drive through the intersection unless there is another vehicle in front of you leading the way. If there is a vehicle in front of you, it follows the other vehicle’s lead through the intersection. However, if you are the first (or only) vehicle at an intersection, you need to quickly tap the cruise-control stick towards you, to tell the car that it’s safe to proceed.
This is clearly a safety feature for the current beta version of the full self-driving software. It allows a human driver to be “in the loop” to avoid accidents while Tesla is collecting more data about how to safely traverse intersections. However, Elon Musk says that the alpha version of Tesla’s self-driving software can handle intersections unassisted and this update should be coming before the end of 2020.
The Tesla correctly identified pedestrians, cyclists, construction cones, garbage cans, and other obstacles around our vehicle. We didn’t encounter a situation where a pedestrian walked directly in front of our moving vehicle, so we were not able to see how it handled this situation. Based on the car’s other accident avoidance features though, I’m guessing it would have handled this situation just fine.
However, the car did not give cyclists alongside our vehicle as much clearance as I would have expected. As an avid cyclist myself, I always give cyclists several feet of clearance, even if that means drifting across a double line to give them the space they need. In Tesla's defense, though, the owner's manual specifically recommends exercising extra caution when using Autopilot around cyclists.
The user interface was relatively simple, intuitive, and straightforward. If you can operate a car's cruise control and turn signals, you can use a Tesla in Full Self-Driving (Beta) mode. Beyond that, you just have to keep your hands on the steering wheel and apply just enough tension to let the car know you're not playing video games on your cell phone or asleep at the wheel.
In addition, the Full Self-Driving Visualization Preview provides you with a simplified view of the environment around your car. It includes simple visual representations of cars, trucks, motorcycles, pedestrians, traffic cones, garbage cans, and other objects around you. It's quite useful for understanding how the car perceives its environment in vector space.
However, I was a little disappointed that I was unable to turn on the full self-driving diagnostics visualization mode. I was hoping to see what the car sees through its various cameras and sensors, including overlays for object detection, depth sensing, and path prediction. I realize that these types of diagnostic visualizations are only interesting to technicians like me, though, so I completely understand why they are not part of the standard user interface.
Driving a Tesla with Full Self-Driving (Beta) is a very different experience from driving a conventional car. It felt more like I was a Driver's Ed instructor supervising a student driver. Much like a real student driver, the car does a great job with the basics, but you have to give it guidance, drawn from your many years of driving experience, when it encounters a novel situation.
In addition, the car drives itself the way you would expect a computer to drive a car. It stays perfectly centered in its lane; it accelerates, decelerates, and turns mechanically and efficiently; and it handles corners as if it innately understands the physics of the road, but not necessarily the psychology of the passengers along for the ride.
Unfortunately, it is missing many of the "human touches" that we human drivers provide. For example, it doesn't give way to cars that want to merge into your lane, it doesn't recognize that the cyclist in front of you is signaling for you to pass, and it doesn't wave to the driver behind you to thank them for letting you into their lane. Essentially, it's a very efficient driver but not a very considerate driver.
The biggest difficulty I had was learning to trust the machine. My initial reaction when the car did something I was not expecting was to take control and handle the situation myself. However, as I learned to trust the car more, I gave it more leeway to see why it was doing what it was doing. Surprisingly, in many cases, what it was doing was actually more efficient, or possibly safer, than what I would have done myself.
Ultimately, as a result of this experience, my wife and I plan to purchase a Tesla once they have feature-complete Level-4 self-driving capabilities in production. In the meantime, it's worth noting that, according to Tesla's own safety statistics, a Tesla with Full Self-Driving (Beta) enabled is still roughly 10x safer than the average human driver.
To learn more about AI and how it will impact you, your career, and our world, be sure to check out my free online course Artificial Intelligence: Preparing Your Career for AI.