Supercharged

Episode 4: Autopilot

The rise of electric vehicles is set to go hand in hand with the introduction of autonomous technology. Tesla is working to make their vehicles fully autonomous via software updates. So, is Tesla’s consumer-first approach the right path to true autonomous driving?

Credits

Executive Producer & Host: Kristofor Lawson.

Mixing & Production by James Parkinson.

Jasmine Mee Lee is our assistant producer.

Artwork by Andrew Millist.

Theme Music by Nic Buchanan.

Other music in the episode from Breakmaster Cylinder, and our ad music comes from Epidemic Sound.

Transcript

News anchor: To the deadly crash - the car on autopilot, using advanced technology. A big feature of the Tesla vehicle in question is that it allows drivers to let go of the wheel. ABC’s Marci Gonzalez on what the company says went wrong.

Marci Gonzalez: Tonight Tesla confirming this car was in Autopilot mode when it crashed in Northern California killing the driver. Walter Huang was behind the wheel of the nearly $80,000 car, heading into work at Apple just before 9:30am, March 23rd. Tesla saying “the Model X sent Huang several warnings to put his hands on the wheel earlier in the drive, but his hands were not detected on the wheel for six seconds prior to the collision.”

KRIS: That’s a broadcast from ABC News in April 2018. Between 2016 and 2019 there were at least three confirmed deaths involving a Tesla vehicle where Autopilot - Tesla’s autonomous driving technology - was partly blamed for the crash.

KRIS: When we talk about the future of transportation, there are usually two parts to that conversation: the first is around electrifying vehicles, and the second is about creating cars with self-driving capability. For many companies that are working on autonomous technology - like Waymo and Uber - the systems are often implemented into existing petrol-powered vehicles. However, there’s a broad consensus that autonomous vehicles will likely be electric, and in Tesla’s case, they’re already combining both technologies, producing fully electric cars that they say are capable of becoming fully autonomous within a few years.

KRIS: But while autonomous driving technology is still in its infancy, Tesla is determined to fast-track the technology and future-proof their vehicles, going against the rest of the industry in the process. So is Tesla’s consumer-first approach the right path to true autonomous driving?

KRIS: From Lawson Media, this is Supercharged. A show about power, conflict, and the people who are driving change. I’m Kristofor Lawson, and this season we’re exploring electric vehicles, and how Tesla is forcing the entire automotive industry to move towards an electric future.

KRIS: This is episode four: Autopilot.

[Episode Montage]

KRIS: Every Tesla vehicle has an on-board computer, and just like your smartphone can receive software updates, a Tesla can too. In October of 2015, Tesla rolled out Version 7 of their software, which included an advanced driver assist feature they called “Autopilot”. It’s the company’s goal to refine the Autopilot software over time, to the point that their vehicles will be completely capable of driving themselves. But since this technology has been in the hands of the public, a number of incidents associated with Autopilot have seen its reliability and safety heavily criticised by both the media and authorities.

KRIS: On the 7th of May 2016, near Williston in Florida, a Tesla Model S driven by 40-year-old Joshua Brown struck a semi-trailer, travelling underneath it and shearing off the roof of the car. The impact killed Joshua, who was the sole occupant of the vehicle. Following an investigation by the National Transportation Safety Board, or NTSB, the cause was found to be both the truck driver’s failure to give way, and Joshua’s inattention due to overreliance on Autopilot. The collision report went on to determine that a contributing factor to that overreliance was the vehicle’s operational design, and use of the feature in ways that were inconsistent with the guidance and warnings from Tesla.

KRISTIN POLAND: The system wanted the driver to be engaged in the driving task and had a method of trying to understand when the driver was engaged in the driving task. But that was just a surrogate method. So it's a torque sensor on the steering wheel.

KRIS: This is Kristin Poland. She’s the Deputy Director at the NTSB’s Office of Highway Safety. And in this case, the collision report concluded that the sensor on the steering wheel alone was a poor method of determining whether a driver is visually engaged. Driving is inherently a very visual task, but obviously you can still place your hands on the steering wheel without actually paying attention to the road.

KRISTIN POLAND: If the torque sensor doesn't sense torque on the steering wheel, then it has a perception that the driver's hands aren't on the steering wheel. And that may be a surrogate measure for the driver's engagement. In the Williston crash the vehicle did not sense the torque on the steering wheel for a long period of time, and yet did not give any sort of alerts to the driver in critical times that the hands needed to be on the steering wheel. In fact, it was very long durations that the system could detect hands off the steering wheel before any sort of warning was issued.

KRIS: According to the report, Autopilot was active for 37 minutes, during which Joshua placed his hands on the steering wheel on seven different occasions for a total of just 25 seconds. It also determined that the longest period between alerts, where Joshua’s hands were not detected on the steering wheel, was nearly six minutes.

KRISTIN POLAND: We also focused on the operational design domain. So how are these vehicles designed and what is the environment that they're designed to operate in? In the Williston crash, the vehicle was not operating within its design domain. So the Tesla system was designed to operate on limited access roadways. And yet the system was able to be enacted or used on any type of roadway. And the Williston crash was on a roadway that did have a divided highway but it wasn't limited access, meaning that there was cross traffic. And that system is not able to detect cross traffic at this time. And so that was then outside its design domain.

KRIS: Following their investigations, it’s part of the NTSB’s role to make safety recommendations. These are put forward to the US Department of Transportation and other governing bodies, as well as the manufacturers of vehicles equipped with Level 2 automation systems - more on that shortly. In response to the Williston crash in 2016, Tesla made several design changes to Autopilot, via firmware updates. They reduced the period of time that Autopilot will allow a driver to keep their hands off the steering wheel before being warned. And if the driver is warned on three separate occasions by alerts, the ‘Autosteer’ function deactivates, becoming unavailable for one hour, or until the car is restarted.
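
The escalation behaviour described above is, in effect, a small state machine: track how long the torque sensor has gone without detecting hands, warn the driver, and lock Autosteer out after repeated warnings. Here’s a minimal sketch of that logic in Python - the hands-off threshold is an assumption for illustration, since Tesla’s actual timing varies with speed and road type and isn’t public at this level of detail.

```python
# A minimal sketch of the post-2016 Autosteer escalation logic described
# above. The warning threshold is illustrative only; Tesla's real timing
# varies with speed and road type.
HANDS_OFF_WARNING_S = 30.0  # assumed seconds without detected torque before a warning
MAX_WARNINGS = 3            # three warnings, per the updated design
LOCKOUT_S = 3600.0          # Autosteer unavailable for one hour (or until restart)

class AutosteerMonitor:
    def __init__(self) -> None:
        self.hands_off_s = 0.0
        self.warnings = 0
        self.locked_out_until_s = 0.0

    def tick(self, now_s: float, dt_s: float, torque_detected: bool) -> str:
        if now_s < self.locked_out_until_s:
            return "autosteer unavailable"
        if torque_detected:
            self.hands_off_s = 0.0  # hands on the wheel reset the timer
            return "ok"
        self.hands_off_s += dt_s
        if self.hands_off_s >= HANDS_OFF_WARNING_S:
            self.hands_off_s = 0.0
            self.warnings += 1
            if self.warnings >= MAX_WARNINGS:
                self.warnings = 0
                self.locked_out_until_s = now_s + LOCKOUT_S
                return "autosteer deactivated for one hour"
            return "hands-on warning"
        return "ok"
```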

KRIS: One recommendation that Tesla haven’t implemented though is an additional monitoring system. This could be in the form of a camera that detects whether the driver’s eyes are focused on the road in front of them. Despite this being a key recommendation that would help to improve safety, pressure on the steering wheel remains the only way that Autopilot monitors driver attention.

KRIS: And since that 2016 crash, there have been further collisions linked to Autopilot. A very similar incident occurred on the 1st of March 2019, also in Florida, and involving a Tesla Model 3. The car collided with a semi-trailer, striking its left side, passing underneath and taking off its roof, resulting in the death of the 50-year-old driver. There have also been cases where death or injury didn’t occur, but the involvement of autonomous technology still prompted the NTSB to investigate. Here’s Kristin.

KRISTIN POLAND: We have investigated a number of other crashes and a couple of those are now public. There was another non fatal crash in Culver City, California involving a Tesla Model S that was operated in an HOV Lane. In that case, there was a fire truck that was stopped in the lane, lending assistance to a crash that happened on the other side of the roadway earlier in the day. This was a cut-out scenario where the vehicle was following a lead vehicle at a relatively low speed, because there was traffic, around 20 miles per hour. And when that last vehicle changed lanes to the right, to avoid the fire truck, the Tesla then remained in the lane. And then because it was no longer following a lead vehicle, it started to accelerate to its pre-set cruise speed. About a half a second before it impacted the fire truck, it did give a Forward Collision Warning, but it didn't activate automatic emergency braking or react to the fire truck, nor did the driver take any action. And so it impacted at around 30 miles per hour into the back of that fire truck.

KRIS: Across all of these cases - including the 2018 crash we heard at the top of the show - the probable cause determined by the NTSB has been consistent: Autopilot’s operational design, and overreliance on that technology from the driver. An additional monitoring system could be a technical solution to help reduce driver complacency, but in placing this technology in the hands of drivers, Tesla also have a responsibility in the way that they market Autopilot to customers. A company statement following the Williston crash investigation claimed they would, quote, “continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology.” But that message isn’t always clear, particularly when it’s coming from the CEO, Elon Musk.

Elon Musk: “All cars being produced have all the hardware necessary - compute and otherwise - for full self-driving.”

DANA HULL: You know, Tesla has stressed that the driver needs to maintain control at all times.

KRIS: This is Dana Hull - she reports on Tesla for Bloomberg News.

DANA HULL: But there are YouTube videos of plenty of people doing things like reading the newspaper or sitting in the backseat while the car is driving, and even Elon Musk himself, when he's demonstrated the technology, has made a point of taking his hands off the wheel and showing that the car is capable on its own.

Bloomberg News - Elon Musk: So it’s on full Autopilot right now.

Bloomberg News - Betty Liu: Okay.

Bloomberg News - Elon Musk: I’m not touching anything, no hands, no feet, nothing.

KRIS: That’s Elon Musk demonstrating Autopilot with Bloomberg’s Betty Liu in October 2014, prior to its public release the following year. And here he is speaking at Tesla’s Autonomy Day event in April 2019.

Elon Musk: I’ll say that again, all Tesla cars being produced right now have everything necessary for full self-driving. All you need to do is improve the software.

KRIS: Elon goes on to admit that advancing the software is still a significant challenge to overcome.

Elon Musk: The software problem here should not be minimised, it’s a massive software problem that...yeah. Managing vast amounts of data, training against the data, how do you control the car based on the vision. It’s a very difficult software problem.

KRIS: But then moments later, Musk is predicting that Tesla will launch their first robo-taxi as soon as 2020.

Elon Musk: And we expect to have the first operating robo-taxis next year, with no one in them, next year. I feel very confident predicting autonomous robo-taxis for Tesla next year. Not in all jurisdictions, because we won’t have regulatory approval everywhere, but I’m confident we’ll have at least regulatory approval somewhere, literally next year.

KRIS: Of course, you don’t get all of the details at these types of media events, but the messaging is enough to blur the lines between the future potential of this technology and its current limitations. Even the description on Tesla’s website for the upcoming Model Y - scheduled for release in 2020 - suggests that Autopilot is further along than you might think. Quote: “Model Y will have Full Self-Driving capability, enabling automatic driving on city streets and highways, pending regulatory approval, as well as the ability to come find you anywhere in a parking lot.”

KRIS: Now, Elon Musk has achieved some incredible things that people once thought impossible - like creating reusable rockets with SpaceX - but he’s also known for bold predictions that don’t always align with reality. And when an influential CEO like Musk makes these kinds of ambitious projections, how does that contribute to the public perception of autonomous cars?

Elon Musk: So, all these things, I said we’d do them - we did it. I said we’d do it, we did it. We’re going to do that robo-taxi thing too. The only criticism, and it’s a fair one, is sometimes I’m not on time (laughs). But I get it done, and the Tesla team gets it done.

KRIS: Joshua Brown, the victim in the Williston crash in 2016, actually posted several videos to his personal YouTube channel, demonstrating Autopilot while driving his Tesla Model S. They were still online at the time of recording, and from watching these, Brown clearly loved the driving experience, and he does sound like he had a good understanding of the system’s capabilities. And yet, the NTSB investigation concluded that he still came to over-rely on the technology.

Joshua Brown: Here we are, stop-and-go traffic 271 and ah jeez, car’s doing it all itself. I don’t know what I’m going to do with my hands down here. It takes all the stress out of it, tell you what, it’s like “meh”. Get to your destination slightly slower but eh, at least now you don’t have to worry about anything. Just let it go.

KRIS: The element of overreliance by experienced Tesla drivers is something that Elon Musk acknowledges. But he still places that responsibility solely on the driver. Of course, all drivers take on a responsibility when they get behind the wheel of any vehicle. But a Tesla is not like most other cars. It’s clear there is still a lot of misunderstanding around the technology, and you could argue that some of that is also due to the Autopilot name itself. Here’s Dana Hull again.

DANA HULL: Yeah, I mean I think Tesla has been criticised for the way they’ve marketed the technology. Certainly the name ‘Autopilot’ gives you this sense that you can just sort of sit back and let the car or the car's computer drive itself. When we think of autopilot, we tend to think of the airline industry, where the pilot is in the cockpit at the controls but the plane can largely fly itself. And you know, ‘autopilot’, it sounds like automatic in some ways.

DANA HULL: I think it is a bit of a challenge in terms of marketing it. I think what happens is Tesla owners who have the car, they get used to Autopilot, and they're more aware of what it's capable of, and then they can at times over rely on it. So, there's still a long way to go in terms of just making sure that everybody understands what this technology can and cannot do. And Tesla's always updating the technology, so the Autopilot is capable of more and more every day, and they're able to update the software over-the-air to people.

DANA HULL: ...I do think that the news coverage of the crashes is a little bit overblown, but yeah, I mean the name ‘Autopilot’ makes it sound like it's slightly capable of doing more than it really is, at least at the moment.

EDWARD NIEDERMEYER: If they were pursuing this approach to autonomy as a research project, that would be fine. You know, maybe they'll be able to get that technology there at some point. What they're doing though, is actually product development.

KRIS: This is Edward Niedermeyer, author of Ludicrous.

EDWARD NIEDERMEYER: The critical distinction here is that you don't take customer cash for a research project, you take customer cash because you have scoped - because in the course of your research you've determined that something is in fact possible, you've created a spec for that product and you know you have a path to get there. Even then taking cash from the customers for that product is risky. People are not good at monitoring automation. So if a system is good enough to keep you, you know, to take care of driving, 90% of the driving, but 10% of the time, you'll have to step in and save it or, you know, save yourself, that's very hard. Driving is hard, and humans are generally not always that great at it. But at least if you're driving, you're driving, and you're engaged in the task, and you have to be, to some extent engaged in the task. When you start to automate, you know, the pedals and then the steering wheel, you know, this is when you get into this weird place where you're no longer driving, you're monitoring it.

KRIS: As we’ve already explored in this series, Tesla as a company has rapidly evolved since its founding in 2003. What began as a fun startup driven by a love for fast cars and a focus on the premium auto market, quickly turned into a global mission for ‘accelerating the world’s transition to sustainable energy’. And in recent years, the promise of full self-driving capability, has become a huge messaging focus for the company. And as I discovered talking to some current Tesla customers, it’s a real draw card.

MARK TIPPING: So we’ve driven about 120,000km without my hands on the wheel, and just watching the car make decisions and making sure it’s making the right decision.

KRIS: This is Mark Tipping, who we heard from in the first episode this season. He’s the owner of a Model S, and as you can hear, he loves using Autopilot.

MARK TIPPING: So basically, on all the boring bits, like right now in bumper-to-bumper traffic, or going down the freeway where it’s boring, you can give the car to drive. But when it comes to some tight corners and a little bit of fun, well you take over and drive the car. So you get to do all the fun bits of driving - if you like driving, like I do - but you don’t have to do the boring bits to get there.

KRIS: Some people might buy a Tesla because they’re environmentally conscious. But Mark says one of the main selling points for him was the Autopilot feature.

MARK TIPPING: Well, the fact that it’s electric isn’t really the big thing. The fact that it’s got incredible performance and the fact that I’ve just let go of the steering wheel, and I’m not looking at where the traffic is - well, I am watching a bit - but the car’s driving itself right now.

KRIS: Okay...

MARK TIPPING: And it will drive with the traffic, it will slow down, it will stay in the lanes, and I can just relax and watch to make sure that no silly person runs across the road or something. But yeah, that’s the biggest one.

KRIS: You don’t have to be a car enthusiast to buy a Tesla, but it seems to be a common trait that many Tesla drivers do have a deep understanding of how their vehicle operates. Mark is certainly one of those people - and so is Adrian Stone, another owner that I got to take a ride with. And there was no doubt that Autopilot had made a genuine impression on him.

ADRIAN STONE: I literally drive in ‘self-driving’ in every single opportunity - I’ve just done it now. And I’ll stay in self-driving mode until there’s an issue with the traffic to pull me out of it. Whereas in every other car, you’ve got to concentrate on the traffic the whole time.

KRIS: For Adrian, Autopilot has practically become his default method of driving his Tesla.

ADRIAN STONE: So I use self-driving in any potential opportunity I have. And that really means any time it tells me that self-driving is available, and I’ll double click and turn it blue and I’m in self-driving. And I’ll let the car tell me when it can’t handle it and pop itself out. So for example, we’re on a road now, that’s got a very washed out lane marking on the left-hand side - in fact the Tesla can’t even see it because it hasn’t shown them - and it’s got a curb on the right-hand side. But yet, the Tesla has decided that there’s a drivable route here and is allowing the self-drive. And the car has started by itself, as the traffic has moved, and it’s going around the curb - there’s a car parked on the curb, it’s keeping a safe distance from that all by itself. So, self-driving means that I’m effectively in an Uber without a driver.

KRIS: But that isn’t really true of where this technology is right now - and we’ll get into why in a moment - though you wouldn’t know it from the way that many drivers like Adrian think and talk about Autopilot.

ADRIAN STONE: I mean, we’re talking and I’m driving and I’m paying very low attention to actually, the road or where we’re going, cause the car is doing most of it. But I’m still watching for hazards.

[Montage of Adrian saying ‘Self-Driving’]

KRIS: Adrian says he does keep a close eye on things - and during this ride I did see the driver warning system in action. But Adrian was also willing to push the limits a bit. At one point, we approached a red light and there was a truck stopped in the right-hand turn lane, with the back of it hanging out into our lane. Autopilot was active, and to be honest I was a little bit concerned that the Tesla might not stop.

ADRIAN STONE: Let’s see if it stops for the truck...I wasn’t happy with that so I took it out of Autopilot. So you see there, there was a truck in the right hand turn lane, but his back end was slightly sticking out in my lane? And the car didn’t seem to recognise that until the last minute. And I don’t want to play chicken with Autopilot to see if it actually would have stopped the car dramatically. So I pulled out of Autopilot and stopped it myself. But who knows? Would it have hit the back of the truck or wouldn’t it have? Would I have been another “Tesla hits a truck” article or not? So I’m always vigilant.

KRIS: Coming up after the break, we look at how Autopilot actually works: the technology behind it, why Tesla’s approach is so different from the rest of the industry, and we ask, just how far are we from fully autonomous driving?

[ AD BREAK ]

KRIS: The technology behind Autopilot is a combination of hardware - built into every Tesla vehicle - and advanced software. Autopilot is included on all Tesla models, but there is a paid upgrade you can buy to activate some of the advanced driving functionality - which Tesla calls ‘Full Self-Driving’ at checkout.

KRIS: The Society of Automotive Engineers classifies vehicle autonomy on a six-level scale, from Level 0 to Level 5. Level 0 means no automation - the driver performs all the driving tasks. Level 5 is classified as Full Automation, where the vehicle is capable of performing all driving functions under all conditions. And that’s the holy grail Tesla are trying to get to.

KRIS: But Autopilot in its current form is rated as Level 2 - which means the vehicle has some semi-autonomous functions, but the driver must pay full attention, monitoring the driving environment at all times. Semi-autonomous features offered by Autopilot include Traffic-Aware Cruise Control, the ability to automatically change lanes, and the ‘Autosteer’ function - this maintains the car’s position within the travelling lane, and is perhaps the most controversial feature in Tesla’s system.

KRIS: If your own car has things like emergency braking, blind spot detection, or even regular cruise control, these driver-assist features are classified as Level 1 - and a combination of them may qualify the vehicle as Level 2. Tesla’s system is slightly more advanced than most cars in these categories, but at the moment Autopilot remains a partial automation feature. And it’s not until Level 4 that a self-driving system under this classification no longer requires input from a human driver.
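
To pin the taxonomy down, here’s a minimal sketch of the SAE levels as they’re used above, with Autopilot’s current rating mapped onto it. The level names paraphrase SAE J3016; the code itself is purely illustrative.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels, paraphrased."""
    NO_AUTOMATION = 0           # driver performs all driving tasks
    DRIVER_ASSISTANCE = 1       # a single assist feature, e.g. cruise control
    PARTIAL_AUTOMATION = 2      # combined steering and speed control; driver must monitor
    CONDITIONAL_AUTOMATION = 3  # system drives in some conditions; driver takes over on request
    HIGH_AUTOMATION = 4         # no human input needed within the system's design domain
    FULL_AUTOMATION = 5         # all driving functions, under all conditions

# Autopilot today, as rated in the NTSB reports discussed above:
autopilot = SAELevel.PARTIAL_AUTOMATION
driver_must_monitor = autopilot < SAELevel.HIGH_AUTOMATION  # True
```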

KRIS: A fundamental requirement for autonomous driving is the ability to detect three dimensional objects in a vehicle’s environment. In order to deliver these capabilities, Autopilot has three main hardware components: Radar, Ultrasonic and Passive Visual. Let’s break them down.

KRIS: Firstly, Radar. You’re probably familiar with this one, and it’s simple enough - a system that uses radio waves to determine the range, angle or velocity of objects. In a Tesla, it has a range of 160 metres, predominantly in front of the car.

KRIS: Next is Ultrasonic. This system uses ultrasonic sound waves to determine the distance between the vehicle and surrounding objects. It has a much shorter range than Radar - a maximum of 8 metres - but covers the entire circumference of the vehicle.

KRIS: And lastly, Passive Visual. This is a detection system that uses passive, forward-facing cameras and complex algorithms to basically determine the car’s field of view. Optical cameras are the core component that Tesla is relying on for autonomous driving.
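
Taken together, the three systems trade range against coverage. Here’s a small illustrative summary using only the figures quoted above - the descriptions are paraphrases, not Tesla specifications.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sensor:
    name: str
    coverage: str
    max_range_m: Optional[float] = None  # None where no figure is quoted

# Ranges are the figures quoted in this episode; the rest is paraphrase.
AUTOPILOT_SENSORS = [
    Sensor("radar", "predominantly in front of the car", 160.0),
    Sensor("ultrasonic", "entire circumference of the vehicle", 8.0),
    Sensor("passive visual", "forward-facing cameras plus algorithms"),
]

for s in AUTOPILOT_SENSORS:
    rng = f"up to {s.max_range_m:.0f} m" if s.max_range_m else "range depends on optics and compute"
    print(f"{s.name}: {rng}; coverage: {s.coverage}")
```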

KRIS: Other companies that are working on autonomous cars, like Waymo, employ some combination of these three systems. But they also use something else, which Tesla does not. It’s called LIDAR - Light Detection And Ranging.

KRIS: Simply put, it’s a kind of light-beam sensor. It measures distance by illuminating a target with laser light. The LIDAR scans the surroundings and creates a three-dimensional depth map, which is then analysed by software to identify objects. And across the industry, LIDAR is considered crucial to achieving full Level 5 automation because it’s extremely accurate and reliable. But Elon Musk disagrees. Here he is again, speaking at Tesla’s 2019 Autonomy Day conference.
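
The ranging principle behind LIDAR is simple time-of-flight arithmetic: time the laser pulse’s round trip and halve it, since the light travels out and back. A one-function sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: distance is half the round trip."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A pulse returning after one microsecond implies a target ~150 m away.
print(lidar_distance_m(1e-6))  # ~149.9
```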

Elon Musk: LIDAR is a fool’s errand, and anyone relying on LIDAR is doomed. Expensive sensors that are unnecessary. I should point out that I don’t actually super hate LIDAR as much as it may sound. At SpaceX, SpaceX Dragon uses LIDAR to navigate to the space station and dock. Not only that, SpaceX developed its own LIDAR from scratch to do that, and I spearheaded that effort personally - because in that scenario, LIDAR makes sense - and in cars it’s friggin stupid. It’s expensive and unnecessary, and as [inaudible] was saying, once you solve vision it’s worthless.

KRIS: Musk is correct in saying that currently, LIDAR systems are extremely expensive. They would significantly increase the cost of production and the purchase price for Tesla customers. They’re also bulky and housed on top of the car, which creates unwanted drag. But there are companies working to make LIDAR cheaper and easier to integrate into the body of a vehicle. It’s inevitable that LIDAR will eventually become cheaper and more streamlined, but rather than wait for the technology to catch up, Tesla is relying solely on the camera based system of Autopilot. And as Dana Hull explains, it integrates seamlessly into their current vehicles, enabling them to refine the software in real-world environments.

DANA HULL: You know, every company has a slightly different approach. So while other competitors and manufacturers are gathering miles by doing test miles or doing simulation, Tesla's miles are largely from regular owners driving them.

DANA HULL: Tesla has taken the approach that rolling out the technology to customers is the best way for the AI and the algorithms to learn, and so they've made a big bet that ultimately more lives will be saved by getting this into the hands of consumers earlier rather than later, and that comes with enormous risks, and there have been fatal crashes as a result of it. But we don't hear about all of the lives that were saved because of Autopilot. We just hear about the fatality. So it's a really interesting time right now in terms of the whole race for self-driving, and Tesla is very much going against the industry conventional wisdom with their rollout of Autopilot.

KRIS: And the reason Tesla is able to take their unique approach is because there are very few regulations, especially in the United States. The technology is so new that authorities are still playing catch-up. And this is even true for companies that are working with LIDAR systems, under more controlled parameters.  

DANA HULL: It's really kind of like a regulatory wild west right now. There is no national regulation in place, so what you're seeing is that in certain municipalities, in cities and states, there are pilot projects underway where people can test it. For example, Waymo, which is the self-driving car company that spun out of Google, is piloting passenger drop-off and pick-up in the Phoenix area, but my understanding is that there's still a safety driver behind the wheel as they're unrolling that system. Then eventually that will go away. But we're still waiting for a regulatory framework to look at how this is all going to unfold here in the United States. And I'm imagining it will be very different, country to country.

KRIS: We reached out to a number of Australian governing bodies to find out how our own country plans to regulate the roll out of autonomous cars. But the responses we received were conflicting.

KRIS: In Australia, vehicle certification is governed by the federal Department of Infrastructure, Transport, Cities and Regional Development. The government has strict design rules which govern things like safety - making sure you have particular safety equipment like seatbelts and airbags installed, and that these will all work in an accident. Every manufacturer also undergoes a safety rating as part of the independent Australasian New Car Assessment Program, or ANCAP. Tesla Model 3s, for instance, have a 5-Star ANCAP safety rating in Australia - the highest rating - with an adult occupant protection rating of 96%. This is a huge win for Tesla, as safety standards for vehicles are broadly agreed upon at a national level.

KRIS: When it comes to autonomous functionality though - the legislation is a little unclear. Back in 2016 - each state or territory in Australia signed on to the National Policy Framework for Land Transport Technology. This is about creating a coordinated approach to autonomous vehicle policy and regulation… however as I’ll explain - this approach is anything but coordinated.

KRIS: As it stands, the onus is on the manufacturer to make sure their vehicles continue to meet the Australian Design Rules. And in the case of Tesla, they are constantly updating and enabling functionality through software updates, with the hope of achieving full autonomy at some point during 2020. The government doesn’t check these updates before they get pushed to consumers - meaning manufacturers could push updates - such as fully autonomous functionality - that our existing legislation is definitely not ready for.

KRIS: A statement from the Department of Infrastructure, Transport, Cities and Regional Development says that “A national approach is important to the safe and successful deployment of automated vehicles in Australia” and that the Australian Road Rules don’t currently permit the operation of highly autonomous vehicles. They also said it’s the “responsibility of state governments to investigate serious accidents of vehicles using Australian roads”.

KRIS: And this is where it gets confusing, because it’s clear that Australia doesn’t have a national approach to dealing with incidents involving autonomous functionality, if they ever arise. In the US, the NTSB investigates serious accidents at a national level and makes recommendations - that’s why we know so much about the crashes involving Tesla’s Autopilot. In Australia, however, the Australian Transport Safety Bureau has no responsibility for investigating road accidents. If you’re in a plane, train, or on a boat, and you crash, they will investigate and determine the cause of the crash, but the roads are a state problem.

KRIS: I reached out to the Victorian State Government Department of Transport for details on their autonomous vehicle pilot program, and to find out who was tasked with investigating autonomous vehicle accidents should they ever occur. The Department told me that the Police investigate all road accidents, and during the trial period companies running autonomous vehicle trials have to notify the department within 24 hours of an incident by filing a Serious Incident Report. However if there was a potential safety issue the Australian Competition & Consumer Commission, or ACCC, may investigate these incidents and issue a recall.

KRIS: So I reached out to the ACCC - who said it’s an issue for the federal Department of Infrastructure to administer voluntary recalls as well as investigate safety issues. They also told us that a small number of consumer reports have been filed with the ACCC relating to potential safety issues with Tesla vehicles. Those reports get forwarded to the Department of Infrastructure, because it’s not the ACCC’s job to investigate them. So we’ve filed a Freedom of Information request with the Department of Infrastructure to obtain those records - and we will inform you when we receive a response.

KRIS: When it comes to autonomous functionality, our current legislation in Australia falls well behind the pace being set by companies like Tesla. The Victorian Department of Transport told us that they don’t anticipate any autonomous vehicle legislation for public use being ready until at least 2021 - which is after Tesla’s desired goal for the rollout of a fully autonomous version of Autopilot. And that version of Autopilot will be based on cameras.

KRIS: Elon Musk is adamant that a camera-based system will win the race to full autonomy. And there is actually new research that suggests he may be right.

KILIAN WEINBERGER: I would think in a couple of years we’ll probably match the accuracy of LIDAR.

KRIS: This is Kilian Weinberger. He’s a Professor of Computer Science at Cornell University in New York. Kilian and his colleagues have used machine learning to increase the accuracy of passive cameras - similar to the ones used in Tesla vehicles - and he says that through this process, they’ve basically been able to mimic the behaviour of LIDAR.

KILIAN WEINBERGER: So, cameras essentially - the cameras are inherently very, very different than LIDAR. Because LIDAR is an active sensor, right? You’re sending out a laser beam, and I'm actively probing, right, “where is that object in front of me?”, right? A camera is a passive sensor, so I'm not sending anything out, I'm just receiving light that comes from the sun or something, right? So, there is a limitation to camera, right. So if there’s an object in front of you, a LIDAR will always see it, you know, super, super reliable. Cameras on the other hand, what we’re doing is essentially using stereo images - the same as humans do. So we have two eyes - that's essentially what it can basically do, it's called stereo vision. You have two cameras, on the very right side of the car, and on the very left side of the car. And then if you see an object in front of you, you match these two images. And something that's very, very far away will be exactly the same location in both cameras. Something that’s very close will have an offset. And so you measure that offset, and then you compute basically how far away that object is from you. It's not as reliable as LIDAR, but it's way, way, way cheaper. And one thing is, with machine learning we can make it increasingly accurate. So it used to be that people just matched pixels, that's very inaccurate, that doesn't work very well. But now we can train convolutional neural networks - a specific type of machine learning algorithm - to essentially take these two images, do all the matching within the neural network, and then output a depth map. So essentially we are mimicking LIDAR using these two cameras. And what we could show with this is that we could get reasonably close to the LIDAR accuracy.
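
The geometry Kilian describes reduces to one formula: depth = focal length × baseline / disparity, where disparity is the pixel offset of an object between the left and right images. Here’s a minimal sketch - the focal length and baseline are made-up numbers, not any real rig’s - and note that in the Cornell work the disparity map itself comes from a trained convolutional neural network rather than naive pixel matching.

```python
import numpy as np

FOCAL_LENGTH_PX = 1000.0  # assumed focal length, in pixels
BASELINE_M = 1.5          # assumed distance between the two cameras

def depth_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Convert per-pixel disparity (the left/right image offset) to depth.

    A distant object sits at nearly the same location in both images
    (disparity near zero); a close object has a large offset.
    """
    d = np.where(disparity_px > 0, disparity_px, np.nan)  # avoid division by zero
    return FOCAL_LENGTH_PX * BASELINE_M / d

# A 10-pixel offset maps to 150 m; a 100-pixel offset to 15 m.
print(depth_from_disparity(np.array([10.0, 100.0])))  # [150.  15.]
```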

KRIS: The team at Cornell were able to raise the detection accuracy for objects within a 30-metre range to 74%, up from the previous benchmark of 22% - showing that cameras have much greater potential than previously thought. At further distances, it’s more a question of the cameras’ resolution and the computing power of the system. But machine learning algorithms are key to analysing this data and improving software like Autopilot: the system becomes smarter through repetition and recognising patterns. This is the very reason why Tesla want this technology in the hands of as many drivers as possible. But there are also limitations to this strategy, because in a lot of cases, there’s a diminishing return on new data.

KILIAN WEINBERGER: So it’s not the case that if I double my data, now I’m now getting half the error rate. That’s not how it works.

KRIS: For example, if you always drive the same way to work and back every day, there isn’t a lot of new information for the system to process. The other problem with machine learning algorithms is that they don’t understand logic the way humans do. Kilian uses the example of a stop sign: you can train the algorithm to stop at a stop sign, but it wouldn’t necessarily be able to differentiate a real sign from a person wearing a t-shirt with a stop sign on it.

KRIS: Another major criticism of cameras in autonomous cars is their capacity to operate effectively in a variety of weather conditions. If a human can’t see what’s in front of them, how can a camera? But Kilian says that for most circumstances, this isn’t a huge problem - although extreme weather, like a snowstorm, does pose an issue. In that situation an autonomous car would likely be programmed to pull over and stop until the weather cleared. That’s how most human drivers would probably react anyway.

KILIAN WEINBERGER: I think that is an open problem, but it’s, I'm not so worried about it, because I think that is something like rain behaves kind of similar every single time. You know, that's something that machine learning algorithms can adapt to. What is more tricky, actually, is very rare cases. The biggest problem of autonomous driving is more, what if you see scenarios that you only see once in a lifetime or even less? So the question is, can we compensate the lack of understanding by just throwing enough data at it?

KRIS: In many cases, the answer is yes. But in rarer scenarios, the algorithm just won’t get enough opportunities, if any, to learn from those situations. So these systems may never be perfect. But the counter argument is that humans are actually more unpredictable. There were more than 37,000 motor vehicle fatalities in the United States alone in 2017. If autonomous cars can significantly reduce the death toll on our roads, isn’t that the better outcome?

KILIAN WEINBERGER: So the question is what’s the trade off? I think what people in the field hope is that algorithms will be so, so, so much safer than humans that it's out of the question. But there will always still be accidents.

KRIS: Kilian’s research paper also emphasised that relying on just one sensor is a huge safety risk. Therefore, autonomous cars should have a secondary sensor as a backup. The study concluded that improved optical cameras can fill this role effectively because they’re relatively cheap, and you can place several of them in a car. But the best solution, at least in an ideal world, is to pair cameras with LIDAR. Again, it all comes down to cost.

KILIAN WEINBERGER: I think one thing we show is that you can get very far with a camera. But if you have a LIDAR, right, if you can stick a $400 LIDAR on a car, absolutely do it, right? Because it’s an active sensor and there are certain things that occasionally you can’t see very well with a camera. That’s currently the way I would do it. I would say, use cameras as far as they get you, and then put a very cheap LIDAR on top of the car to correct the cameras where it can. Because you know the LIDAR never lies. So when you get some information from the LIDAR, you can basically use that to correct the cameras really effectively.

KRIS: The deciding factor in the full self-driving race will likely be which technology not only improves the fastest but also becomes the cheapest to manufacture. But the development of this tech will almost certainly clash with state and federal regulations many times before you’re able to just jump in your car and tell it where to go. So what about those robo-taxis?

KILIAN WEINBERGER: I would say that these robo-taxis without geofencing seems very ambitious. Now it could be that they introduce geofencing, right, you know, because it has GPS. So they could say maybe allow this robo-taxi mode but only in parts of, for example, the United States, or Australia, or whatever, where we know it operates very fail-safe or something. I think some compromise would have to be made like that. And of course, the way the product is announced, they don’t go over all these details, right, and I could imagine that with geofencing. Will they have driverless robo-taxis anytime soon? I’m more skeptical.
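
Geofencing of the sort Kilian speculates about is conceptually simple: check the car’s GPS fix against an approved operating region before enabling a driverless mode. A toy sketch - the bounding box is invented, and a real deployment would use precise polygons rather than a rectangle:

```python
# A toy geofence check: enable a hypothetical robo-taxi mode only when the
# GPS fix falls inside an approved region. The box below is invented.
APPROVED_REGION = {
    "lat_min": 33.2, "lat_max": 33.7,      # rough box around a hypothetical
    "lon_min": -112.3, "lon_max": -111.6,  # service area
}

def robo_taxi_allowed(lat: float, lon: float) -> bool:
    r = APPROVED_REGION
    return r["lat_min"] <= lat <= r["lat_max"] and r["lon_min"] <= lon <= r["lon_max"]

print(robo_taxi_allowed(33.45, -112.07))  # inside the box -> True
```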

DANA HULL: Making a car that is fully self-driving is incredibly hard. Roboticists and machine learning specialists and AI researchers and engineers have been working on this for well over a decade, and we are still far away from having fully autonomous cars on the road for people to use.

KRIS: We should also mention that Tesla isn't the only company linked to crashes involving self-driving cars. Uber’s Advanced Technologies Group is one of the companies testing autonomous vehicles on public roads, with a safety operator in the driver’s seat. The vehicles are Volvo SUVs that have been modified with an Automated Driving System. In March of 2018, one of these cars struck and killed a pedestrian in Arizona, and it was found that the safety operator was using a phone at the time of the collision.

KRIS: The NTSB’s investigation determined that a number of factors contributed to the probable cause, including the vehicle operator’s failure to monitor the driving environment, Uber’s inadequate safety risk assessment procedures, and a lack of adequate mechanisms for addressing complacency in vehicle operators.

KRIS: We live in a world where technology is taking over and automation is rapidly changing industries. And there’s an inevitability to that progress that seems impossible to fight. As for our self-driving future, Tesla have clearly accepted the risks in conducting an active experiment on public roads with customers as voluntary test drivers - and everyone else as unwilling participants. Yes, we need to improve road safety and autonomous cars could play a vital role in reducing the death toll. But how much are we willing to tolerate in the process?

KRIS: Next time on Supercharged: he’s been compared to Iron Man’s Tony Stark, but how did Elon Musk become the person he is today? And how has his personality influenced the culture of Tesla?

[Montage of Episode 5 & CREDITS]

ENDS