During Tesla’s first Autonomy Day in April, CEO Elon Musk raised some eyebrows when he criticized LiDAR and said the technology would soon be obsolete.
As part of a discussion of the merits of Tesla’s autonomous vehicle (AV) technology, Musk derided LiDAR as a "fool’s errand," citing its expense and arguing the sensors cannot determine what the AV should stop for and what it should not. "Anyone who relies on LiDAR is doomed," Musk continued.
Instead, Musk said, AVs should use visual recognition cameras, trained on data gathered in the real world from the scenarios that human drivers must react to. Andrej Karpathy, senior director of artificial intelligence (AI) at Tesla, echoed that criticism, saying LiDAR has trouble distinguishing between a plastic bag and a rubber tire. "It sidesteps the fundamental problem, the important problem of visual recognition, that is necessary for autonomy," Karpathy said at Autonomy Day.
The debate has been ongoing in the AV space, but many say LiDAR still has a big role to play as self-driving cars grow in use.
In defense of LiDAR
Unsurprisingly, AV companies that currently rely on LiDAR are big proponents of it and say it is a crucial tool for safety. The technology fires rapid pulses of laser light at a surface, and a sensor measures how long each pulse takes to bounce back. Those return times are converted into distances and assembled into digital 3D images of the surroundings. High-powered LiDAR can see up to 200 meters ahead.
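To make that time-of-flight principle concrete, here is a minimal, illustrative Python sketch (not Velodyne's or any other vendor's actual implementation, and using made-up example numbers) showing how a single pulse's round-trip time and beam angles could be turned into one 3D point; repeating this for many returns per second is what builds up the point-cloud "image" described above.

```python
# Illustrative sketch only: converting a LiDAR time-of-flight return
# into a 3D point. Not based on any specific vendor's hardware.
import math

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second


def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the surface: the pulse travels out and back,
    so halve the round-trip time before multiplying by light speed."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0


def point_from_return(round_trip_seconds: float,
                      azimuth_deg: float,
                      elevation_deg: float) -> tuple[float, float, float]:
    """Convert one laser return (round-trip time plus beam angles)
    into an (x, y, z) point; a full scan of such points forms the 3D image."""
    r = range_from_round_trip(round_trip_seconds)
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)


# Example: a pulse returning after roughly 1.33 microseconds corresponds
# to a surface about 200 meters away -- the upper range quoted above.
print(range_from_round_trip(1.334e-6))                                  # ~200 m
print(point_from_return(1.334e-6, azimuth_deg=30.0, elevation_deg=2.0))
```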
Velodyne Lidar, a LiDAR technology company based in Silicon Valley that has worked with 25 AV programs, took issue with Musk’s claim that the technology is going to be superseded. Company president Marta Hall called on Tesla to publish testing results and comparisons to back up his claims, which she said fit a broader pattern in which Musk has "brought us innovation, but not quality when it comes to safety."
"Not until we see published test results and comparisons that a Tesla can 'see just fine,' without a LiDAR, can claims be trusted," Hall told the Silicon Valley Business Journal in an interview. "Without LiDAR, the system is missing crucial 3D vision with 3D data points for collision avoidance and advanced autonomous navigation."
And AV companies are standing by their continued emphasis on LiDAR as a key tool. Google-backed Waymo has leaned heavily on LiDAR and has been developing its own sensors since 2011, when it started designing three types of LiDAR for use on its AVs. "Our custom LiDARs have been instrumental in making Waymo the first company in the world to put fully self-driving cars on public roads," Simon Verghese, head of the company’s LiDAR team, wrote in a Medium post.
Waymo made moves to expand the LiDAR market and bring down the cost of equipment earlier this year by selling its custom sensors to other companies, starting with its Laser Bear Honeycomb short-range perimeter sensor. The product has a 95-degree vertical field of view, a 360-degree horizontal field of view and can see objects directly in front of it, and Verghese said it is a "best-in-class perimeter sensor."
And LiDAR technology continues to evolve. Late last year, Aeva, a startup founded by two former Apple engineers, unveiled an advanced sensor said to be a "generational leap" that will "play a big part in enabling autonomous vehicles to go mainstream."
In a Medium post, Aeva co-founder Mina Rezk said the system would be free from interference and offer "a point density that approaches camera-like resolutions," among other innovations. "All of these differentiators provide a sensing system of unprecedented performance, giving vehicles equipped with our technology the complete insights they need to see, understand, and navigate their environment in a drastically safer way," Rezk continued.
Alternatives
In a crowded technology marketplace, Musk is right that there are alternatives to LiDAR for helping AVs navigate and detect things in the world around them. Those alternatives are driven in part by what some see as a deficiency in LiDAR to understand different scenarios and obstacles that may require a different reaction from the AV.
"It turns out that even with LiDAR, what’s missing today is not seeing more accurately or seeing further, what’s missing is understanding what’s happening to those vehicles around you and being able to predict what the motion is so you can react accordingly and be safe," Anthony Levandowski, the former Uber and Waymo engineer, said in an interview earlier this year.
Tesla relies on cameras mounted on its cars, alongside a front-facing radar and ultrasonic sensors. Musk said having those cameras collect data on real-world scenarios is better than the simulations run by companies like Waymo, as simulators do "not capture the long tail of weird things that happen in the real world."
He said that method collects data, sends it to the cloud and uses it to enhance the AVs' built-in knowledge of all the things that could happen on the street, and that it also has more redundancies built in. "The general principle here is that any part of this could fail, and the car will keep driving," Musk said.
One other alternative floated by some companies is thermal far infrared (FIR), which uses thermal cameras to build images for the AV and is most useful in situations where LiDAR might struggle, like inclement weather, quick changes in lighting and very densely trafficked areas. Similar technology has been used by the military for decades, including for autonomous low-level, high-speed flight in airplanes and on missile seekers.
"The beauty of our sensor is on the one hand, it's a passive technology," Yakov Shaharabani, CEO of thermal FIR company AdaSky, told Smart Cities Dive. "Like daytime cameras, it's very passive, we don't need to use any active beams or something like that, so that's why it is a very cheap solution on the one hand. And on the other hand, it is on a different wavelength than others, so it provides additional information in scenarios and use cases where others are being challenged."
In cities, thermal imaging could be crucial for detecting other vehicles, bicyclists and pedestrians, especially in low light. An Uber AV being tested on the dark streets of Tempe, AZ, hit and killed a pedestrian last year, and while thermal imaging might not have helped avoid that collision, it could have mitigated the worst of its impacts.
"We are driving in very dense environments where you have buses, trucks, cars and motorcycles, bikes and pedestrians. In some cases you will see over 100 pedestrians," Shaharabani said. "[We] can, in real-time, simultaneously, detect all those different objects in those dense environments and provide this information to the car and to the decision-making process."
Given that, Shaharabani said thermal FIR technology will be implemented as part of a suite of sensors, as "there are some use cases where there is no other technology that can introduce a safe solution like FIR." In particular, he said, it could enhance pedestrian safety and help reduce the climbing pedestrian fatality rate, which is at its highest since 1990, according to the Governors Highway Safety Association (GHSA). "With our sensor, because it is based on thermal imaging, basically human beings are shining," Shaharabani said.
Sensing the future
Despite the alternatives to LiDAR floating around the AV landscape, some experts say that as self-driving cars mature, the technology will remain a key part of the conversation, used in concert with radar, sonar and other imaging techniques. This contradicts Musk, who said at Autonomy Day that automakers "are all going to dump LiDAR."
Ramesh Raskar, an associate professor at the Massachusetts Institute of Technology’s Media Lab, said that combining many different technologies helps build in redundancies for safety, making crashes and other incidents less likely. He also said that while the human eye can be limited in its abilities as a sensor for driving, it is tough to recreate, so the technologies need to work together to try. "It's going to be multimodal," he told Smart Cities Dive.
Experts said another key part of the technological evolution will be ensuring that the software and data processing are up to snuff and able to manage the wealth of data collected on the streets. During a speech at the Smart Cities Connect conference in Denver last month, Moran David, U.S. and Canada general manager for Israeli AV software company Mobileye, said that "data is the building blocks, paving the road for autonomous vehicles."
"Not until we see published test results and comparisons that a Tesla can ‘see just fine,’ without a LiDAR, can claims be trusted."
Marta Hall
President, Velodyne Lidar
As AVs connect to the infrastructure around them, David said that data collection could help city planners see where cars are more likely to exceed speed limits; where there are hot spots for collisions and near misses with pedestrians; how many people wait at a bus stop; and even how they behave in construction zones.
Raskar agreed with Musk's belief that LiDAR use will decline, but suggested it's because it will not be the only sensor. "The way car companies are using it today with very high-resolution LiDARs, multiple of them in fact, one in the front, one in the back, two on the side and one on top doing 360, that's complete overkill," Raskar said.
Some in the venture capital arena have misgivings about investing in LiDAR companies, in part because so many have raised a lot of money, but also because the technology is gradually getting cheaper as it becomes more widely produced. Chad Bailey, vice president at investment firm NGP Capital, told Smart Cities Dive that he thinks LiDAR is going to become an easily available commodity. And while the sensor technology needs to evolve, he said the software needs to keep pace.
"Seeing is one thing, but the interpretation and the actual software behind the sensors is also really important," Bailey said. "I think that's probably more of a limiting factor than producing a sensor that can give you a very distinct point cloud."