Arguably the biggest tech buzzword of the past couple of years has been self-driving, and the way the market is ramping up, that trend isn’t going to subside for quite some time – if ever. From researchers across electrical and mechanical engineering, computer science, and beyond, to nearly every organization even vaguely associated with vehicle manufacturing, the race is on to produce true Level 5 autonomy (no human intervention required) and the resulting variety of autonomous robotic systems.
One thing that every autonomous vehicle needs is sensors, and one of the most popular sensors is LIDAR (Light Detection And Ranging) – especially 3D LIDAR. Compared to other sensors used for autonomy, such as RADAR and cameras, LIDAR offers higher data resolution or lower computational cost, depending on the unit. For self-driving vehicles, a combination of two or more sensor types often achieves the most reliable localization and detection suite.
With the explosion in development of self-driving platforms, demand for 3D LIDAR has grown drastically, and now researchers and organizations are facing the dilemma of choosing a sensor. There are the industry incumbents with years of experience offering units we’re familiar with, but as the market has grown we now have new players offering alternatives, putting a new spin on things (pun intended).
A selection of options
Let’s take a look at a few of the base-units for some sensors that are available:
| | Velodyne VLP-16 | Carnegie Robotics Multisense SL | Quanergy M8-1 | ASI Forecast | Ocular Robotics RobotEye RE05 |
|---|---|---|---|---|---|
| Type | 3D | Spinning 2D | 3D | Spinning 2D | Spinning 2D |
| Scan Rate (Hz) | 5–20 | 40 | 5–30 | 0.5 | 15 |
| Horizontal Field of View | 360° | 270° | 360° | 270° | 360° |
In order to get 3D images, there are really two types of systems available: true 3D scanners with multiple channels that produce a 3D image on every cycle, and systems that use a spinning 2D laser, continuously changing the scan plane to mimic a 3D system.
Time to scan
One of the major differences between these two types of systems is the time required to scan a specific area. We’ve phrased it that way, rather than just “scan rate”, because not all scans are created equal. For example, taking the low-end rate of a VLP-16 (5 Hz) and the ASI Forecast’s standard rate (0.5 Hz), it would take the Forecast 10 seconds to complete the same number of cycles as the VLP-16 completes in 1 second. The thing to note, however, is that in each of those cycles, the VLP-16 gathers 16 channels of data versus the Forecast’s one.
So, while you might be able to get the same (and maybe even denser or more accurate) data from a single-channel scanner, it’s going to take one or perhaps two orders of magnitude more time to collect it.
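To make the comparison concrete, here is a quick back-of-the-envelope sketch of scan throughput using the rates from the table above. Counting “channel sweeps per second” is our own illustrative metric, not a manufacturer specification:

```python
# Rough throughput comparison between a multi-channel 3D unit and a
# single-channel spinning 2D unit, using rates from the table above.

def lines_per_second(scan_rate_hz: float, channels: int) -> float:
    """Channel sweeps of data produced per second of scanning."""
    return scan_rate_hz * channels

vlp16 = lines_per_second(scan_rate_hz=5.0, channels=16)    # VLP-16 at its low-end rate
forecast = lines_per_second(scan_rate_hz=0.5, channels=1)  # ASI Forecast, one channel

print(f"VLP-16:   {vlp16} sweeps/s")      # 80.0
print(f"Forecast: {forecast} sweeps/s")   # 0.5
print(f"Ratio:    {vlp16 / forecast}x")   # 160.0 -> one to two orders of magnitude
```

Even at its slowest setting, the 16-channel unit collects on the order of 160 times the scan lines per second, which is exactly the “one or perhaps two orders of magnitude” gap described above.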
When a long time to scan is paired with a short sensing range, the result is a domino effect: while moving, the extra time needed to gather data either limits the platform’s speed, or causes data loss as the unit passes, or approaches too quickly, an area it needs to scan. Just like in your own car, having headlights that are too dim means you either need to drive slower to have time to react to obstacles, or drive at regular speed and risk an accident.
So if your application prevents you from moving slowly or stopping intermittently to collect data, a sensor with a low scan rate might simply not be technically feasible.
Field of view
The other main difference is the field of view (FOV). While the Velodyne, Quanergy, and Ocular Robotics units have a 360° horizontal FOV, the other two units listed here achieve only 270°, meaning there’s at least a 90° blind spot behind the sensor.
For scanners mounted on the front or sides of a platform, that 90° blind spot might not be a big deal, but for a top-mounted sensor, you’ll likely need an additional sensor to achieve full coverage.
While we can’t comment on specific prices (you’ll have to contact these manufacturers directly), we know that these units range from roughly 8,000 USD to several times that. And while you might be most inclined to simply go for the lowest-cost LIDAR, your project or research time requirements might force you to look elsewhere. We’ve seen a large influx of requests for 3D LIDAR, and unless manufacturing increases drastically, it’s only a matter of time before demand eclipses supply.
At CES 2017, there were more autonomous street vehicles, personal robots, and drones than ever, and as the markets for these platforms explode over the next couple of years (especially in early development), LIDAR – and 3D LIDAR in particular – will continue to be a primary sensor.
As each LIDAR manufacturer carves out its niche in the autonomous vehicle sensing market, it will be up to developers, researchers, and engineers to decide whether time, data density, accuracy, or available supply matters most.