About LIDAR


Tom explores the history, capabilities, and contemporary uses of LIDAR.

Featuring Tom Merritt.

MP3

Please SUBSCRIBE HERE.

A special thanks to all our supporters–without you, none of this would be possible.

Thanks to Kevin MacLeod of Incompetech.com for the theme music.

Thanks to Garrett Weinzierl for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email at [email protected]

Episode Script
I heard cars need Lidar but then Tesla doesn’t use it?
And now Lidar is in the iPhone?
Is my iPhone going to drive my car?
Are you confused?
Don’t be.
Let’s help you Know A Little More about Lidar.

As best we can tell, the term Lidar originated as a simple combination of the words light and radar. It has been backronymed to mean “light detection and ranging” or “laser imaging detection and ranging”.
But whatever you think it stands for, Lidar is essentially a system for measuring distance, and therefore depth, based on bouncing laser light off your surroundings and measuring how long the round trip takes.
The speed of light in a vacuum is fixed, and even inside the atmosphere or water it’s pretty consistent. So the longer light takes to bounce back, the farther away whatever it bounced off of is. And since the light travels out and back, you divide the round-trip time in half. If you know the medium it’s traveling through, you can calculate distance and depth.
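Here’s a minimal sketch of that arithmetic in Python (the 66.7-nanosecond return time is an invented example, not a real measurement):

```python
# Time-of-flight ranging: how far away is the thing the pulse bounced off?
SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in a vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the target in meters.

    The pulse travels out AND back, so we halve the round trip.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that comes back after about 66.7 nanoseconds
# bounced off something roughly 10 meters away.
print(distance_from_round_trip(66.7e-9))  # ~10.0
```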
Lidar systems all use an emitter to send out the laser light and a sensor to detect when it returns. They differ in what kinds of lasers and sensors they use, how many, and how they position them, but every Lidar system basically rests on these principles.
Whether you’re using one laser and changing its position rapidly to get multiple measurements, or a bunch of lasers at once with multiple sensors, you can use the data you collect from a Lidar system to map your surroundings.
It’s metaphor time.
Imagine you have a few big rocks in your backyard. And yes, those of you without a backyard, also imagine you have a backyard. If you’re British, imagine you have a garden.
Let’s also imagine, because why not, that you have robot eyes that let you see how long it takes light to reflect off things. Now you take your flashlight into the backyard and you turn it on and off again. Your robot eyes tell you how long it took between you turning on the flashlight and you seeing something in the reflected light. That’s the time it took for the light to bounce off the thing in your yard. Turn your flashlight on and off a bunch and you can map where the boulders are.
You may rightly ask why you didn’t just leave the flashlight on and look, to which I remind you, you replaced your eyes with special robot eyes.
But despite the confusing if hilarious metaphor, I hope you get the sense that Lidar doesn’t see; it measures a bunch of points, and when you map all those points you get an approximation of the solid objects around you. And if you’re moving, you can use Lidar to “see” what’s around you.
Lidar is used in loads of sciences. You can think of the obvious uses it has for geology, forestry, geography, even archaeology. It’s also used in atmospheric physics and seismology and more. And of course most people hear about it with autonomous cars and more recently for augmented reality.
But let’s step back to the very beginning.
The concept of Lidar derives from work by E.H. Synge, who in 1930 imagined using searchlights to probe the atmosphere. See? My garden flashlight robot eye metaphor doesn’t seem QUITE as outlandish now.
The first actual Lidar system was introduced by the Hughes Aircraft Company in 1961 for satellite tracking. It was called “COherent Light Detecting And Ranging” or Colidar, a play on Radar.
In 1962, MIT scientists measured the distance between the Earth and the Moon using a reflected laser beam. Just bounce it off the moon! Depth sensed.
Back to Hughes: its Colidar Mark II was put into practical use in 1963 for military targeting.
But what about Lidar, not Colidar or the moon bouncer?
The Oxford English Dictionary suggests the first use of just Lidar happened in 1963, in an article in New Astronomy about lasers: “Eventually the laser may provide an extremely sensitive detector of particular wavelengths from distant objects. Meanwhile, it is being used to study the moon by ‘lidar’ (light radar).”
In fact Lidar was used on Apollo 15 to measure surface height and topography on the Moon.
It also got its first regular widespread use by the National Center for Atmospheric Research to measure clouds and pollution. For decades Lidar was associated with weather research.
The lasers in Lidar systems can use various types of light, from the ultraviolet through the visible spectrum into the near infrared. They can operate at wavelengths from about 250 nanometers to 10 micrometers, depending on the target objects. The longer the wavelength, the longer the range. It works on rocks, rain, chemical compounds, aerosols, clouds, and even single molecules.
Light tends not to reflect straight back at you off most objects, like it does in a mirror. Instead it backscatters and different types of scattering can be used for different applications too.
Let’s talk hardware!
The lasers are often semiconductor diode lasers, similar to what you’d see in a laser printer or CD player. Kids, ask your parents.
Autonomous cars use near infrared; underwater Lidar uses green lasers with shorter wavelengths, which travel through water better.
And the iPad’s Lidar uses VCSELs, or vertical-cavity surface-emitting lasers. That’s a fancy way of saying the laser light emits from the top of the wafer instead of the edge.
VCSELs used to be unsuitable for Lidar. They were used in mice and optical networking, and only recently could they be made powerful enough for Lidar. That’s cool, because VCSELs are cheaper to make: since the light comes from the top, not the edge, you don’t have to cut wafers. That means you can put a lot more VCSELs, up to thousands, on a silicon chip, which brings down the cost per unit.
OK that’s the hardware for sending the light. How about the detector? That’s the thing that does all the work measuring after all. Those laser photons are just joyriding.
There are two main kinds of detection schemes: coherent and incoherent.
Coherent detection is more sensitive, measuring Doppler shifts or changes in the phase of the reflected light. It’s more complex but uses less power.
Incoherent detection directly measures the amplitude change of the light which is less sensitive, but simpler and uses more power.
Both types of detection use pulses of light. Older systems use high-energy pulses, which need a lot of power and require eye protection. These are often used in atmospheric research, on things like clouds, wind, and humidity.
Micropulse is newer, made possible by better computing power and improved laser technology. It uses intermittent bursts of energy, usually less than a microjoule, meaning it uses less power and can be eye-safe.
And the detectors themselves?
Detectors are often photoelectric cells made of silicon or gallium arsenide. Short-range systems use silicon photodiodes. Longer-range systems with longer-wavelength lasers use avalanche photodiodes, or APDs. APDs can detect lower light levels and can be built into a chip to create a multi-pixel photon counter (MPPC). The iPad uses single-photon avalanche diodes, or SPADs, from Sony.
Ok so how do you make all this hardware work?
There are too many ways to operate Lidar for us to cover them all. But in general they all send out millions of pulses per second. Most steer the light in a pattern that can quickly cover the available area. In many systems the laser itself is fixed, and mirrors based on MEMS (Micro-Electro-Mechanical Systems) do the moving; others swing the whole assembly on a gimbal.
The gimbal-based versions are often found in advanced driver-assistance systems and autonomous cars. If you’ve seen a spinning can-looking thing on top of a car, that’s an older, larger Lidar system. The cheaper, lower-power fixed versions are found in mobile devices.
Other, simpler systems that don’t need as long a range use more, but less powerful, lasers in fixed positions to cover the available area.
Either way, the result is a point cloud of detected returning light. Lidar images look similar to infrared images: they don’t return natural colors, but they do show detailed depth information, down to the centimeter.
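To give a rough sense of how those returns become a point cloud, here’s a short Python sketch that turns each pulse’s firing direction and round-trip time into an (x, y, z) point. The three-pulse scan line and its numbers are invented for illustration; a real system does this millions of times per second with calibrated optics:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def pulse_to_point(azimuth_deg: float, elevation_deg: float,
                   round_trip_seconds: float) -> tuple:
    """Convert one pulse's direction and return time into an (x, y, z) point.

    The mirror or gimbal angles give the direction; the time of flight
    gives the range. This is plain spherical-to-Cartesian conversion.
    """
    r = SPEED_OF_LIGHT * round_trip_seconds / 2.0  # range in meters
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A made-up three-pulse sweep, level with the sensor, everything ~10 m away.
point_cloud = [pulse_to_point(az, 0.0, 66.7e-9) for az in (-10.0, 0.0, 10.0)]
for point in point_cloud:
    print(point)
```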
So what can’t Lidar do?
Well, it can’t read most road signs (though it can sometimes use their reflective coating to figure them out), and it can’t make out anything printed on a flat surface; it just sees the flat surface. And as you might have guessed from its usefulness in atmospheric surveys, it doesn’t see through fog and clouds. It sees the fog and clouds, but not what’s behind them, at least not very well.
Lidar is really good at showing depth maps but it shines when combined with other data. 2D visual systems can show what the sign actually says. GPS can tell you where you actually are and turn the 3D images into a 3D map. Accelerometers can tell you how long before you hit that thing the Lidar sensor sees.
Cheaper materials are making Lidar more practical in more places. That means it may be used in more robots and cars, and it’s definitely showing up in phones.
I hope this helps you understand a little more about how those autonomous and augmented reality devices can see the world around them.
In other words, I hope now you know a little more about Lidar.