RADAR and LiDAR are both wave-based technologies that detect, track, and image the environment. Although these two technologies serve similar purposes, they are different in how they work. These differences then make them appropriate for different scenarios, where you would favor one over the other.
Both of these technologies transmit waves and receive the reflected waves. They then measure how long the reflected wave took to return, calculate the distance from that delay, and finally build an image of the environment. But where RADAR uses radio waves, LiDAR uses light waves. Let’s see how this difference further distinguishes these two.
What Is RADAR?
The idea of RADAR, or Radio Detection and Ranging, was introduced in 1935 and refined over the following decades into the RADAR we know today. A RADAR device consists of a transmitter, an antenna, and a receiver.
The transmitter creates radio waves, which are amplified and sent out through the antenna. These waves travel into the environment and bounce back from the objects they hit.
The receiver then takes in the reflected waves. Because radio waves travel at a constant speed, the RADAR can calculate how far away an object is from the time it took the transmitted wave to bounce back to the receiver.
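The round-trip timing described above is simple to express in code. Here’s a minimal sketch (function name is illustrative): the wave travels to the object and back, so the one-way distance is half of speed times round-trip time.

```python
SPEED_OF_LIGHT = 299_792_458  # m/s; radio waves travel at this constant speed

def distance_from_echo(round_trip_seconds: float) -> float:
    """One-way distance to the reflecting object, in meters."""
    # The measured delay covers the trip out AND back, hence the division by 2.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# An echo that returns after 2 microseconds:
print(distance_from_echo(2e-6))  # ≈ 299.79 m
```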
Radio waves can have wavelengths from about 3 millimeters to thousands of meters. A longer wavelength means a lower frequency, and vice versa. RADARs that use high-frequency, short-wavelength radio waves have a shorter detection range but yield a much clearer image.
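The wavelength–frequency relationship mentioned above is just f = c / λ. This small sketch makes the inverse relationship concrete for the extremes quoted in the text:

```python
SPEED_OF_LIGHT = 299_792_458  # m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an electromagnetic wave from its wavelength: f = c / lambda."""
    return SPEED_OF_LIGHT / wavelength_m

print(frequency_hz(0.003) / 1e9)  # 3 mm wave  -> ≈ 99.93 GHz (high frequency)
print(frequency_hz(1000.0))       # 1 km wave  -> ≈ 299,792 Hz (low frequency)
```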
RADARs are classified by the wavelength of their radio waves. There are seven general bands of RADARs.
| Radar Band | Frequency (GHz) | Wavelength (cm) |
|---|---|---|
| L | 1–2 | 15–30 |
| S | 2–4 | 7.5–15 |
| C | 4–8 | 3.75–7.5 |
| X | 8–12 | 2.5–3.75 |
| Ku | 12–18 | 1.67–2.5 |
| K | 18–27 | 1.11–1.67 |
| Ka | 27–40 | 0.75–1.11 |
Even though radio waves can have wavelengths well above 100 centimeters, they are not used in RADARs as they don’t provide adequate precision and accuracy in imaging.
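As a quick illustration of how the band table works in practice, here’s a hedged sketch (names are mine, the band limits follow the standard IEEE letter designations shown above) that maps an operating frequency to its band:

```python
# Seven general radar bands as (name, lower GHz, upper GHz),
# matching the IEEE letter-band designations.
RADAR_BANDS = [
    ("L", 1, 2), ("S", 2, 4), ("C", 4, 8), ("X", 8, 12),
    ("Ku", 12, 18), ("K", 18, 27), ("Ka", 27, 40),
]

def band_for(frequency_ghz: float) -> str:
    """Return the letter band an operating frequency falls into."""
    for name, low, high in RADAR_BANDS:
        if low <= frequency_ghz < high:
            return name
    return "outside the seven general bands"

print(band_for(9.4))   # X — a frequency commonly used by marine radars
print(band_for(24.0))  # K — typical of automotive short-range radars
```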
RADARs are used in various applications, for example, in ships and airplanes to navigate in poor weather conditions, in cars as parking sensors, and by astronomers to detect changes in the atmosphere.
What Is LiDAR?
LiDAR, or Light Detection and Ranging, was invented a couple of decades after RADAR. Rather than radio waves, LiDAR uses light waves to detect and track surrounding objects.
A LiDAR device comes with a transmitter and a receiver. The transmitter emits pulses of light, usually laser light, which reflect off objects and return to the receiver.
The time it takes for a light wave to return to the LiDAR device tells it how far away the reflecting object is. By firing light pulses in every direction, a LiDAR device can quickly form a full image of its surroundings.
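Turning per-direction range measurements into that "full image" is a matter of converting each (direction, distance) pair into a 3D point. Here’s a minimal sketch of that conversion (the function name and angle convention are my own, not from any particular LiDAR API):

```python
import math

def to_cartesian(azimuth_rad: float, elevation_rad: float, distance_m: float):
    """Convert one range return (pointing angles + distance) into an (x, y, z) point.

    Collecting many such points across all firing directions yields a point cloud,
    which is the raw form of a LiDAR's 3D image of its surroundings.
    """
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# A return from straight ahead (zero azimuth and elevation) at 10 m:
print(to_cartesian(0.0, 0.0, 10.0))  # (10.0, 0.0, 0.0)
```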
Light waves have a very short wavelength; the waves used in LiDARs are typically in the near-infrared, around 900 to 1,000 nanometers in length. Here’s an idea of how small a nanometer is: if you split a meter-long stick into a billion equal parts and pick one up, that one piece would be a nanometer in length.
Due to their high accuracy, LiDARs can give detailed 3D images of the environment. This makes LiDARs desirable for various uses, such as creating 3D maps of forests and ecosystems, or even topographic maps of other planets.
LiDARs are also used in autonomous vehicles, as their superior accuracy allows self-driving cars to better understand what’s in front of them.
RADAR vs. LiDAR
RADAR and LiDAR are both wave-based detection and ranging technologies. The two are nearly identical in principle, except that RADAR uses radio waves, whereas LiDAR uses light waves. However, the different properties of those waves suit the two technologies to different applications. Let’s see how they compare.
Resolution and Clarity
RADARs come in different bands, each using a specific range of radio wavelengths, and this is what sets one RADAR apart from another. As mentioned before, a wave with a higher frequency and shorter wavelength yields clearer images. For this very reason, millimeter-band RADARs have the highest clarity and resolution.
LiDARs create much clearer images than RADARs. Even a millimeter-band RADAR’s resolution is drastically lower than that of a LiDAR, because the shortest radio waves are still immensely longer than light waves.
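To put a rough number on "immensely longer", we can compare the shortest radar wavelength quoted earlier (3 mm) with a typical near-infrared LiDAR wavelength (~950 nm). The exact ratio depends on the hardware, so treat this as an order-of-magnitude illustration:

```python
radar_mm_wavelength = 3e-3   # 3 mm: the short end of radar wavelengths, in meters
lidar_wavelength = 950e-9    # ~950 nm: a typical near-infrared LiDAR laser, in meters

ratio = radar_mm_wavelength / lidar_wavelength
print(ratio)  # ≈ 3158 — even the shortest radar wave is over 3,000x longer
```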
Range and Weather Performance
LiDARs send and receive light waves to judge how far away surrounding objects are. The potential issue with this method is that many things can distort the way light travels, the most notorious being poor weather. LiDARs can lose significant accuracy in conditions like rain or fog.
RADARs, on the other hand, use radio waves with much longer wavelengths, which suffer far less attenuation. They lose little energy as they travel and can pass through moist air over long distances with hardly any drop in performance. For the same reason, RADARs also have a longer detection range than LiDARs.
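Attenuation is usually modeled as an exponential loss over distance (the Beer–Lambert law, often expressed in dB per kilometer). The coefficients below are made-up, purely illustrative values, not measured data; the point is only how strongly the surviving signal depends on the attenuation coefficient:

```python
def surviving_fraction(alpha_db_per_km: float, range_km: float) -> float:
    """Fraction of signal power left after one-way travel.

    Beer-Lambert attenuation expressed in decibels:
    a loss of alpha dB/km over range_km kilometers.
    """
    return 10 ** (-alpha_db_per_km * range_km / 10)

# Illustrative (hypothetical) coefficients for 1 km of travel in bad weather:
print(surviving_fraction(0.1, 1.0))   # low-attenuation radio wave: ≈ 0.977
print(surviving_fraction(30.0, 1.0))  # heavily attenuated light:    0.001
```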
Price and Maintenance
LiDARs are much pricier than RADARs, as they use a newer and more complicated technology. LiDARs use light in the form of lasers to gather information on their surroundings, and shooting lasers requires advanced equipment.
On the other hand, RADARs have been around for nearly a century, and engineers have found ways to make them at a lower price. You could buy a millimeter-band RADAR for your car for as little as 20 dollars. RADARs are often solid-state devices, meaning they have no moving parts, which makes the chances of them needing repairs minuscule.
RADAR or LiDAR?
There’s no clear winner here, as both RADAR and LiDAR have their fair share of upsides and downsides. LiDARs offer superior clarity but are prone to fail in bad weather and don’t have a long range.
RADARs come in different bands, but even the highest-resolution RADARs fall short of LiDARs in image clarity. To compensate, however, RADARs offer a longer range and keep working in poor weather conditions.
It all boils down to your application and, of course, your budget, as LiDARs are much more expensive than RADARs.
Looking for a new smartphone? Want the best features? Then you might want to consider a smartphone with LiDAR.