Scientists Work on Better ‘Vision’ for Driverless Vehicles
MIT researchers develop a system that doesn’t rely on light.

Autonomous vehicles that rely on light-based image sensors often struggle to see through blinding conditions, such as fog. But MIT researchers have now developed a system that could help steer driverless cars when traditional methods fail, MIT News reports.

The new system uses sub-terahertz wavelengths, which sit between microwave and infrared radiation on the electromagnetic spectrum. These waves pass through fog and dust clouds with ease, whereas the infrared-based LiDAR imaging systems used in autonomous vehicles struggle in such conditions.

But implementing sub-terahertz sensors in driverless cars is challenging. Sensitive, accurate object recognition requires a strong output signal from the sensor. Traditional systems that can produce such a signal are large and expensive, while smaller ones exist but produce signals too weak for reliable imaging.

In a paper published online on Feb. 8 by the IEEE Journal of Solid-State Circuits, the researchers describe a chip that is orders of magnitude more sensitive, meaning it can better capture and interpret sub-terahertz wavelengths.

“A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones,” says co-author Ruonan Han, an associate professor of electrical engineering and computer science, and director of the Terahertz Integrated Electronics Group in the MIT Microsystems Technology Laboratories (MTL).