Autonomous vehicles that rely on light-based image sensors usually find it challenging to see through blinding conditions like fog. However, researchers at MIT have created a sub-terahertz-radiation receiving system that could assist in steering driverless cars when conventional methods fail.
MIT researchers have developed a chip that leverages sub-terahertz wavelengths for object recognition, which could be combined with light-based image sensors to help steer driverless cars through fog. (Image credit: Courtesy of the researchers)
On the electromagnetic spectrum, sub-terahertz wavelengths fall between microwave and infrared radiation and pass easily through dust clouds and fog—conditions that defeat the infrared-based LiDAR imaging systems employed in autonomous vehicles. A sub-terahertz imaging system detects objects by sending an initial signal through a transmitter; a receiver then measures the absorption and reflection of the rebounding sub-terahertz wavelengths. A signal is sent to a processor, which recreates an image of the objects.
However, it is difficult to incorporate sub-terahertz sensors into driverless cars. Sensitive, accurate object recognition requires a strong output baseband signal to be sent from the receiver to the processor. Conventional systems built with discrete components that can generate such signals are large and costly, while smaller, on-chip sensor arrays produce only weak signals.
In a paper published online in the IEEE Journal of Solid-State Circuits on February 8th, 2019, the researchers describe a two-dimensional, sub-terahertz receiving array on a chip that is orders of magnitude more sensitive, meaning it can better capture and interpret sub-terahertz wavelengths in the presence of a lot of signal noise.
They achieved this by implementing a scheme of independent signal-mixing pixels—known as “heterodyne detectors”—that are usually very difficult to densely integrate into chips. The researchers dramatically shrank the heterodyne detectors so that many of them can fit onto a single chip. The trick was to create a compact, multipurpose component that can simultaneously down-mix input signals, synchronize the pixel array, and produce strong output baseband signals.
The researchers developed a prototype with a 32-pixel array integrated on a 1.2-mm² device. The pixels are roughly 4,300 times more sensitive than the pixels in today’s best on-chip sub-terahertz array sensors. With further development, the chip could potentially be used in driverless cars and autonomous robots.
A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones. Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.
Ruonan Han, Associate Professor, Department of Electrical Engineering and Computer Science, MIT.
Han is also the study co-author and director of the Terahertz Integrated Electronics Group in the MIT Microsystems Technology Laboratories (MTL). Collaborating with Han on the paper are first author Zhi Hu and co-author Cheng Wang, both PhD students in the Department of Electrical Engineering and Computer Science, working in Han’s research team.
The central aspect of the design is what the researchers call “decentralization.” In this design, a single pixel—known as a “heterodyne” pixel—produces both the frequency beat (the difference in frequency between two incoming sub-terahertz signals) and the “local oscillation,” an electrical signal that shifts the frequency of an incoming signal. This “down-mixing” process produces a signal in the megahertz range that can be easily interpreted by a baseband processor.
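The down-mixing step above can be sketched numerically: multiplying two tones produces sum and difference frequencies, and the low-frequency difference (the beat) is what the baseband processor reads. The frequencies below are kHz-scale stand-ins chosen so the simulation is tractable; the chip itself operates at sub-terahertz frequencies.

```python
import numpy as np

# Stand-in frequencies (assumed for illustration, not from the paper):
fs = 1_000_000                    # sample rate, Hz
f_rf, f_lo = 100_000, 99_000      # "incoming signal" and "local oscillation"
t = np.arange(0, 0.01, 1 / fs)    # 10 ms of samples

# Mixing = multiplication; the product contains the sum (199 kHz)
# and difference (1 kHz) frequencies.
mixed = np.cos(2 * np.pi * f_rf * t) * np.cos(2 * np.pi * f_lo * t)

# Keep only the low side of the spectrum, as a low-pass filter would,
# and locate the beat.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(mixed.size, 1 / fs)
low = freqs < 50_000
beat = freqs[low][np.argmax(spectrum[low])]
print(beat)  # 1000.0 Hz — the difference frequency
```

The same arithmetic at sub-terahertz inputs yields the megahertz-range output the article describes.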
The output signal can be used to compute the distance of objects, analogous to the way LiDAR computes the time taken by a laser to hit an object and rebound. Furthermore, combining the output signals of a pixel array and steering the pixels in a certain direction can produce high-resolution images of a scene. This enables not just the detection but also the identification of objects, which is crucial for autonomous vehicles and robots.
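The LiDAR-style ranging arithmetic mentioned above is simple to state: the signal's round trip covers twice the distance to the object, so the path length is halved. A minimal sketch (the 200 ns round trip is an illustrative number):

```python
# Round-trip time-of-flight to distance, as in LiDAR ranging.
C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_seconds: float) -> float:
    # The signal travels out and back, so divide the path by two.
    return C * round_trip_seconds / 2.0

print(distance_m(200e-9))  # a 200 ns round trip places the object ~30 m away
```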
Heterodyne pixel arrays function only if the local oscillation signals from all pixels are synchronized, meaning a signal-synchronizing method is needed. Centralized designs include a single hub that shares local oscillation signals with all pixels.
These designs are generally employed by receivers of lower frequencies, and they can cause problems at sub-terahertz frequency bands, where generating a high-power signal from a single hub is notoriously difficult. As the array scales up, the power shared by each pixel decreases, reducing the output baseband signal strength, which depends heavily on the power of the local oscillation signal. Consequently, the signal produced by each pixel could be very weak, resulting in low sensitivity. Although this design has begun appearing in some on-chip sensors, it is limited to eight pixels.
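The scaling problem described above is just power division: a single hub's output is split among all pixels, so per-pixel local oscillation power (and with it the baseband output strength) shrinks as the array grows. A quick illustration, with an assumed 1 mW hub output that is not a figure from the paper:

```python
# Centralized distribution splits one hub's power across the whole array.
hub_power_mw = 1.0  # assumed hub output power, for illustration only

for n_pixels in (8, 32, 128):
    per_pixel_mw = hub_power_mw / n_pixels
    print(f"{n_pixels:4d} pixels -> {per_pixel_mw:.5f} mW per pixel")
# Quadrupling the array cuts each pixel's share to a quarter.
```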
The researchers’ decentralized design overcomes this scale–sensitivity trade-off. Each pixel generates its own local oscillation signal, used for receiving and down-mixing the incoming signal, and an integrated coupler synchronizes each pixel’s local oscillation signal with that of its neighbor. This gives each pixel more output power, since the local oscillation signal does not flow from a global hub.
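Neighbor-to-neighbor coupling can bring a whole array into lockstep without any central hub. The toy model below is a generic coupled-oscillator sketch, not the chip's actual circuit: each of 32 simulated pixels repeatedly nudges its oscillation phase toward its immediate neighbors', and the array converges to a common phase.

```python
import numpy as np

# 32 pixels in a chain, each starting at a random phase (fixed seed).
rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, 32)

for _ in range(8000):
    nudge = np.zeros_like(phases)
    nudge[1:] += np.sin(phases[:-1] - phases[1:])    # pull toward left neighbor
    nudge[:-1] += np.sin(phases[1:] - phases[:-1])   # pull toward right neighbor
    phases += 0.25 * nudge

# Coherence r = |mean of unit phasors|: ~0 for random phases, 1 when locked.
r = abs(np.mean(np.exp(1j * phases)))
print(round(r, 3))
```

Local coupling alone fixes a common phase, which is why no high-power global distribution network is needed; the absolute frequency is then pinned by the phase-locked loop discussed later in the article.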
According to Han, an irrigation system is a good analogy for the new decentralized design. In a conventional irrigation system, a pump directs a powerful stream of water through a pipeline network that distributes water to many sprinkler sites. The water sputtered out by each sprinkler is considerably weaker than the initial flow from the pump, and a separate control system would be required to make all the sprinklers pulse at precisely the same rate.
The researchers’ design, by contrast, gives each site its own water pump, eliminating the need for connecting pipelines, and gives each sprinkler its own powerful water output. Moreover, each sprinkler communicates with its neighbor to synchronize their pulse rates. “With our design, there’s essentially no boundary for scalability,” stated Han. “You can have as many sites as you want, and each site still pumps out the same amount of water … and all pumps pulse together.”
However, the new architecture could make the footprint of each pixel considerably larger, posing a great challenge to large-scale, high-density integration in an array. In their architecture, the researchers combined the functions of four conventionally separate components—oscillator, antenna, coupler, and downmixer—into a single “multitasking” component given to each pixel. This enables a decentralized design of 32 pixels.
We designed a multifunctional component for a [decentralized] design on a chip and combine a few discrete structures to shrink the size of each pixel. Even though each pixel performs complicated operations, it keeps its compactness, so we can still have a large-scale dense array.
Zhi Hu, Department of Electrical Engineering and Computer Science, MIT.
Guided by Frequencies
For the system to gauge an object’s distance, the frequency of the local oscillation signal must be stable.
For this purpose, the researchers incorporated a component called a phase-locked loop into their chip. It locks the sub-terahertz frequency of all 32 local oscillation signals to a stable, low-frequency reference. Because the pixels are coupled, their local oscillation signals all share an identical, high-stability frequency and phase. This ensures that meaningful information can be extracted from the output baseband signals. The complete architecture reduces signal loss and improves control.
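Conceptually, a phase-locked loop ties a high-frequency oscillation to an integer multiple of a stable low-frequency reference, so the output inherits the reference's stability. The numbers below are assumed for illustration and are not the chip's actual frequency plan:

```python
# Idealized PLL frequency relation: f_out = N * f_ref once the loop is locked.
f_ref = 100e6      # stable low-frequency reference, Hz (assumed value)
n_divide = 2400    # feedback divider ratio (assumed value)

f_lo = n_divide * f_ref
print(f_lo / 1e9, "GHz")  # 240.0 GHz

# Any drift in the reference scales by the same N, so a very stable
# reference yields a very stable sub-terahertz local oscillation.
```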
In summary, we achieve a coherent array, at the same time with very high local oscillation power for each pixel, so each pixel achieves high sensitivity.
Zhi Hu, Department of Electrical Engineering and Computer Science, MIT.