New Holographic Camera Can See Around Corners – Or Inside Your Skull

Say cheese! Image credit: Cesar M. Romero/Shutterstock.com

It sounds like something out of Star Trek: the doctor aims a camera at your chest, and a computer generates a hologram of your heart and blood vessels. She enlarges the image and takes a look at some of your smallest capillaries, each beautifully rendered in sub-millimeter detail.

But thanks to a team at Northwestern’s McCormick School of Engineering, that may soon be a reality. They’ve created a prototype technology capable of seeing around corners and through everything from fog to the human skull. Their results are published in the journal Nature Communications.

“It’s like we can plant a virtual computational camera on every remote surface to see the world from the surface’s perspective,” explained Florian Willomitzer, first author of the study. “This technique turns walls into mirrors.”

This is the area of science known as non-line-of-sight (NLoS) imaging, and, in the era of self-driving cars and cutting-edge medical breakthroughs, it’s big news. NLoS systems work – in extremely simplified terms – using a sort of visual sonar: they send out a pulse of light and measure how long it takes, and how much it has changed, by the time it gets back.
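
At its simplest, that "visual sonar" idea is just a round-trip calculation. Here is a minimal sketch, assuming a single clean echo and ignoring every real-world complication:

```python
# Minimal sketch of the "visual sonar" idea: turn a light pulse's
# round-trip time into a distance. Assumes one clean echo.

SPEED_OF_LIGHT = 3.0e8  # metres per second, approximate

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, given the pulse's round-trip time."""
    # The pulse travels out and back, so halve the total path.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# An echo arriving ~6.7 nanoseconds after the pulse left puts the
# reflecting surface roughly a metre away.
print(f"{echo_distance(6.7e-9):.2f} m")
```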

“If you can capture the entire light field of an object in a hologram, then you can reconstruct the object’s three-dimensional shape in its entirety,” Willomitzer explained. “We do this holographic imaging around a corner or through scatterers — with synthetic waves instead of normal light waves.”

The technology setup for both seeing around corners and seeing through scattering media. Credit: Willomitzer et al., Nature Communications, 2021

The technique uses what the researchers call a “synthetic” light wave, created by merging two lasers with different wavelengths. This wave hits the object of interest and is scattered in a way that would normally leave the object invisible to us – whether because it’s around a corner, behind a wall of fog, or inside our body. From an engineering perspective, Willomitzer explained, it’s all basically the same problem.

“If you have ever tried to shine a flashlight through your hand, then you have experienced this phenomenon,” Willomitzer said. “You see a bright spot on the other side of your hand, but, theoretically, there should be a shadow cast by your bones, revealing the bones’ structure. Instead, the light that passes the bones gets scattered within the tissue in all directions, completely blurring out the shadow image.”

It’s far from the first attempt by researchers to develop NLoS techniques, but existing approaches have always run into a few obstacles: low-resolution imaging, long processing times, and awkward size restrictions – they tend to need either very large areas to work in or to offer only extremely limited fields of view. On top of that, using just one light wavelength comes with its own problems: after all, light, famously, is extremely fast.

“Nothing is faster than the speed of light, so if you want to measure light’s time of travel with high precision, then you need extremely fast detectors,” Willomitzer said. “Such detectors can be terribly expensive.”
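
A quick worked example shows the scale of the problem (the numbers below are illustrative, not taken from the paper): light covers roughly 0.3 millimetres per picosecond, so the finer the depth detail you want from time of flight alone, the faster the detector has to be.

```python
# Worked example (illustrative numbers, not from the paper): how precisely
# must a detector time a returning pulse to resolve a given depth detail?

SPEED_OF_LIGHT = 3.0e8  # metres per second, approximate

def timing_precision_needed(depth_resolution_m: float) -> float:
    """Round-trip timing precision (seconds) needed to tell apart two
    surfaces separated by depth_resolution_m."""
    # The extra path for the deeper surface is twice the depth difference.
    return 2 * depth_resolution_m / SPEED_OF_LIGHT

for depth_mm in (10.0, 1.0, 0.1):
    dt = timing_precision_needed(depth_mm * 1e-3)
    print(f"{depth_mm:>5.1f} mm depth resolution -> {dt * 1e12:.1f} ps timing")
# Sub-millimetre detail demands sub-picosecond timing, which is why
# single-wavelength time-of-flight schemes lean on very fast, costly detectors.
```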

But using two different wavelengths instead of one allows the prototype to work without ultrafast light sources or detectors – and it also delivers a fast, high-resolution image with a wide field of view.
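
To get a feel for the trick, the standard two-wavelength relation is worth seeing with numbers. A minimal sketch, using illustrative wavelengths rather than the values from the study:

```python
# Two closely spaced laser wavelengths beat together to form a much longer
# "synthetic" wave. The wavelengths below are illustrative assumptions,
# not the values used in the Nature Communications study.

lambda_1 = 854.0e-9  # wavelength of laser 1, in metres (assumed)
lambda_2 = 854.4e-9  # wavelength of laser 2, in metres (assumed)

# Standard two-wavelength interferometry relation: the closer the two
# wavelengths, the longer the synthetic wavelength.
synthetic_wavelength = (lambda_1 * lambda_2) / abs(lambda_2 - lambda_1)

print(f"Lasers: {lambda_1 * 1e9:.1f} nm and {lambda_2 * 1e9:.1f} nm")
print(f"Synthetic wavelength: {synthetic_wavelength * 1e3:.2f} mm")
# Roughly 1.8 mm here – thousands of times longer than the light itself.
```

In rough terms, a wave that long holds onto a usable phase even after bouncing off a rough wall or passing through tissue, and its millimetre-scale structure can be measured without picosecond-class detectors – the intuition behind Willomitzer’s line about turning walls into mirrors.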

“It gets better,” Willomitzer added, “as the technique also can work at night and in foggy weather conditions.”

Although Willomitzer is clear that there’s “still a long way to go” before this technology turns up in everyday life, he’s sure that “it will come.” And while its applications for driving and medical imaging are clear, he says the technology’s potential is far wider-reaching than we might think.

“Our technology will usher in a new wave of imaging capabilities,” he said. “Our current sensor prototypes use visible or infrared light, but the principle is universal and could be extended to other wavelengths. For example, the same method could be applied to radio waves for space exploration or underwater acoustic imaging.”

“It can be applied to many areas, and we have only scratched the surface,” he added.
