Researchers from Rice University and the University of Maryland have achieved a significant breakthrough in optical imaging: a technology called NeuWS, short for “neural wavefront shaping,” that could transform camera systems. NeuWS enables cameras to capture full-motion video even in challenging conditions such as fog, smoke, driving rain, and murky water, and even through light-scattering media like skin and bone that would otherwise obscure objects.
Imaging through scattering media has long been considered the “holy grail problem” of optical imaging: scattered light limits spatial resolution and defeats effective imaging in many scenarios. By building on the core technique of neural wavefront shaping, the researchers have devised a way to mitigate the effects of scattering and push the boundaries of what cameras can capture.
The potential applications of this breakthrough are vast. Autonomous vehicles, for instance, struggle to image in bad weather precisely because fog, rain, and spray scatter light. Overcoming that challenge could substantially improve their performance and safety.
Microscopy, particularly in the study of deep tissues in living organisms, has faced similar hurdles due to light scattering. With the ability to image deep tissue in vivo, researchers in biology and medicine can gain valuable insights into complex biological processes.
Underwater photographers have likewise long been limited to imaging subjects in close proximity, because water scatters light. Addressing this problem opens up opportunities for capturing clearer, more detailed images at greater distances.
The researchers, including Ashok Veeraraghavan and Christopher Metzler, have made significant progress in developing NeuWS, which has the potential to overcome scattering in a range of challenging scenarios. Veeraraghavan expressed optimism about the practicality of their approach, although substantial work remains before prototypes can be built for specific application domains.
NeuWS is based on the understanding that light waves possess magnitude (energy) and phase (oscillation state) properties. Measuring the phase is crucial for combating scattering, but directly measuring it is impractical due to the high frequency of optical light. Instead, the researchers measure incoming light as “wavefronts” that contain both phase and intensity information. Through backend processing, they rapidly decode the phase information from hundreds of wavefront measurements per second.
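This magnitude/phase picture can be made concrete with a short numpy sketch (an illustration, not the researchers' code): a camera sensor records only intensity, so the phase term cancels out and cannot be read off directly.

```python
import numpy as np

# A monochromatic wavefront can be modeled as a complex field:
# magnitude encodes energy, phase encodes the oscillation state.
rng = np.random.default_rng(0)
magnitude = rng.uniform(0.5, 1.0, size=(4, 4))
phase = rng.uniform(-np.pi, np.pi, size=(4, 4))
field = magnitude * np.exp(1j * phase)

# A sensor records only intensity, |field|^2; the phase factor
# cancels, which is why phase cannot be measured directly.
intensity = np.abs(field) ** 2
print(np.allclose(intensity, magnitude ** 2))  # True: phase information is gone
```

The intensity image is identical no matter what the phase was, which is exactly the information scattering scrambles and NeuWS must recover.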
The main technical challenge lies in swiftly measuring phase information. Rather than directly measuring the oscillation state, the researchers measure its correlation with known wavefronts. By interfering a known wavefront with an unknown wavefront and measuring the resulting interference pattern, they establish the correlation between the two wavefronts.
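The correlation-by-interference idea can be sketched with textbook four-step phase-shifting interferometry, in which a known reference wavefront is stepped through four known phase offsets and only intensities are recorded. This is a simplified stand-in for the measurement principle, not the exact NeuWS scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
# Unknown complex wavefront, e.g. distorted by a scattering medium.
u = (rng.uniform(0.5, 1.0, (8, 8))
     * np.exp(1j * rng.uniform(-np.pi, np.pi, (8, 8))))

# Interfere u with a known flat reference wavefront, stepping the
# reference phase through four offsets and recording only intensities.
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
frames = [np.abs(u + np.exp(1j * t)) ** 2 for t in shifts]

# Four-step recovery: differencing the frames isolates the correlation
# with the reference, i.e. the real and imaginary parts of u.
u_est = ((frames[0] - frames[2]) + 1j * (frames[1] - frames[3])) / 4
print(np.allclose(u_est, u))  # True: complex field recovered from intensities
```

The key point is that four intensity-only frames, each taken against a known reference, suffice to recover the full complex field including its phase.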
Veeraraghavan explains the correlation-based approach with the analogy of observing the North Star through clouds at night: knowing what the North Star should look like, and seeing how it appears blurred, one can infer how everything else is blurred. The goal of NeuWS is not only to undo the effects of scattering, but to do so quickly enough that the scattering medium itself does not change during the measurement.
The researchers demonstrated the capability of NeuWS by capturing video of moving objects obscured by scattering media using state-of-the-art spatial light modulators. In experiments involving a spinning microscope slide with printed images, NeuWS corrected for light scattering caused by various media such as onion skin, nail polish-coated slides, chicken breast tissue slices, and light-diffusing films. The technology successfully restored clear video of the spinning figures.
To achieve this, the researchers developed algorithms that continuously estimate both the scattering and the scene. They employ a neural network with 16,000 parameters that takes the phases altered by modulated light and rapidly computes the correlations needed to recover the wavefront's original phase information. The neural-network formulation yields algorithms that require fewer measurements, which means faster capture times and makes video, rather than still frames, possible.
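As a loose illustration of this idea, the following sketch trains a tiny one-hidden-layer network (about 2,000 parameters here, far smaller than the 16,000 of the actual system) to map intensity measurements taken through known random modulation patterns back to scene phase. Everything in it, including the measurement model, network size, and training setup, is an assumption for illustration, not the published architecture.

```python
import numpy as np

rng = np.random.default_rng(2)
P, K, H = 8, 16, 64  # phase points, measurements per scene, hidden units

# Known random modulation patterns (hypothetical stand-ins for SLM patterns).
patterns = rng.normal(size=(K, P)) + 1j * rng.normal(size=(K, P))

def measure(phase):
    """Intensities of a phase-only field projected onto each known pattern."""
    return np.abs(patterns @ np.exp(1j * phase)) ** 2

# Synthetic training set: random scene phases and their modulated measurements.
phases = rng.uniform(-np.pi, np.pi, size=(256, P))
X = np.stack([measure(p) for p in phases])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # normalize inputs
T = np.concatenate([np.cos(phases), np.sin(phases)], axis=1)  # wrap-free target

# One-hidden-layer network trained by plain gradient descent on MSE.
W1 = rng.normal(0, 0.1, (K, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, 2 * P)); b2 = np.zeros(2 * P)

losses, lr = [], 0.05
for step in range(500):
    h = np.tanh(X @ W1 + b1)      # forward pass
    y = h @ W2 + b2
    err = y - T
    losses.append(np.mean(err ** 2))
    dy = 2 * err / err.size       # backward pass
    dz = (dy @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ dy; b2 -= lr * dy.sum(axis=0)
    W1 -= lr * X.T @ dz; b1 -= lr * dz.sum(axis=0)

print(losses[0] > losses[-1])  # True: the net learns to decode phase from intensities
```

Predicting cosine and sine of the phase, rather than the phase itself, sidesteps the 2π wrap-around; the broader point is that once phase recovery is posed as a learned mapping, fewer raw measurements are needed per frame.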
In conclusion, the development of NeuWS holds promise in addressing the challenge of imaging through scattering media. The technology’s ability to correct for light scattering in real time has been demonstrated, although further research and development are needed to advance its application in various domains.
Source: Rice University