This new imaging technology breaks the rules of optics
Scientists have unveiled a new way to capture ultra-sharp optical images without lenses or painstaking alignment. The approach uses multiple sensors to collect raw light patterns independently, then synchronizes them afterward in software. This sidesteps long-standing physical limits that have held optical imaging back for decades. The result is wide-field imaging with sub-micron resolution at working distances that were previously out of reach.
Imaging tools have dramatically reshaped how scientists study the world, from charting faraway galaxies with radio telescope networks to revealing intricate structures inside living cells. Even with decades of progress, one major obstacle has remained. At optical wavelengths, it has been extremely difficult to capture images that are both highly detailed and cover a wide area without relying on bulky lenses or ultra-precise physical alignment.
A newly published study in Nature Communications offers a possible way forward. The work was led by Guoan Zheng, a biomedical engineering professor and director of the UConn Center for Biomedical and Bioengineering Innovation (CBBI), along with his research team at the University of Connecticut College of Engineering. Their findings introduce a new imaging approach that could reshape how optical systems are designed and used across science, medicine, and industry.
Why Synthetic Aperture Imaging Falls Short in Optics
"At the heart of this breakthrough is a longstanding technical problem," said Zheng. "Synthetic aperture imaging -- the method that allowed the Event Horizon Telescope to image a black hole -- works by coherently combining measurements from multiple separated sensors to simulate a much larger imaging aperture."
This strategy has been highly successful in radio astronomy because radio waves are long, spanning millimeters to meters, which makes it feasible to precisely synchronize signals collected by widely spaced sensors. Visible light, however, has wavelengths of only a few hundred nanometers. Coherently combining optical measurements requires holding sensor positions and path lengths stable to a small fraction of a wavelength, a level of physical precision that is extraordinarily difficult, if not impossible, to achieve with conventional methods.
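To get a feel for the numbers, here is a rough back-of-the-envelope comparison in Python. The λ/10 stability criterion used below is an illustrative rule of thumb from interferometry, not a figure from the study:

```python
# Rough comparison of the alignment stability needed for coherent
# aperture synthesis at radio versus optical wavelengths.
# The lambda/10 criterion is an illustrative rule of thumb only.
WAVELENGTHS_M = {
    "EHT radio (1.3 mm)": 1.3e-3,
    "green visible light (550 nm)": 550e-9,
}
STABILITY_FRACTION = 0.1  # assume paths must stay stable to lambda/10

for name, wavelength in WAVELENGTHS_M.items():
    tolerance_nm = wavelength * STABILITY_FRACTION * 1e9
    print(f"{name}: hold optical paths stable to ~{tolerance_nm:,.0f} nm")

# EHT radio (1.3 mm): hold optical paths stable to ~130,000 nm
# green visible light (550 nm): hold optical paths stable to ~55 nm
```

A tenth of a millimeter of mechanical tolerance is routine engineering; tens of nanometers across widely separated sensors is not.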
MASI and a Software-First Approach to Synchronization
The Multiscale Aperture Synthesis Imager (MASI) takes a fundamentally different approach to this challenge. Instead of demanding that optical sensors remain in exact physical alignment, MASI allows each sensor to collect light independently. Advanced computational algorithms are then used to synchronize the data after the measurements are complete.
Zheng compares the idea to a group of photographers capturing the same scene. Rather than taking traditional pictures, each photographer records raw information about how light waves behave. Software then combines these separate measurements into a single, extremely high-resolution image.
By handling phase synchronization computationally, MASI avoids the rigid interferometric setups that have long limited the practicality of optical synthetic aperture systems.
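MASI's actual reconstruction algorithms are far more sophisticated than anything a short snippet can capture, but the toy Python sketch below illustrates the core idea: an unknown phase offset between two sensors can be estimated from the data itself and removed in software, rather than being held fixed by hardware. All names and values here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex light field seen by two sensors over a shared region.
overlap = rng.standard_normal(256) + 1j * rng.standard_normal(256)

# Each sensor picks up its own unknown global phase offset; in a
# classical interferometer this offset must be fixed by hardware.
true_offset = 1.234
field_a = overlap
field_b = overlap * np.exp(1j * true_offset)

# Software synchronization: estimate the relative phase directly
# from the data by correlating the two measurements, then remove it.
estimated_offset = np.angle(np.vdot(field_a, field_b))
field_b_aligned = field_b * np.exp(-1j * estimated_offset)

print(f"estimated offset: {estimated_offset:.3f} rad (true: {true_offset} rad)")
print(f"residual mismatch: {np.max(np.abs(field_b_aligned - field_a)):.2e}")
```

Once the offset is removed, the two measurements can be combined coherently, exactly as if the sensors had been physically locked together.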
How Lens-Free Imaging Works in MASI
MASI departs from traditional optical imaging in two major ways. First, it eliminates lenses altogether. Instead of focusing light through glass, the system uses an array of coded sensors placed at different locations within a diffraction plane. Each sensor records diffraction patterns, which describe how light waves spread after interacting with an object. These patterns contain both amplitude and phase information that can later be recovered using computational techniques.
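The study's coded-sensor recovery pipeline is well beyond a short example, but the minimal angular-spectrum sketch below, with purely illustrative parameters, shows why capturing both amplitude and phase makes a lens unnecessary: once the complex field at the sensor is known, focusing becomes a numerical back-propagation step.

```python
import numpy as np

# Minimal angular-spectrum sketch: if a sensor captures the complex
# field (amplitude AND phase) of a diffraction pattern, an image can
# be formed by numerically propagating the field back to the object,
# with no lens involved. Parameters are illustrative, not from the paper.
wavelength = 550e-9  # meters, green light
pixel = 1e-6         # sensor pixel pitch, meters
n = 1024
z = 1e-3             # object-to-sensor distance, meters

# Toy 1-D object: an opaque screen with two clear slits.
obj = np.zeros(n, dtype=complex)
obj[480:500] = 1.0
obj[540:560] = 1.0

# Angular-spectrum transfer function for free-space propagation.
fx = np.fft.fftfreq(n, d=pixel)
kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1 / wavelength**2 - fx**2))
H = np.exp(1j * kz * z)

# Forward model: the diffraction pattern a lens-free sensor would see.
field_at_sensor = np.fft.ifft(np.fft.fft(obj) * H)

# Inverse: back-propagate in software to recover a focused image.
recovered = np.fft.ifft(np.fft.fft(field_at_sensor) * np.conj(H))

print("peak error after digital refocus:", np.max(np.abs(recovered - obj)))
# ~1e-15 (machine precision): with full amplitude and phase data,
# refocusing is purely computational.
```

The hard part in practice is that detectors record only intensity; the phase must first be recovered computationally, which is where MASI's coded sensors come in.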