What if distance could be measured without lasers or timing? A camera-based optical method hints at a simpler way to track objects and sense environments.

An optical method first described in the early 1900s is now being tested as an alternative to LiDAR for sensing and measurement. Researchers at the Georgia Tech Research Institute are using the Scheimpflug technique to build camera-based rangefinding systems that can measure distance, track objects, and monitor atmospheric effects. The approach works over short to medium distances and can run on its own or with laser-based systems.
Unlike time-of-flight (ToF) LiDAR, which calculates distance by measuring how long a laser pulse takes to return, this method relies on camera geometry. By tilting the image plane relative to the lens, the plane of sharp focus can be aligned with the measurement path, so distance along that path maps directly to position on the sensor without any need to track signal travel time. This reduces the need for precise timing electronics and laser hardware. It also lets the system capture spatial information that can be fused with other data sources.
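As a rough illustration of that mapping (not the researchers' actual implementation), the Python sketch below converts a pixel position on a tilted sensor into a range estimate using a simple triangulation-style relation; the focal length, baseline, and pixel pitch are hypothetical placeholders for a real calibration.

```python
# Minimal sketch of geometry-based ranging with a tilted (Scheimpflug) camera.
# Assumption: each pixel position along the tilt axis stays in focus on one
# point of the measured path, so range can be read directly from where light
# lands on the sensor. The triangulation-style relation and every constant
# below are hypothetical placeholders for a real calibration.

def pixel_to_range(pixel_offset: float,
                   focal_length_m: float = 0.05,   # 50 mm lens (assumed)
                   baseline_m: float = 0.20,       # sensor-to-path offset (assumed)
                   pixel_pitch_m: float = 10e-6    # 10 µm pixels (assumed)
                   ) -> float:
    """Estimate range from the pixel offset relative to the 'infinity' pixel.

    Uses the small-angle triangulation relation  range ~ f * B / (d * pitch):
    nearby points land far from the infinity pixel, distant points crowd
    toward it, so range resolution degrades with distance.
    """
    if pixel_offset <= 0:
        return float("inf")  # at or beyond the infinity pixel
    return (focal_length_m * baseline_m) / (pixel_offset * pixel_pitch_m)


if __name__ == "__main__":
    for d in (1, 5, 20, 100, 500):
        print(f"pixel offset {d:4d} -> {pixel_to_range(d):8.1f} m")
```

With these placeholder numbers, an offset of one pixel corresponds to roughly a kilometer while 500 pixels correspond to about two meters, reflecting how far ranges get compressed onto relatively few pixels.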
The system uses event-based cameras, which report only per-pixel changes in brightness rather than recording full frames. This enables microsecond-scale response and supports range-finding algorithms that isolate signals arriving from specific distances. As a result, the design avoids many of the hardware requirements of ToF systems, including fast detectors and synchronized laser pulses.
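A loose sketch of how such an algorithm might isolate returns from a chosen distance band is shown below; the event tuple format, the events_in_band helper, and the toy calibration are assumptions made for illustration, not the actual GTRI processing pipeline.

```python
# Hypothetical sketch: keep only event-camera events whose pixel position maps
# to a chosen range band. Each event is assumed to be (x, y, timestamp_us,
# polarity); the pixel-to-range calibration is supplied as a function, for
# example the pixel_to_range() sketched earlier.

from typing import Callable, Iterable, Iterator, Tuple

Event = Tuple[int, int, int, int]              # (x, y, timestamp in µs, polarity)
RangedEvent = Tuple[int, int, int, int, float]


def events_in_band(events: Iterable[Event],
                   pixel_to_range: Callable[[int], float],
                   r_min_m: float,
                   r_max_m: float) -> Iterator[RangedEvent]:
    """Yield only the events whose x coordinate maps into [r_min_m, r_max_m]."""
    for x, y, t_us, polarity in events:
        r = pixel_to_range(x)
        if r_min_m <= r <= r_max_m:
            yield (x, y, t_us, polarity, r)


if __name__ == "__main__":
    # Toy calibration and synthetic events, purely for illustration.
    calib = lambda px: 1e4 / max(px, 1)        # px 100 -> 100 m, px 1000 -> 10 m
    stream = [(100, 5, 12, 1), (400, 7, 15, -1), (1000, 3, 18, 1)]
    for ev in events_in_band(stream, calib, 20.0, 200.0):
        print(ev)                              # keeps the 100 m and 25 m events
```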
The technique can be used in both active sensing modes (with its own illumination source) and passive modes (relying on ambient light). It can also be combined with LiDAR to improve range resolution and extend performance across different wavelengths. Early experiments show that it can detect atmospheric turbulence and signal variations over distances from a few meters to several kilometers, and prototype systems have demonstrated capturing these changes in real time.
The principle was introduced by Theodor Scheimpflug to correct perspective distortion in aerial photography. It states that when the image plane, lens plane, and subject plane intersect along a single common line, an entire tilted plane in the scene remains in sharp focus. It has been used in imaging and ophthalmology and is now being adapted for static monocular 3D sensing.
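Under an ideal thin-lens model the condition is easy to check numerically: imaging the points of a tilted object line through the lens equation 1/s + 1/s' = 1/f produces image points that remain collinear, and the object line, lens plane, and image line all pass through one common point (the shared intersection line seen edge-on). The short sketch below demonstrates this; the focal length and geometry are arbitrary example values.

```python
# Numerical illustration of the Scheimpflug condition with an ideal thin lens
# (2-D cut: lens at the origin, optical axis along z, lens plane at z = 0).
# Object points on a tilted line are imaged through the thin-lens equation;
# the resulting image points stay collinear, and the object line, lens plane,
# and image line all cross at the same point. All values are arbitrary examples.

f = 0.05        # focal length in metres (assumed)
u0 = 2.0        # object distance where the tilted plane crosses the axis (assumed)
slope_o = -4.0  # slope dz/dx of the tilted object line (assumed)


def image_of(x_o):
    """Image point (x, z) of the object point at lateral position x_o."""
    s = u0 - slope_o * x_o        # object distance of this point (it sits at z = -s)
    s_prime = s * f / (s - f)     # thin-lens equation: 1/s + 1/s' = 1/f
    x_i = -x_o * s_prime / s      # lateral magnification (image is inverted)
    return x_i, s_prime


pts = [image_of(x) for x in (-0.02, 0.0, 0.02, 0.05)]

# Collinearity check: the slope between consecutive image points is constant.
slopes = [(pts[i + 1][1] - pts[i][1]) / (pts[i + 1][0] - pts[i][0])
          for i in range(len(pts) - 1)]
print("image-line slopes:", [round(m, 6) for m in slopes])

# The object line crosses the lens plane (z = 0) at x = u0 / slope_o; the image
# line, extrapolated to z = 0, crosses at the same x -- the Scheimpflug condition.
x_i0, z_i0 = pts[1]
x_cross = x_i0 - z_i0 / slopes[0]
print("planes meet at x =", u0 / slope_o, "vs image line at x =", round(x_cross, 6))
```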
A study on this method and its remote sensing applications was presented at the SPIE Defense + Commercial Sensing 2025 conference. Current work focuses on improving performance for atmospheric monitoring and exploring other use cases. The method is still under development, but early results show it can reduce system size, weight, power consumption, and cost while offering an alternative approach to range sensing.