Falcon Neuro, a space-based event sensor, logs lightning from orbit with microsecond timing, offering a faster, lighter alternative to traditional satellite imagers.

A space experiment has shown a new way to record lightning. Researchers at Western Sydney University, with support from the U.S. Air Force Research Laboratory, flew an event-based vision sensor on the International Space Station for two years. The imager, called Falcon Neuro, tracked lightning activity from orbit with high temporal precision and low data volumes.
Most satellites use frame-based sensors that capture full images. GEO systems like the GOES Geostationary Lightning Mapper (GLM) use filtered CCDs to detect lightning at roughly 10 km resolution. Falcon Neuro uses microsecond event-based sensing in LEO with metre-scale detail, trading constant coverage for high speed and low data load.
Lightning is unpredictable, so the system does not record continuously; instead it detects changes in brightness, capturing each flash with minimal data. The package flew two modified sensors, one down-looking and one forward-looking, collecting visible and near-infrared light. Onboard software clusters the spatiotemporal events to render metre-scale flash footprints and timelines, which then need independent validation.
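As a rough illustration of the change-detection idea, the sketch below derives events from successive log-brightness frames using a single contrast threshold. The threshold value and the frame-based simulation are assumptions for clarity, not the Falcon Neuro firmware.

```
import numpy as np

def events_from_frames(prev_log, new_log, t, threshold=0.15):
    """Emit (x, y, t, polarity) events wherever log-brightness changed
    by more than the contrast threshold (illustrative value)."""
    diff = new_log - prev_log
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarity = np.sign(diff[ys, xs]).astype(int)  # +1 brighten, -1 dim
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

# Usage: feed successive log-intensity frames of a cloud-top scene.
rng = np.random.default_rng(0)
frame0 = np.log1p(rng.uniform(0, 1, (8, 8)))
frame1 = frame0.copy()
frame1[3, 4] += 0.5          # a sudden brightening, e.g. a lightning pulse
events = events_from_frames(frame0, frame1, t=1.25e-3)
print(events)                # -> [(4, 3, 0.00125, 1)]
```

Only the pixel that changed produces an event, which is why the data volume stays small even at microsecond timing.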
Each pixel fires when the cloud-top brightens or dims, producing a sparse stream of time-stamped events. Onboard clustering turns those events into a precise footprint and timeline of the flash, equivalent to 500 to 1000 frames per second yet needing only 3 to 4 Mbit/s to downlink, with locations cross-checked against ground RF networks.
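The clustering step can be pictured as grouping events that are close in both pixel space and time. The sketch below uses a greedy pass with illustrative distance and time thresholds; the actual onboard algorithm and its parameters are not detailed here.

```
from dataclasses import dataclass, field

@dataclass
class Cluster:
    xs: list = field(default_factory=list)
    ys: list = field(default_factory=list)
    ts: list = field(default_factory=list)

    def add(self, x, y, t):
        self.xs.append(x); self.ys.append(y); self.ts.append(t)

    def near(self, x, y, t, r_px, dt_s):
        # Compare against the most recent event in the cluster.
        return (abs(t - self.ts[-1]) <= dt_s and
                abs(x - self.xs[-1]) <= r_px and
                abs(y - self.ys[-1]) <= r_px)

def cluster_events(events, r_px=3, dt_s=2e-3):
    """Greedy spatiotemporal clustering of (x, y, t, polarity) events.
    Thresholds are illustrative, not Falcon Neuro's onboard values."""
    clusters = []
    for x, y, t, _ in sorted(events, key=lambda e: e[2]):
        for c in clusters:
            if c.near(x, y, t, r_px, dt_s):
                c.add(x, y, t)
                break
        else:
            c = Cluster(); c.add(x, y, t); clusters.append(c)
    return clusters

# Each cluster yields a footprint (pixel extent) and a timeline (time range).
flashes = cluster_events([(4, 3, 0.0010, 1), (5, 3, 0.0012, 1), (40, 9, 0.5, -1)])
for f in flashes:
    print(min(f.ts), max(f.ts), len(f.xs))
```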
The optics use a fast lens at infinity focus, neutral-density filters to prevent clipping, and an optional band-pass filter to reduce background. Clean optics help only if time and location are exact.
Timing and geolocation anchor each detection. A GPS-disciplined clock with a 1-PPS signal keeps absolute timing to 1 to 10 microseconds. Attitude and position come from IMU and GNSS on ground or aircraft, and from IMU, star tracker, and ephemeris in space.
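One way to picture the PPS-disciplined timing is as a conversion from sensor-clock ticks to absolute time anchored at the latest 1-PPS edge. The helper below is a hypothetical sketch; the 1 MHz tick rate and the function names are assumptions.

```
def to_utc(event_tick, pps_tick, pps_utc_s, tick_rate_hz=1e6):
    """Convert a sensor-clock tick to absolute UTC seconds, anchored to the
    most recent GPS 1-PPS edge (tick_rate_hz is an assumed 1 MHz counter)."""
    return pps_utc_s + (event_tick - pps_tick) / tick_rate_hz

# Example: PPS edge latched at tick 12_000_000, corresponding to 33342.0 s UTC.
t_utc = to_utc(event_tick=12_000_350, pps_tick=12_000_000, pps_utc_s=33342.0)
print(t_utc)  # 33342.00035 s -> 350 microseconds after the PPS second
```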
Trigger and exposure control protect dynamic range. Triggers based on luminance change or native event thresholds start captures without flooding storage. Daylight needs higher thresholds to avoid false events.
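A simple way to express threshold control is a schedule that raises the contrast threshold with background brightness, so sunlit cloud tops do not flood the sensor with false events. The values and the function below are illustrative assumptions, not flight settings.

```
def event_threshold(background_lux, night=0.10, day=0.35, knee_lux=5_000.0):
    """Illustrative contrast-threshold schedule: raise the per-pixel
    threshold as background brightness grows (assumed values only)."""
    if background_lux <= 0:
        return night
    scale = min(background_lux / knee_lux, 1.0)
    return night + (day - night) * scale

print(event_threshold(10))      # near the night-time threshold
print(event_threshold(50_000))  # clamped at the daylight threshold
```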
Processing shapes what you can downlink: the event stream is clustered, compressed, and written to a ring buffer. Event streams often fit in a few Mbit/s; high-speed frames can need hundreds.
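The downlink-shaping stage can be sketched as a fixed-capacity ring buffer of compressed event packets plus a bitrate estimate. The packet format, buffer size, and zlib compression here are stand-ins, not the flight software.

```
import zlib
from collections import deque

class EventRingBuffer:
    """Fixed-capacity ring buffer of compressed event packets: once full,
    the oldest packet is overwritten. Sizes here are illustrative."""
    def __init__(self, max_packets=256):
        self.packets = deque(maxlen=max_packets)

    def push(self, raw_bytes: bytes):
        self.packets.append(zlib.compress(raw_bytes))

    def bitrate_mbit_s(self, window_s: float) -> float:
        # Average downlink rate needed to drain the buffered packets.
        total_bits = 8 * sum(len(p) for p in self.packets)
        return total_bits / window_s / 1e6

buf = EventRingBuffer()
for _ in range(100):
    buf.push(b"x,y,t,polarity\n" * 500)   # stand-in for an event packet
print(round(buf.bitrate_mbit_s(window_s=1.0), 2), "Mbit/s")
```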








