Video recording of ultrafast phenomena using a detector array based on CCD or CMOS technology is fundamentally limited by the sensor's on-chip storage capacity and data-transfer speed. To circumvent this problem, the most practical approach is to use a streak camera. However, the resultant image is normally one-dimensional—only a line of the scene can be seen at a time. Acquiring a two-dimensional image therefore requires mechanical scanning across the entire field of view. This requirement severely restricts the applicable scenes, because the event itself must be repetitive.
To overcome these limitations, we have developed a new computational ultrafast imaging method, referred to as compressed ultrafast photography (CUP), which can capture two-dimensional dynamic scenes at up to 100 billion frames per second. Based on the concept of compressed sensing, CUP works by encoding the input scene with a pseudo-random binary pattern in the spatial domain and then shearing the resultant image in a streak camera with a fully opened entrance slit. Image reconstruction amounts to solving the inverse problem of these two processes. Given sparsity in the spatiotemporal domain, the original event datacube can be reasonably estimated by employing a two-step iterative shrinkage/thresholding algorithm.
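The two operations described above—spatial encoding with a binary mask followed by temporal shearing and integration on the detector—can be sketched as follows. This is a minimal toy model with hypothetical array sizes, not the authors' implementation; the variable names (`scene`, `mask`, `measurement`) and the one-pixel-per-frame shear are illustrative assumptions. The `soft_threshold` function illustrates only the shrinkage operator that iterative shrinkage/thresholding methods apply at each step, not the full reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ny, nt = 8, 8, 4                 # toy spatial and temporal sizes (assumed)

scene = rng.random((nt, ny, nx))     # event datacube I(x, y, t)
mask = rng.integers(0, 2, (ny, nx))  # pseudo-random binary pattern C(x, y)

# Forward model: each temporal frame is masked, shifted by t pixels
# along y (the shear imposed by the streak camera's sweep), and the
# shifted frames are summed into a single 2-D snapshot.
measurement = np.zeros((ny + nt - 1, nx))
for t in range(nt):
    measurement[t:t + ny, :] += mask * scene[t]

def soft_threshold(v, lam):
    # Shrinkage operator used inside iterative shrinkage/thresholding
    # algorithms: shrink magnitudes toward zero by lam, clip at zero.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
```

Reconstruction then seeks the sparsest datacube consistent with `measurement` by iterating a gradient step on the forward model with the shrinkage operator above.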
To demonstrate CUP, we imaged light reflection, refraction, and racing in two different media (air and resin). Our technique, for the first time, enables video recording of photon propagation at a temporal resolution down to tens of picoseconds. Moreover, to further expand CUP’s functionality, we added a color separation unit to the system, thereby allowing simultaneous acquisition of a four-dimensional datacube (x,y,t,λ), where λ is wavelength, within a single camera snapshot.