Beam and image steering by Micro Electro Mechanical System (MEMS) spatial light modulators decouples the trade-offs between resolution, field of view, and the size of displays and optics that are a common challenge in optical design. We give an overview of solid-state lidar and augmented reality display engines employing MEMS SLMs: the Texas Instruments Digital Micromirror Device and Phase Light Modulator.
Enabling all-day-wearable augmented reality (AR) displays requires compact engineering solutions that still satisfy requirements such as a wide field of view (FOV) and high resolution. By using a Digital Micromirror Device (DMD) and a pulsed laser in synchronization, we perform diffractive image steering, which decouples the FOV of the projected image from the display size without sacrificing image resolution. This approach reduces the lateral extent of the display panel several-fold while retaining image resolution. The diffractive-steering-enabled FOV expansion by the DMD, paired with a prism array placed at the exit pupil of the projection lens, maintains a small form factor by redistributing part of the volume from the projector engine to the image transfer optics. Combining diffractive image steering with the prism array, we demonstrate a 5x increase in field of view. This approach reduces the number of pixels required to maintain high resolution across a wide FOV, making it suitable for eventual integration into small-form-factor head-mounted displays.
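As a rough illustration of how diffractive image steering relaxes this trade-off (a back-of-envelope relation, with symbols assumed here rather than taken from the paper), if the projector tile subtends a field of view FOV_tile and the image is steered into N diffraction orders, then

\[
\mathrm{FOV}_{\mathrm{total}} \approx N \cdot \mathrm{FOV}_{\mathrm{tile}},
\]

so the angular resolution per pixel within each tile is unchanged while the composite FOV grows with the number of addressable orders, consistent with the 5x expansion reported here.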
Micro Electro Mechanical System (MEMS) spatial light modulators enable adaptive and fast beam and image steering. For lidar applications, the Texas Instruments Phase Light Modulator (TI-PLM) is paired with real-time calculation and display of Computer Generated Holograms (CGHs) via CUDA-OpenGL interoperability, assisted by a YOLOv4-tiny network model for object detection and recognition. This real-time object recognition, CGH calculation, and display framework replaces conventional raster scanning with camera-driven, foveated beam steering at a beam scan rate beyond the frame rate of the TI-PLM. For Augmented Reality (AR) applications, the same framework is used for image steering based on eye-gaze information. With the Texas Instruments Digital Micromirror Device (TI-DMD), the image is steered into a part of the field of view following the movement of the eye. The diffractive image steering enabled by the TI-DMD increases the FOV without sacrificing the resolution of the displayed image.
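As an illustration of the camera-driven steering step (a standard pinhole-camera relation; the symbols are assumptions, not values from this work), a detection centered at pixel (u, v) in a camera with focal length f in pixels and principal point (c_x, c_y) maps to steering angles

\[
\theta_x = \arctan\frac{u - c_x}{f}, \qquad \theta_y = \arctan\frac{v - c_y}{f},
\]

which are then encoded into the CGH displayed on the TI-PLM so the beam dwells on the detected object instead of raster scanning the full FOV.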
We demonstrated a real-time lidar system applying a Digital Micromirror Device (DMD) as a field-of-view (FOV) expander for a lidar receiver employing a 2D Multi-Pixel Photon Counter (MPPC). By temporally synchronizing the transitional state of the micromirrors with the photons returning to the lidar, the receiver FOV is diffractively steered toward the targets' direction, enabled by a nanosecond pulse laser. With a nanosecond 905 nm laser transmitter, time-of-flight (ToF) lidar images were captured across seven diffraction orders, spanning an expanded 35-degree full-field-of-view scanning range.
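From the reported numbers, the seven diffraction orders tile the 35-degree full field of view, so adjacent orders are separated by roughly

\[
\Delta\theta \approx \frac{35^{\circ}}{7} = 5^{\circ},
\]

and the receiver is pointed at a given order by choosing the delay of the laser pulse relative to the micromirror transition.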
Micro Electro Mechanical System based Spatial Light Modulators (MEMS-SLMs) enable a unique capability, “just-in-time photon delivery”: steering beams and images to where and when they are needed. This beam and image steering solves challenges commonly found in both lidar and AR optical engines dominated by classical trade-offs, such as those between image FOV, resolution, and SLM size or optical-engine form factor. We transformed the Texas Instruments Digital Micromirror Device (TI-DMD) into a novel diffractive beam and image steering device. The TI-DMD is known as a binary spatial light modulator: the micromirrors' tilt redirects light into on- or off-states. Without modifying the TI-DMD, employing nanosecond pulsed illumination synchronized to the transitional movement of the micromirrors between the on- and off-states turns the DMD into a diffractive beam and image steering device.
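For context, the steering directions available in this mode follow the standard grating equation (a textbook relation, not device-specific parameters from this work): for micromirror pitch p, wavelength λ, and incidence angle θ_i, the diffraction orders lie at

\[
\sin\theta_m = \sin\theta_i + \frac{m\lambda}{p}, \qquad m = 0, \pm 1, \pm 2, \ldots,
\]

and the instantaneous mirror tilt at the moment the nanosecond pulse arrives acts as a blaze that selects which order receives most of the energy.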
We demonstrated a real-time lidar system that utilizes a Digital Micromirror Device (DMD) as a field-of-view (FOV) expander and a 2D Multi-Pixel Photon Counter (MPPC) as the lidar sensor. By synchronizing the dynamic transition of the DMD micromirrors between on- and off-states with the MPPC and a nanosecond pulse laser, the receiver FOV is diffractively steered to the desired direction by timing the delay between the micromirror transition and the laser pulse. The DMD-MPPC lidar can capture up to 7 diffraction orders of high-resolution geospatial data. By applying this laser beam steering technique, the system spans over 35 degrees of FOV, a tenfold expansion compared to the FOV of a single lidar detector. In this work, as a preliminary demonstration toward diffractive FOV expansion, we present high-resolution lidar images captured while the DMD switches between the on and off states, and we performed distance-resolution testing to validate the functionality of the DMD-MPPC flash lidar system.
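For reference, the distance-resolution test rests on the standard time-of-flight relation (not specific to this system):

\[
d = \frac{c\,\Delta t}{2}, \qquad \delta d = \frac{c\,\delta t}{2},
\]

where Δt is the round-trip delay of the returned pulse and δt the timing resolution of the detection chain; a 1 ns timing resolution, for example, corresponds to about 15 cm in range.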
Resonant MEMS mirrors have been recognized as one of the solid-state laser beam steering (LBS) solutions for AR displays and lidar. Their large angular throw achieves a scanning field of view (FOV) of tens of degrees at resonant frequencies exceeding tens of kHz. In LBS, beam area is also critical, especially for lidar that must reach targets at far distances; having both a large angular throw and a large beam area, i.e., a large etendue, makes it feasible to satisfy both requirements simultaneously. For a Time-of-Flight (ToF) lidar transmitter, we proposed and experimentally characterized a large-etendue LBS architecture employing a 2-dimensional MEMS mirror and diffractive LBS by a Digital Micromirror Device (DMD). The beam area of the MEMS resonant mirror is matched to the DMD with relay optics, while the DMD diffractively increases the etendue by a factor of 5, equal to the number of diffraction orders supported by the DMD. Along with beam steering, we address the timing of the laser pulses relative to the MEMS mirror's movement to enable raster scanning, which eliminates the re-sorting of ToF data required for LBS employing a Lissajous pattern.
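In terms of etendue (the standard definition is used here only to make the scaling explicit), the product of beam area and solid angle,

\[
G = A\,\Omega,
\]

is preserved by the relay optics that match the MEMS mirror beam to the DMD aperture, while steering into 5 diffraction orders multiplies the addressable solid angle, and hence the transmitter etendue, by the same factor of 5.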
CUDA-OpenGL interoperability drastically reduces the computational time for CGH calculation and display on spatial light modulators via the HDMI display channel. The fast calculation method enables on-the-fly diffractive beam steering by a MEMS-based phase light modulator, combined with YOLOv4-tiny object recognition for AI-based dynamic beam tracking of the object of interest.
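A minimal sketch of this kind of CUDA-OpenGL interoperability pipeline, assuming an OpenGL pixel buffer object already created for the SLM window and steering angles supplied by the tracker; the kernel writes a simple linear phase ramp (a common beam-steering CGH), not the authors' actual hologram computation, and the pitch and wavelength values are illustrative.

```cuda
#include <cuda_gl_interop.h>

// Kernel: write an 8-bit blazed phase ramp that steers the first diffraction
// order toward (theta_x, theta_y). pitch_um and lambda_um are illustrative
// SLM pixel pitch and wavelength, not device values from this work.
__global__ void phaseRampKernel(unsigned char* phase, int width, int height,
                                float theta_x, float theta_y,
                                float pitch_um, float lambda_um)
{
    const float TWO_PI = 6.28318531f;
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // phi = (2*pi/lambda) * (x*pitch*sin(theta_x) + y*pitch*sin(theta_y)), wrapped to [0, 2*pi)
    float phi = TWO_PI / lambda_um *
                (x * pitch_um * sinf(theta_x) + y * pitch_um * sinf(theta_y));
    phi = fmodf(phi, TWO_PI);
    if (phi < 0.0f) phi += TWO_PI;

    // Quantize to the 8-bit gray levels sent over the display channel.
    phase[y * width + x] = (unsigned char)(phi / TWO_PI * 255.0f);
}

// Host side: register the OpenGL pixel buffer once, then each frame map it,
// compute the hologram directly in GPU memory, and unmap it so OpenGL can
// present it to the SLM over the HDMI display channel without a host round trip.
void updateHologram(unsigned int pbo, int width, int height,
                    float theta_x, float theta_y)
{
    static cudaGraphicsResource* resource = nullptr;
    if (resource == nullptr)
        cudaGraphicsGLRegisterBuffer(&resource, pbo, cudaGraphicsRegisterFlagsWriteDiscard);

    cudaGraphicsMapResources(1, &resource, 0);
    unsigned char* devPtr = nullptr;
    size_t size = 0;
    cudaGraphicsResourceGetMappedPointer((void**)&devPtr, &size, resource);

    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    phaseRampKernel<<<grid, block>>>(devPtr, width, height, theta_x, theta_y,
                                     8.0f /* pitch_um */, 0.905f /* lambda_um */);

    cudaGraphicsUnmapResources(1, &resource, 0);
}
```

In such a loop, the YOLOv4-tiny detector would update (theta_x, theta_y) each camera frame, and because the hologram is written into the GL-registered buffer on the GPU, no copy back to host memory is needed before display.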
Laser beam steering is an essential function for lidar. A phase Spatial Light Modulator (SLM) provides fast, random-access beam steering but suffers from a limited FOV and side lobes. In this paper, we present a DMD (Digital Micromirror Device)-PLM hybrid beam steering concept that features high-resolution, large-FOV, side-lobe-free beam steering.