Chapter V: Optical phased array transceiver
5.1 Introduction
Wavefront manipulation capability, high-sensitivity reception, low fabrication cost, and a compact form factor make integrated photonics a good candidate for realizing 3D imagers. Moreover, its potential application in LiDAR (Light Detection And Ranging) [48] has attracted a great deal of interest recently due to the increasing demand for 3D photography, self-driving cars, and the next generation of smartphones. Conventional LiDARs, operating based on mechanical components, are bulky and expensive. Integration of nano-photonic projection/reception systems on a chip, capable of adaptive beamforming, holds a promising future for these applications due to their compact form factor, low cost, and high yield enabled by integrated solid-state platforms.
¹The work presented in this chapter was done in collaboration with Aroutin Khachaturian.
Figure 5.1: Conceptual schematic of 3D imaging with a LiDAR system.
The output data of a 3D imager includes not only a picture of the scene being imaged, but also the distance of each pixel in the picture from the imager. A laser-based 3D imager works by introducing temporal variations to a coherent light wave which is used for illuminating the scene. The illuminating wave reflects from the objects, and the imager determines the distance by measuring the round-trip time of the wave, considering the speed of light in free space. The round-trip time of the reflected wave can be calculated using knowledge of the history of the temporal modulation waveform. To perform imaging, the full scene can be flashed with the modulated laser light, followed by recording the reflections from all the points utilizing a lens and a detector array on the receiver side. However, due to the power limits of the available laser sources (those which are feasible to use and suitable for these applications) and eye safety restrictions, one point at a time is illuminated by forming a beam, and the beam is scanned over the field-of-view to image the whole scene. This provides larger instantaneous optical power at each point and yields a higher SNR for the detection process. Figure 5.1 shows a conceptual schematic of such a 3D imager in which the transmitter (TX) illuminates a point with an optical beam and the receiver (RX) captures the reflection. The beam is steered over the FOV and measurements of the corresponding reflections are put together to form an image. The time delay (round-trip time) between the transmitted and received signals is
\[
t_d = \frac{2R}{c}, \tag{5.1}
\]
in which $R$ is the distance between the system and the object, and $c$ is the speed of light. A large variety of waveforms can be used to modulate the illumination light, such as pulsing and frequency chirping. For pulse modulation, a periodic sequence of short pulses with engineered amplitude (and even phase/frequency) is used to illuminate the scene. While post-laser optical modulators can be used to create pulses by gating the output light of the laser, the laser itself is usually designed to generate short pulses to avoid the optical power loss caused by the gating function
of the modulator. In contrast to pulse modulation, which is mainly an amplitude modulation scheme, frequency chirping varies the frequency of the laser light by linearly sweeping it between two frequencies periodically. A simple frequency-chirped waveform is
\[
x(t) =
\begin{cases}
A\cos\!\left(\omega_0 t + \frac{1}{2}\gamma t^2 + \varphi\right), & 0 < t < T \\[4pt]
x(t+T) = x(t). &
\end{cases}
\tag{5.2}
\]
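To make the sawtooth chirp of equation (5.2) concrete, the short Python sketch below generates one period of the waveform and its linearly ramping instantaneous frequency. The start frequency, bandwidth, and period used here are scaled-down, assumed values chosen so the waveform can be sampled numerically; they are not parameters of an actual LiDAR source, where $\omega_0$ is an optical frequency.

```python
import numpy as np

# Illustrative, scaled-down parameters (placeholders, not actual system values)
f0 = 1.0e6          # starting frequency of the sweep, Hz (stands in for omega_0 / 2pi)
B = 0.5e6           # total frequency excursion of the chirp, Hz
T = 1.0e-3          # chirp period, s
gamma = 2 * np.pi * B / T   # chirp slope in rad/s^2, so the frequency sweeps from f0 to f0 + B

fs = 20e6                     # sampling rate, well above f0 + B
t = np.arange(0, T, 1 / fs)   # one chirp period

# Equation (5.2): x(t) = A*cos(omega_0*t + 0.5*gamma*t^2 + phi), repeated with period T
A, phi = 1.0, 0.0
x = A * np.cos(2 * np.pi * f0 * t + 0.5 * gamma * t**2 + phi)

# The instantaneous frequency, (1/2pi)*d(phase)/dt = f0 + (gamma/2pi)*t, is a linear ramp
f_inst = f0 + (gamma / (2 * np.pi)) * t
print(f"sweep: {f_inst[0]/1e6:.2f} MHz -> {f_inst[-1]/1e6:.2f} MHz over {T*1e3:.1f} ms")
```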
While the optical frequency of the waveform is chirped in equation (5.2), an alternative modulation method is to apply a frequency chirp in the RF domain and employ the optical signal as a carrier. In this case, the chirped signal is added to the optical carrier with an amplitude modulator following the laser, and the modulated laser waveform is
\[
x(t) = A(t)\cos(\omega_0 t + \varphi), \qquad
A(t) =
\begin{cases}
A_0\!\left(1 + \alpha\cos\!\left(\omega_m t + \frac{1}{2}\gamma t^2 + \varphi\right)\right), & 0 < t < T \\[4pt]
A(t+T) = A(t). &
\end{cases}
\tag{5.3}
\]
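A minimal sketch of the RF-chirp variant in equation (5.3) is shown below, reusing the chirped-phase form from the previous sketch, except that the chirp now rides on the envelope rather than on the optical phase. The stand-in carrier frequency, modulation index, and chirp parameters are assumed, scaled-down values; a real optical carrier (hundreds of THz) cannot be sampled directly in this way.

```python
import numpy as np

# Assumed, scaled-down values chosen only so the waveform can be sampled numerically
f_carrier = 50e6      # stand-in for the optical carrier frequency omega_0 / (2pi), Hz
f_m = 1.0e6           # start frequency of the RF chirp (omega_m / (2pi)), Hz
B, T = 0.5e6, 1.0e-3  # RF chirp bandwidth (Hz) and period (s)
gamma = 2 * np.pi * B / T
alpha, A0, phi = 0.5, 1.0, 0.0   # modulation index and amplitude (assumed)

fs = 400e6
t = np.arange(0, T, 1 / fs)

# Equation (5.3): the chirp appears in the envelope A(t), not in the optical phase
A_t = A0 * (1 + alpha * np.cos(2 * np.pi * f_m * t + 0.5 * gamma * t**2 + phi))
x = A_t * np.cos(2 * np.pi * f_carrier * t + phi)
```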
In addition to these chirp signals with a saw-tooth shape, triangular chirps and chirps with a continuous phase at cycle transitions [130] can be used to achieve better spectral properties.
Modulation and signal preparation are performed in the back-end processing of the transmitter. The front-end of the system performs conditioning and engineering of the wavefront for illumination, which mainly includes forming a well-shaped beam and steering it. Among the various methods of beamforming and steering discussed in section 2.3, electronic beamforming is desirable due to its reliability and faster scanning speed. While a silicon photonic platform is potentially capable of integrating all the system parts on the same chip, here we only consider front-end wavefront processing on the chip. Using an array of nano-photonic antennas, an engineered wavefront can be generated for illuminating the scene, which makes an optical phased array a candidate technology.
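As a rough illustration of electronic beam steering with an optical phased array, the sketch below computes the linear phase gradient that points the main lobe of a uniform one-dimensional array and checks the result with its array factor. The wavelength, element count, and pitch are assumed example values for a generic linear array, not the parameters of the fabricated transceiver.

```python
import numpy as np

# Assumed example parameters; these are not the dimensions of the fabricated chip
wavelength = 1.55e-6        # operating wavelength, m
d = 2.0e-6                  # antenna pitch, m
n_elem = 32                 # number of radiators in a 1D array
theta0 = np.radians(10.0)   # desired steering angle off broadside

# Electronic beam steering: a linear phase gradient across the aperture,
# phi_n = -n * k * d * sin(theta0), points the main lobe at theta0
k = 2 * np.pi / wavelength
n = np.arange(n_elem)
phase_settings = (-n * k * d * np.sin(theta0)) % (2 * np.pi)   # phases handed to the phase shifters

# Far-field array factor, to verify where the steered beam actually points
angles = np.radians(np.linspace(-30, 30, 2001))
af = np.abs(np.exp(1j * (np.outer(np.sin(angles), n * k * d) + phase_settings)).sum(axis=1))
print(f"array-factor peak at {np.degrees(angles[np.argmax(af)]):+.2f} degrees")
```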
The reflected light is captured with an optical receiver and is compared to the signal in the transmission path (as a reference) to extract the time delay it experienced.
In the case of amplitude modulation, such as pulse or amplitude chirp, a photo-detector can be used to remove the optical carrier and extract the modulation signal through direct detection. Therefore, the calculation of the round-trip time can happen in the electrical domain. For the case of frequency/phase modulation, a simple photo-detector cannot extract the variations, and other techniques such as heterodyne detection are required for down-converting the optical signal to the electrical domain for processing. However, a heterodyne mixing method can be used for amplitude modulations as well to achieve a higher SNR for detection.
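To illustrate the direct-detection case for amplitude modulation, the following simplified sketch (using the same assumed waveform as above) treats the photodiode as a square-law detector whose output is averaged over the optical-carrier period, leaving only the chirped envelope in the electrical domain. The responsivity value is an assumption for illustration.

```python
import numpy as np

# Same assumed, scaled-down waveform as the previous sketch (equation (5.3))
f_carrier, f_m, B, T = 50e6, 1.0e6, 0.5e6, 1.0e-3
gamma = 2 * np.pi * B / T
alpha, A0 = 0.5, 1.0
fs = 400e6
t = np.arange(0, T, 1 / fs)
A_t = A0 * (1 + alpha * np.cos(2 * np.pi * f_m * t + 0.5 * gamma * t**2))
field = A_t * np.cos(2 * np.pi * f_carrier * t)

# Square-law photodetection: the photocurrent is proportional to the optical power |field|^2,
# so the optical carrier is removed and only its envelope information survives
responsivity = 1.0                 # A/W, assumed value
i_pd = responsivity * field**2

# Average over an integer number of carrier periods to suppress the 2*f_carrier ripple;
# what remains is ~0.5 * responsivity * A(t)^2, i.e. the chirp, now in the electrical domain
window = int(2 * fs / f_carrier)   # two carrier periods (16 samples here)
i_baseband = np.convolve(i_pd, np.ones(window) / window, mode="same")
```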
The comparison process between the reference signal and the received signal can be performed by calculating the correlation of these two signals. While various methods can be used to determine this correlation in the optical or electronic domains, it is also possible to integrate the correlation process and the photo-detection by using the modulated light in the transmitter as the reference for heterodyne detection. As an example, for the frequency chirping of equation (5.2), the simplified output of the double-balanced heterodyne detection is [23]
\[
I_{out} = \mathrm{Het}\{x(t),\, x(t - t_d)\} = 2\mathcal{R}A_0^2 + 2A_0^2\cos(\gamma t_d t + \varphi), \tag{5.4}
\]
in which $\mathcal{R}$ is the responsivity of the photo-diode. Therefore, the mixer output current is a sinusoidal wave and its frequency is an indication of the time delay $t_d$. It should be noted that the periodicity of the modulation signal is not considered here, which leads to a periodic output signal with an instantaneous frequency of $\gamma t_d$ while its Fourier spectrum includes delta functions at integer multiples of $1/T$. The heterodyne detection mixer can perform as a correlator for the amplitude chirp of equation (5.3) as well, although some extra spectral content will be generated at the output current due to the varying amplitude, which needs to be filtered carefully.
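The sketch below emulates this dechirping step numerically: the mixer output is modeled directly by the simplified form of equation (5.4), an FFT peak gives the beat frequency, and the round-trip delay and range follow from $f_{beat} = \gamma t_d / 2\pi$ and equation (5.1). The chirp bandwidth, period, target distance, and responsivity are assumed, scaled-down illustrative values, not the parameters of the actual system.

```python
import numpy as np

c = 3e8                   # speed of light, m/s
R_true = 30.0             # assumed example target distance, m
t_d = 2 * R_true / c      # round-trip delay from equation (5.1): 200 ns

# Assumed, scaled-down chirp parameters (not the real system values)
B = 100e6                 # chirp bandwidth, Hz
T = 100e-6                # chirp period, s
gamma = 2 * np.pi * B / T # chirp slope, rad/s^2
fs = 50e6                 # sample rate of the dechirped (beat) signal
t = np.arange(0, T, 1 / fs)

# Mixer output following the simplified form of equation (5.4): a DC term plus a
# beat tone whose angular frequency gamma*t_d encodes the round-trip delay
resp, A0, phi = 1.0, 1.0, 0.0
i_out = 2 * resp * A0**2 + 2 * A0**2 * np.cos(gamma * t_d * t + phi)

# Estimate the beat frequency from the FFT peak (DC removed first) and map it back to range
ac = i_out - i_out.mean()
spectrum = np.abs(np.fft.rfft(ac * np.hanning(len(ac))))
f_beat = np.fft.rfftfreq(len(ac), 1 / fs)[np.argmax(spectrum)]
t_d_est = 2 * np.pi * f_beat / gamma         # invert f_beat = gamma * t_d / (2*pi)
print(f"estimated range: {c * t_d_est / 2:.2f} m (true value {R_true} m)")
```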
While the operation principle of an optical 3D imager is well-known, utilization of integrated photonics as the system front-end (on both the transmitter and receiver sides) puts forth many challenges. These challenges are accentuated when there are stringent performance requirements, such as the capability of detecting an object at 200 m distance with at least 3 cm resolution in the presence of nearby objects at 1 m that reflect a much stronger signal. In the next section, a more detailed discussion of the challenges in such systems is presented, and in section 5.3 the proposed integrated optical transceiver system that overcomes these challenges is explained.