Image and Signal Processing Methods

Three-dimensional interferometric inverse synthetic aperture radar imaging of maneuvering target based on the joint cross modified Wigner-Ville distribution

Author Affiliations
Qian Lv, Tao Su, Jibin Zheng, Jiancheng Zhang

Xidian University, National Laboratory of Radar Signal Processing, 2 South Taibai Road, Xi’an 710071, China

J. Appl. Remote Sens. 10(1), 015007 (Jan 28, 2016). doi:10.1117/1.JRS.10.015007
History: Received July 31, 2015; Accepted December 23, 2015

Open Access

Abstract.  Inverse synthetic aperture radar (ISAR) can achieve high-resolution two-dimensional images of maneuvering targets. However, due to the indeterminate relative motion between radar and target, ISAR imaging does not provide the three-dimensional (3-D) position information of a target and suffers from great difficulty in target recognition. To tackle this issue, a 3-D interferometric ISAR (InISAR) imaging algorithm based on the joint cross modified Wigner-Ville distribution (MWVD) is presented to form 3-D images of maneuvering targets. First, we form two orthogonal interferometric baselines with three receiving antennas to establish an InISAR imaging system. Second, after the uniform range alignment and phase adjustment, the joint cross MWVD is applied to every range cell of each antenna pair to separate the scatterers while preserving the phase that contains the position information of each scatterer. Finally, the 3-D images of the target can be reconstructed directly from the distribution. Simulation results demonstrate the validity of the proposed algorithm.


Inverse synthetic aperture radar (ISAR) has been proven to be a powerful signal processing tool for imaging of moving targets in military and civilian applications.1–3 In ISAR imaging, finer range resolution can be obtained by transmitting larger-bandwidth signals, while the cross-range resolution can be improved by wider aspect observations. Generally, wide aspect observations can be obtained by performing long-time observation with a monostatic ISAR or by acquiring multiaspect observations with multiple radar receiver configurations.4 In other words, multiaspect observations can be utilized to form a higher-resolution two-dimensional (2-D) ISAR image as well as to perform three-dimensional (3-D) reconstruction of a target. Reference 4 studies the parameter estimation of ISAR imaging with multiaspect observations, which further extends their application. However, in general, the ISAR image is just a 2-D range-Doppler projection of the 3-D target's reflectivity function onto an image plane,2,3,5,6 which is mainly determined by the motion of the target with respect to the radar line of sight (LOS) and cannot be predicted. Thus the conventional 2-D ISAR image no longer meets the increasing demands of target recognition and target identification.

Recently, to further improve the ability of target recognition, especially for a noncooperative target, many algorithms have been introduced for different imaging modes. Reference 7 puts forward a data-level fusion method with multiaspect observations, which can obtain the target's spatial structure information with known imaging geometry. In contrast to 2-D ISAR images, given the capability of providing the target's structure information, 3-D ISAR imaging techniques for maneuvering targets have attracted wide attention in applications such as target identification and target recognition.8,9 Much literature covers 3-D ISAR imaging with various algorithms. The algorithms in Refs. 10 and 11 require a 2-D antenna array to generate the 3-D images of a target; however, multiple antennas inevitably result in great system complexity. References 12–16 present the interferometric ISAR (InISAR) imaging technique, which combines interferometric processing and ISAR processing to form 3-D images. The InISAR imaging technique has notable advantages over the aforementioned techniques in both system structure and signal processing; therefore, it has attracted the attention of many researchers.

Nevertheless, in order to employ interferometry via each antenna's ISAR images, the InISAR imaging techniques in Refs. 12 and 15 apply a linear time-frequency transform followed by a searching procedure, but neglect the bilinear time-frequency transform with its higher time-frequency resolution. Different from these InISAR techniques, this paper presents a 3-D InISAR imaging algorithm for maneuvering targets based on the joint cross modified Wigner-Ville distribution (MWVD). Three antennas forming two orthogonal interferometric baselines are located in the same plane orthogonal to the LOS, and a uniform range alignment and phase adjustment are implemented together on the three antennas' echo signals to keep the coherence among them. The joint cross MWVD of the data acquired from each pair of antennas along one baseline is then applied to each range cell, and the 3-D positions of all scatterers can be solved directly from the phase information preserved in the distribution, where each scatterer is distinctly separated. In this way, the 3-D images of the maneuvering target are obtained.

The remainder of this paper is organized as follows. The InISAR system model and signal format are described in Sec. 2. In Sec. 3, the joint cross MWVD and its application are discussed in detail. Section 4 gives the analyses of the cross-terms suppression and the computational cost. In Sec. 5, a 3-D InISAR imaging algorithm is proposed based on joint cross MWVD. Finally, the simulation results of the presented algorithm and the conclusion are given in Secs. 6 and 7.

The InISAR system in Fig. 1 is based on the assumption that the model approximates real application scenarios. The proposed model consists of three antennas located at points A, B, and C, respectively, and XYZ defines a Cartesian coordinate system with the origin O at the location of antenna A. In order to achieve 3-D images of the target, these antennas have to lie on a horizontal and a vertical baseline, respectively. Antenna A, doubling as both a transmitter and a receiver, is placed at the origin; the LOS is the Y axis, and the receiving antennas B and C are located on the X axis and Z axis with the coordinates (L,0,0) and (0,0,L), respectively.

Fig. 1: Geometry in the InISAR system.

In practical applications, a maneuvering target will produce rotational motion, represented by the rotation vector \omega_T, whose projection onto the plane perpendicular to the LOS is called the effective rotation vector \omega_e. The point O is assumed to be the autofocus center for the three receivers during the whole observation time, as will be discussed in Sec. 3.

Assume the transmitted linear frequency modulation (LFM) signal takes the following form:

s(t) = \exp\left[ j2\pi\left( f_c t + \frac{1}{2}\mu t^2 \right) \right], \quad |t| \le \frac{T_s}{2}, \quad (1)

where t, T_s, f_c, and \mu denote the fast time, the pulsewidth, the carrier frequency, and the chirp rate (CR), respectively. After pulse compression, the echo signal at the receiver \Gamma \in \{A, B, C\} from the scatterer P(x_P, y_P, z_P) can be represented as

s_{\Gamma P}(t, t_m) = \delta_P B \operatorname{sinc}\left\{ B\left[ t - \frac{R_{AP}(t_m) + R_{\Gamma P}(t_m)}{c} \right] \right\} \exp\left[ j2\pi \frac{R_{AP}(t_m) + R_{\Gamma P}(t_m)}{\lambda} \right], \quad (2)

where t_m is the slow time, \delta_P is the amplitude, B is the transmitted signal bandwidth, and \lambda = c/f_c is the wavelength. R_{\Gamma P}(t_m) denotes the distance from the antenna \Gamma \in \{A, B, C\} to the scatterer P, which can be expressed as

R_{AP}(t_m) = \left\{ [x_p + \Delta R_{xP}(t_m)]^2 + [y_p + \Delta R(t_m) + \Delta R_{yP}(t_m)]^2 + [z_p + \Delta R_{zP}(t_m)]^2 \right\}^{1/2}, \quad (3)

R_{BP}(t_m) = \left\{ [x_p + L + \Delta R_{xP}(t_m)]^2 + [y_p + \Delta R(t_m) + \Delta R_{yP}(t_m)]^2 + [z_p + \Delta R_{zP}(t_m)]^2 \right\}^{1/2} \approx R_{AP}(t_m) + \frac{2[x_p + \Delta R_{xP}(t_m)]L + L^2}{2R_{AP}(t_m)}, \quad (4)

R_{CP}(t_m) = \left\{ [x_p + \Delta R_{xP}(t_m)]^2 + [y_p + \Delta R(t_m) + \Delta R_{yP}(t_m)]^2 + [z_p + L + \Delta R_{zP}(t_m)]^2 \right\}^{1/2} \approx R_{AP}(t_m) + \frac{2[z_p + \Delta R_{zP}(t_m)]L + L^2}{2R_{AP}(t_m)}, \quad (5)
where \Delta R(t_m) is the range translation quantity, which is the same for all scatterers on the target, and \Delta R_{xP}(t_m), \Delta R_{yP}(t_m), and \Delta R_{zP}(t_m) are the range displacements due to the rotation of the target with respect to the autofocus center O, as shown in Fig. 1.
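The far-field approximation in Eqs. (4) and (5) can be checked numerically. The sketch below uses hypothetical geometry values chosen within the bounds discussed later in this section (target within 60 m, range 10 km, baseline L = 1 m) and compares the exact B-antenna range with its second-order expansion:

```python
import numpy as np

# Hypothetical geometry (illustrative values within the text's bounds)
L = 1.0                                  # baseline length (m)
x, y, z = 30.0, 10_000.0, 12.0           # scatterer position (m)
dx, dy, dz, dR = 1.5, -0.8, 2.0, 3.0     # rotational/translational displacements (m)

# Exact ranges, Eqs. (3) and (4)
R_AP = np.sqrt((x + dx)**2 + (y + dR + dy)**2 + (z + dz)**2)
R_BP = np.sqrt((x + L + dx)**2 + (y + dR + dy)**2 + (z + dz)**2)

# Second-order approximation, right-hand side of Eq. (4)
corr = (2*(x + dx)*L + L**2) / (2*R_AP)
R_BP_approx = R_AP + corr

err = abs(R_BP - R_BP_approx)            # Taylor remainder of the expansion
```

For these values the correction term `corr` is a few millimeters and the remainder `err` is far below a micrometer, supporting the claim that a single range alignment function can serve all three receivers.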

Obviously, it is the R_{\Gamma P}(t_m) in Eq. (2) that results in the range migration (the translational range migration and the rotational range migration) and the Doppler frequency shift (induced by the translational motion and the rotational motion). As in ISAR imaging, in order to achieve 3-D InISAR images, the motion compensation must first be achieved without destroying the coherence among the three receivers. However, notice that the terms relevant to range alignment differ among the three receivers owing to the second terms in Eqs. (4) and (5). Consider, under the far-field conditions, that the target size does not exceed 60 m, and that the distance from the radar to the target and the baseline length are no less than 10,000 m and 1 m, respectively. Also, assume that the autofocus point lies on the X and Z axes, and that the effective range displacements \Delta R_{xP}(t_m) and \Delta R_{zP}(t_m) do not exceed 4 m; then, we have

\frac{2[x_p + \Delta R_{xP}(t_m)]L + L^2}{2R_{AP}(t_m)} \le \frac{2(60+4)\times 1 + 1^2}{2\times 10{,}000} < 0.05\ \text{m}, \quad (6)

\frac{2[z_p + \Delta R_{zP}(t_m)]L + L^2}{2R_{AP}(t_m)} \le \frac{2(60+4)\times 1 + 1^2}{2\times 10{,}000} < 0.05\ \text{m}. \quad (7)

When the radar range resolution is 0.1 to 0.3 m, the distance differences induced by rotation among the three antennas are thus much smaller than the range resolution. Therefore, under this approximation, the translational range migration of all three receivers can be corrected by using the same range alignment compensation function. Here we choose receiver A as the reference channel to correct the translational range migration and the Doppler frequency shift induced by the translation, which are the same for all scatterers on the target and can be eliminated by the standard range alignment method17,18 and the phase gradient autofocus method,19 respectively.

However, when the target size is somewhat larger and the required resolution becomes higher, the migration through resolution cells (MTRC), which is related to the location of each scatterer, can no longer be neglected. The Radon–Fourier transform and generalized Radon–Fourier transform (RFT/GRFT) were proposed in Refs. 20–22 to deal with the coupling between the envelope and the Doppler, and have been shown to be very effective in much literature.20–23 Thus, the RFT can be used to mitigate the MTRCs and correct all scatterers into the right range cells. We will not discuss range alignment in detail in this paper and will only focus on the Doppler frequency shift induced by rotational motion for 3-D InISAR reconstruction.

After the motion compensation, the azimuth echo from the scatterer P can be rewritten as

s_{\Gamma P}(t_m) = \sigma_P \exp[j\Phi_{\Gamma P}(t_m)] = \sigma_P \exp\left[ j2\pi \frac{R_{AP}(t_m) + R_{\Gamma P}(t_m)}{\lambda} \right], \quad (8)

where \sigma_P is the relative amplitude. For simplicity, the exponential term in Eq. (8) can be further expressed as follows; the detailed derivation is given in the Appendix.

s_{\Gamma P}(t_m) = \sigma_P \exp(j\phi_{\Gamma P}) \exp\left[ j\frac{4\pi}{\lambda}\Delta R_{yP}(t_m) \right], \quad (9)

where

\phi_{AP} = \frac{4\pi}{\lambda}R_P, \quad \phi_{BP} = \frac{4\pi}{\lambda}R_P + \frac{2\pi L x_p}{\lambda R_P}, \quad \phi_{CP} = \frac{4\pi}{\lambda}R_P + \frac{2\pi L z_p}{\lambda R_P}. \quad (10)

From Eq. (10), although the terms \phi_{\Gamma P} are independent of the slow time and unnecessary for ISAR imaging, they carry significant position information of the scatterer, which should be preserved in the 3-D InISAR image processing; the details will be thoroughly explained in Sec. 3.2. The second terms in Eq. (9), completely consistent with each other across the three antennas, serve to separate the different scatterers in the same range cell. From Eqs. (9) and (10), we obtain

\Delta\phi_{AB} = \phi_{BP} - \phi_{AP} = \frac{2\pi L x_p}{\lambda R_P}, \quad \Delta\phi_{AC} = \phi_{CP} - \phi_{AP} = \frac{2\pi L z_p}{\lambda R_P}. \quad (11)

Combined with the range information R_n \approx R_P, where R_n is the distance from the radar to the n'th range cell, the position of the scatterer P can be obtained as

x_p = \frac{\lambda R_n \Delta\phi_{AB}}{2\pi L}, \quad y_P = R_n, \quad z_p = \frac{\lambda R_n \Delta\phi_{AC}}{2\pi L}. \quad (12)

Therefore, the interferometric phase information is crucial to successfully reconstruct the 3-D position of target, which is also the research emphasis in this paper.
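As a small worked example of Eqs. (11) and (12), the following sketch forms the interferometric phases from a scatterer's coordinates and inverts them back; the values of `lam`, `L`, `Rn`, `xp`, and `zp` are hypothetical, chosen to match the paper's orders of magnitude:

```python
import numpy as np

# Illustrative geometry (hypothetical values)
lam, L, Rn = 0.03, 1.0, 10_000.0     # wavelength (m), baseline (m), range (m)
xp, zp = 12.0, -7.0                  # true scatterer coordinates (m)

# Interferometric phases, Eq. (11)
dphi_AB = 2*np.pi*L*xp / (lam*Rn)
dphi_AC = 2*np.pi*L*zp / (lam*Rn)

# Position inversion, Eq. (12)
x_est = lam*Rn*dphi_AB / (2*np.pi*L)
y_est = Rn
z_est = lam*Rn*dphi_AC / (2*np.pi*L)

# Unambiguous-size condition discussed later in Eqs. (21)-(22):
# the target extent must stay below lam*Rn/L for the inversion to be valid
max_size = lam*Rn / L
```

The round trip is exact here because the phases are noise-free and unwrapped; the `max_size` bound (300 m for these numbers) marks where phase wrapping would break the inversion.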

Proposed Algorithm for Signal Separation

After the uniform motion compensation, the position of the autofocus center for each ISAR image remains the same relative to the three receiving antennas during the imaging time. As in ISAR imaging of maneuvering targets, the effective rotating velocities \omega_x and \omega_z are time-variant and cause the Doppler frequency to vary, which can be utilized to realize high-resolution ISAR imaging. They can be approximated as

\omega_x(t_m) = \alpha_x + \beta_x t_m, \quad \omega_z(t_m) = \alpha_z + \beta_z t_m. \quad (13)

Thus, the corresponding effective range displacement can be expressed as

\Delta R_{yP}(t_m) = x_P \int_0^{t_m} \omega_x(u)\,du + z_P \int_0^{t_m} \omega_z(u)\,du = x_P\left( \alpha_x t_m + \frac{1}{2}\beta_x t_m^2 \right) + z_P\left( \alpha_z t_m + \frac{1}{2}\beta_z t_m^2 \right). \quad (14)
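The closed form of Eq. (14) can be sanity-checked against direct numerical integration of the rotation rates in Eq. (13); all parameter values below are hypothetical:

```python
import numpy as np

# Hypothetical rotation parameters and scatterer coordinates
ax, bx = 0.08, 0.06                  # alpha_x, beta_x
az, bz = 0.04, -0.06                 # alpha_z, beta_z
xP, zP, tm = 5.0, -2.0, 0.8

# Closed form of Eq. (14)
dRy = xP*(ax*tm + 0.5*bx*tm**2) + zP*(az*tm + 0.5*bz*tm**2)

# Trapezoidal integration of Eq. (13), exact for linear integrands
u = np.linspace(0.0, tm, 1001)
def trap(yv):
    return float(np.sum((yv[1:] + yv[:-1]) * np.diff(u)) / 2)
dRy_num = xP*trap(ax + bx*u) + zP*trap(az + bz*u)
```

Both routes agree to machine precision, confirming that the quadratic slow-time phase in the next equation follows directly from the linear rotation-rate model.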

Then the echo signals received by receivers A, B, and C in the n'th range cell become, respectively,

s_{\Gamma}(t_m) = \sum_i \sigma_i \exp(j\phi_{\Gamma i}) \exp\left[ j2\pi\left( f_i t_m + \frac{1}{2}\mu_i t_m^2 \right) \right], \quad (15)
where f_i = 2(x_i\alpha_x + z_i\alpha_z)/\lambda and \mu_i = 2(x_i\beta_x + z_i\beta_z)/\lambda denote the centroid frequency (CF) and the CR of the i'th scatterer, respectively. It can be found from Eq. (15) that the echo signals received by the three antennas have the LFM signal format. Therefore, they can be processed by the same algorithms as LFM signals, such as the WVD,24 which is extensively applied in ISAR imaging. However, when the WVD is applied directly to the echo signal of a single antenna, the interferometric phases are completely lost. As a result, the joint cross MWVD is introduced to separate the scatterers in the same range cell owing to its good phase preservation and searching-free procedure, where the term "joint cross" refers to the joint cross-correlation of the data acquired from two different antennas among the three. The key to the so-called joint cross MWVD in this paper is the definition of the symmetric instantaneous cross-correlation function (SICCF) between two different receivers, which differs essentially from the MWVD, whose instantaneous autocorrelation involves only a single receiver. The bilinear transform of the signal with itself, as in Ref. 24, loses the time-invariant interferometric phase information, which in turn causes the image interferometry to fail. Given the symmetric relation of the antennas B and C, here we take only the interferometric antenna pair AB as an example to explain the above analysis. The SICCF of the receiver pair AB can be defined by
R_{AB}(t_m, \tau) = s_A\left( t_m + \frac{\tau}{2} \right) s_B^*\left( t_m - \frac{\tau}{2} \right) = \sum_i \sigma_i^2 \exp(j\Delta\phi_{AB}) \exp[j2\pi(f_i + \mu_i t_m)\tau] + R_{AB,\text{cross}}, \quad (16)
where s_A(t_m) and s_B(t_m) denote the echo signals received by receivers A and B in the n'th range cell, respectively, \tau is the lag variable, and R_{AB,\text{cross}} denotes the cross-terms. By performing the normal WVD transform (a Fourier transform over the lag variable) on Eq. (16), we have
W_{AB}(t_m, f_\tau) = \int s_A\left( t_m + \frac{\tau}{2} \right) s_B^*\left( t_m - \frac{\tau}{2} \right) \exp(-j2\pi f_\tau \tau)\,d\tau = \sum_i \sigma_i^2 \exp(j\Delta\phi_{AB})\,\delta[f_\tau - (f_i + \mu_i t_m)] + W_{AB,\text{cross}}. \quad (17)

In Eq. (17), the slow time variable t_m and the lag variable \tau are linearly coupled with each other; thus, the joint cross WVD of the receiver pair AB peaks along the straight line f_\tau = f_i + \mu_i t_m, whose intercept and slope are related to the CF f_i and the CR \mu_i of the i'th scatterer. Borrowing the idea of the classical scale transform (ST), we propose the joint cross MWVD, which can be denoted as

G_{AB}(f_\tau, f_{\tau t_m}) = \iint R_{AB}(t_m, \tau) \exp[-j2\pi f_{\tau t_m}(\tau t_m)] \exp(-j2\pi f_\tau \tau)\,d(\tau t_m)\,d\tau = \sum_i \sigma_i^2 \exp(j\Delta\phi_{AB})\,\delta(f_\tau - f_i)\,\delta(f_{\tau t_m} - \mu_i) + G_{AB,\text{cross}}, \quad (18)
where GAB,cross are the cross-terms corresponding to the MWVD, which will be discussed later.

For Eq. (18), the linear coupling between the slow time variable t_m and the lag variable \tau is removed, and the signal energy is completely accumulated by FFT operations alone, without searching over any parameters. Also, it is clearly seen that the CF and the CR are closely related to the scatterer's coordinates; different scatterers with different coordinates will be discriminated from each other in the centroid frequency and chirp rate domain (CFCRD). After the joint cross MWVD, each scatterer can be easily separated as a peak point in the CFCRD.
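To make the chain of Eqs. (16)–(18) concrete, the sketch below implements a single-scatterer case in Python. All signal parameters are hypothetical, and for clarity the continuous-time SICCF is evaluated directly on the scaled (v, τ) grid with v = τ·t_m, standing in for the discrete interpolation an ST implementation would use. Note the sign convention: with s_A s_B*, the recovered phase is φ_A − φ_B.

```python
import numpy as np

# Hypothetical single-scatterer parameters
sigma, f, mu = 1.0, 20.0, 10.0          # amplitude, CF (Hz), CR (Hz/s)
phiA, phiB = 0.5, -0.3                  # slow-time-independent phases at A and B

# LFM azimuth echoes at the two receivers, Eq. (15) with one component
sA = lambda t: sigma*np.exp(1j*phiA)*np.exp(1j*2*np.pi*(f*t + 0.5*mu*t**2))
sB = lambda t: sigma*np.exp(1j*phiB)*np.exp(1j*2*np.pi*(f*t + 0.5*mu*t**2))

# Grids for the lag tau and the scaled variable v = tau*t_m (tau = 0 avoided)
K, M = 256, 256
tau = (np.arange(K) + 1) / 256.0
v = np.arange(M) / 64.0
T, V = np.meshgrid(tau, v)

# SICCF, Eq. (16), evaluated at t_m = v/tau; its phase is
# (phiA - phiB) + 2*pi*(f*tau + mu*v), separable in tau and v
R = sA(V/T + T/2) * np.conj(sB(V/T - T/2))

# 2-D FFT over v and tau -> single peak at (mu, f) in the CFCRD, Eq. (18)
G = np.fft.fft2(R)
m_star, k_star = np.unravel_index(np.argmax(np.abs(G)), G.shape)
mu_hat = np.fft.fftfreq(M, d=1/64.0)[m_star]   # chirp-rate axis
f_hat = np.fft.fftfreq(K, d=1/256.0)[k_star]   # centroid-frequency axis

# Matched-filter evaluation at the detected peak recovers the preserved
# interferometric phase without the FFT grid-origin phase offset
dphi = np.angle(np.sum(R * np.exp(-1j*2*np.pi*(f_hat*T + mu_hat*V))))
```

Running this recovers f_hat = 20 Hz, mu_hat = 10 Hz/s, and dphi = phiA − phiB = 0.8 rad: the scatterer collapses to a single peak in the CFCRD while the interferometric phase survives.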

Proposed Algorithm for Information Extraction

Without loss of generality, the joint cross MWVD of the scatterer P from the antenna pair AB can be denoted as

G_{AB}(f_\tau, f_{\tau t_m}) = \text{FFT}_\tau\left( \text{FFT}_{t_m'}\left\{ \text{ST}_{\tau t_m}[R_{AB}(t_m, \tau)] \right\} \right) = \sigma_p^2 \exp(j\Delta\phi_{AB})\,\delta(f_\tau - f_p)\,\delta(f_{\tau t_m} - \mu_p), \quad (19)
where t_m' denotes the new time variable after the ST. G_{AB}(f_\tau, f_{\tau t_m}) has a sole peak at the point (f_p, \mu_p) and can be modeled as an ideal point spread function. Also, attention should be paid to the fact that the interferometric phase information \Delta\phi_{AB} of each scatterer is well preserved in Eq. (19); the phase differences can then be computed as follows:
\Delta\phi_{AB} = \angle[G_{AB}(f_p, \mu_p)], \quad \Delta\phi_{AC} = \angle[G_{AC}(f_p, \mu_p)]. \quad (20)

We can use Eq. (12) to solve the 3-D position of all scatterers in the n’th range cell. Hence, the 3-D positions of all scatterers on the target will be easily obtained by using the same process in all range cells.

In addition, it is also worth mentioning that the solution to (x_p, z_p) in Eq. (12) is correct only when the phase differences \Delta\phi_{AB} and \Delta\phi_{AC} do not exceed 2\pi. Hence,

|\Delta\phi_{AB}| = \left| \frac{2\pi L x_p}{\lambda R_P} \right| \le 2\pi, \quad |\Delta\phi_{AC}| = \left| \frac{2\pi L z_p}{\lambda R_P} \right| \le 2\pi \quad (21)

must be satisfied. Note that the admissible (x_p, z_p) depends on the aircraft size. In other words, as long as the aircraft size does not exceed \lambda R_n / L, the solution to (x_p, z_p) is correct. Similar to the aforementioned consideration, the target size does not exceed 60 m, and the distance from the radar to the target and the baseline length are no less than 10,000 m and 1 m, respectively. Thus,

|x_p|, |z_p| \le \frac{\lambda R_n}{L} = \frac{0.03 \times 10{,}000}{1} = 300\ \text{m} \quad (22)
can always be satisfied.

Rotation Parameter Retrieval

The rotation parameter estimation is an essential task for ISAR and has drawn much attention.25,26 According to Eqs. (13)–(15), when the coordinates of the scatterer are fixed, the CF f_p and the CR \mu_p of the scatterer P mainly depend on the angular velocities \alpha_x, \alpha_z and the angular accelerations \beta_x, \beta_z along the X and Z axes, respectively. That is,

f_p = 2(x_P\alpha_x + z_P\alpha_z)/\lambda = 2\alpha_e(x_P\cos\theta + z_P\sin\theta)/\lambda, \quad (23)

\mu_p = 2(x_P\beta_x + z_P\beta_z)/\lambda = 2\beta_e(x_P\cos\theta + z_P\sin\theta)/\lambda, \quad (24)
where xP and zP denote the coordinates of the scatterer P, and αe and βe are the effective initial rotating velocity (IRV) and effective rotating acceleration (RA), respectively.

Accordingly, the parameters \alpha_e and \beta_e can be estimated with the estimated parameters x_P and z_P. We can rewrite Eqs. (23) and (24) by considering only the contribution of the i'th scatterer:

C_i = aX_i + bZ_i, \quad (25)

D_i = cX_i + dZ_i, \quad (26)
where C_i = \lambda f_i/2, D_i = \lambda\mu_i/2, X_i = x_i, Z_i = z_i, a = \alpha_e\cos\theta, b = \alpha_e\sin\theta, c = \beta_e\cos\theta, and d = \beta_e\sin\theta. C_i and D_i, respectively, represent the CF and the CR of the i'th scatterer, which have been extracted in Eq. (19). Also, the coordinates X_i, Z_i of the i'th scatterer can be calculated from Eq. (12). Then, the estimation of the parameters \alpha_e and \beta_e can be accomplished by estimating a, b, c, and d.

The situation can be dealt with mathematically by minimizing the functions

\Psi(a,b) = \sum_{i=1}^{N_P} [C_i - (aX_i + bZ_i)]^2, \quad \Upsilon(c,d) = \sum_{i=1}^{N_P} [D_i - (cX_i + dZ_i)]^2, \quad (27)
where NP is the number of extracted scatterers.

Consequently, the effective IRV \alpha_e and the effective RA \beta_e can be obtained from the estimates \hat{a}, \hat{b}, \hat{c}, and \hat{d}:

\hat{\alpha}_e = \sqrt{\hat{a}^2 + \hat{b}^2}, \quad \hat{\beta}_e = \sqrt{\hat{c}^2 + \hat{d}^2}. \quad (28)
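Minimizing Eq. (27) is an ordinary least-squares problem in (a, b) and (c, d). A minimal sketch, with hypothetical scatterer coordinates and rotation parameters:

```python
import numpy as np

# Hypothetical rotation parameters (illustrative only)
theta = np.pi/6
alpha_e, beta_e = 0.09, 0.085
a, b = alpha_e*np.cos(theta), alpha_e*np.sin(theta)
c, d = beta_e*np.cos(theta), beta_e*np.sin(theta)

# Coordinates of extracted scatterers (hypothetical; from Eq. (12) in practice)
X = np.array([1.0, -3.0, 5.0, 2.0])
Z = np.array([2.0, 4.0, -1.0, 0.5])

# Noise-free CF/CR "measurements" per Eqs. (25)-(26)
C = a*X + b*Z
D = c*X + d*Z

# Least-squares fit of Eq. (27)
A = np.column_stack([X, Z])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, C, rcond=None)
(c_hat, d_hat), *_ = np.linalg.lstsq(A, D, rcond=None)

# Effective IRV and RA, Eq. (28)
alpha_hat = np.hypot(a_hat, b_hat)
beta_hat = np.hypot(c_hat, d_hat)
```

With noisy measurements the same `lstsq` calls return the minimizers of Eq. (27); here the fit is exact because the data are consistent.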

Analysis of the Cross-Terms

In order to obtain high-resolution imaging, the echo signals have to be modeled as multicomponent LFM signals in each range cell. Moreover, due to the nonlinear characteristic of the SICCF, cross-terms are inevitable and may affect the detection of the self-terms. We therefore need to analyze the performance of the joint cross MWVD for multicomponent LFM signals. Here, assume that there are two scatterers; the extension to more components follows the same reasoning:

s_A(t_m) = s_{AP}(t_m) + s_{AQ}(t_m) = \sigma_p \exp(j\phi_{Ap}) \exp\left[ j2\pi\left( f_p t_m + \frac{1}{2}\mu_p t_m^2 \right) \right] + \sigma_q \exp(j\phi_{Aq}) \exp\left[ j2\pi\left( f_q t_m + \frac{1}{2}\mu_q t_m^2 \right) \right], \quad (29)

s_B(t_m) = s_{BP}(t_m) + s_{BQ}(t_m) = \sigma_p \exp(j\phi_{Bp}) \exp\left[ j2\pi\left( f_p t_m + \frac{1}{2}\mu_p t_m^2 \right) \right] + \sigma_q \exp(j\phi_{Bq}) \exp\left[ j2\pi\left( f_q t_m + \frac{1}{2}\mu_q t_m^2 \right) \right]. \quad (30)

Substituting Eqs. (29) and (30) into the SICCF of Eq. (16), we obtain

R_{AB}(t_m, \tau) = R_{AB}^{p,\text{self}}(t_m, \tau) + R_{AB}^{q,\text{self}}(t_m, \tau) + R_{AB}^{pq,\text{cross}}(t_m, \tau) + R_{AB}^{qp,\text{cross}}(t_m, \tau), \quad (31)

where

R_{AB}^{pq,\text{cross}}(t_m, \tau) = \sigma_p\sigma_q \exp[j(\phi_{Bq} - \phi_{Ap})] \exp\left\{ j2\pi\left[ (f_p - f_q)t_m + (f_p + f_q)\frac{\tau}{2} + \frac{1}{2}(\mu_p - \mu_q)t_m^2 + \frac{1}{2}(\mu_p + \mu_q)\tau t_m + \frac{1}{2}(\mu_p - \mu_q)\frac{\tau^2}{4} \right] \right\}, \quad (32)
and the cross-term R_{AB}^{qp,\text{cross}}(t_m, \tau) is essentially the same as R_{AB}^{pq,\text{cross}}(t_m, \tau); thus, we take only R_{AB}^{pq,\text{cross}}(t_m, \tau) as an example for the analysis. Then, after the ST, we have
\text{ST}_{AB}^{pq,\text{cross}}(t_m', \tau) = \sigma_p\sigma_q \exp[j(\phi_{Bq} - \phi_{Ap})] \exp\left\{ j2\pi\left[ (f_p - f_q)\frac{t_m'}{\tau} + (f_p + f_q)\frac{\tau}{2} + \frac{1}{2}(\mu_p - \mu_q)\left(\frac{t_m'}{\tau}\right)^2 + \frac{1}{2}(\mu_p + \mu_q)t_m' + \frac{1}{2}(\mu_p - \mu_q)\frac{\tau^2}{4} \right] \right\}. \quad (33)

It can be found from Eq. (33) that the ST corrects only the linear CR migration of the self-terms, not that of the cross-terms. Thus, the energy of the self-terms is well accumulated after the MWVD, while the energy of the cross-terms is typically dispersed over the whole distribution. Here, we conduct a simulation to verify that the proposed algorithm can handle multicomponent signals. Consider three components denoted by AU1, AU2, and AU3, respectively. The sample frequency is 256 Hz, and the effective signal length is 512. The CF and CR of AU1, AU2, and AU3 are as follows: f1=20  Hz, μ1=20  Hz/s, f2=20  Hz, μ2=20  Hz/s, and f3=20  Hz, μ3=20  Hz/s.

As illustrated in Fig. 2, the coupling of the self-terms is removed by the ST, but this does not work for the cross-terms. Therefore, the energy of the self-terms is well accumulated in the CFCRD, and the proposed method is suitable for this complicated situation. In the above simulation, the amplitudes of the three LFM signals are the same. However, when the amplitudes differ in a real-world application, the modified CLEAN technique has to be performed to separate strong and weak LFM signals without loss of the significant interferometric phase information.
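The cross-term behavior can be reproduced with a small two-component simulation (all parameters hypothetical; for clarity the continuous-time SICCF is evaluated directly on the scaled (v, τ) grid with v = τ·t_m). The self-terms land on single bins while the cross-terms, whose coupling the ST cannot remove, spread across the plane:

```python
import numpy as np

K, M = 256, 256
tau = (np.arange(K) + 1) / 256.0         # lag grid, tau = 0 avoided
v = np.arange(M) / 64.0                  # scaled variable v = tau*t_m
T, V = np.meshgrid(tau, v)

def lfm(t, f, mu):                       # unit-amplitude LFM component
    return np.exp(1j*2*np.pi*(f*t + 0.5*mu*t**2))

# Two hypothetical scatterers; CF/CR chosen to fall exactly on FFT bins
fp, mup, fq, muq = 20.0, 10.0, -20.0, -10.0
s = lambda t: lfm(t, fp, mup) + lfm(t, fq, muq)

# SICCF of Eq. (31): two self-terms plus two cross-terms
R = s(V/T + T/2) * np.conj(s(V/T - T/2))
G = np.abs(np.fft.fft2(R))

# Bins where the self-terms concentrate (mu step 0.25 Hz/s, f step 1 Hz)
bins = [(int(mup/0.25) % M, int(fp) % K), (int(muq/0.25) % M, int(fq) % K)]
self_peaks = [G[m, k] for m, k in bins]

# Everything else carries only dispersed cross-term energy
rest = G.copy()
for m, k in bins:
    rest[m, k] = 0.0
```

Both self-term bins rise far above the residual cross-term floor, consistent with the statement that the ST corrects the CR migration of the self-terms only.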

Fig. 2: Simulation results: (a) contour of WVD, (b) results after ST, (c) contour of MWVD, and (d) stereogram of MWVD.

Analysis of the Computational Cost

Due to the existence of the CR in Eq. (15), the echo signals received by the different receivers cannot be directly integrated to form the image. Therefore, algorithms have been proposed to estimate the parameters needed to reconstruct the 3-D position of a moving target, such as the Radon transform,15 the chirplet decomposition algorithm,12 and Lv's distribution.27 However, searching procedures are necessary for the Radon transform and the adaptive chirplet algorithm, which reduces the computational efficiency. Although Lv's distribution can effectively avoid searching procedures, this advantage comes at the cost of redundant information, which is to some extent difficult to obtain in real ISAR applications due to the target's maneuverability.28

From the above analysis, we can see that the computation is mainly determined by the signal separation stage. The implementation comprises the defined SICCF of each receiver pair [O(N_{t_m}^2)], the ST based on the chirp-z transform [O(3N_{t_m}^2 \log_2 N_{t_m})], and the Fourier transform with respect to the lag variable [O(N_{t_m}^2 \log_2 N_{t_m})]. In contrast, algorithms like the Radon transform and the chirplet algorithm are searching techniques that find the best-matched signal in the parameter domain. Because of their high computational load [O(M N_{t_m} \log_2 N_{t_m})], where M denotes the number of searching points, which is normally greater than the number of echoes N_{t_m}, these existing algorithms are less suitable for real-time high-resolution ISAR imaging.28 Moreover, when the estimated parameters span a larger scope, the searching steps and initial parameter sets are more difficult to control and cannot achieve a balance.

As is well known, the RFT/GRFT (Refs. 20–23) have been developed for motion estimation of maneuvering targets with arbitrary parameterized motion, and much research has verified the effectiveness of the RFT and GRFT. What is more, the GRFT has been successfully extended to the space-time RFT (Ref. 29) for wideband digital array radar and to the 3-D space RFT (3-D SRFT)30 to realize the 3-D reconstruction of a moving target. However, research on fast implementations of the GRFT should be further conducted because of the prohibitive computational burden induced by the multidimensional searching. Fortunately, the particle swarm optimizer28,31 has been widely employed in solving the multiparameter searching problems mentioned above.

Nonetheless, compared to the method proposed in this paper, the 3-D SRFT in Ref. 30, where the acceleration is omitted, aims only at slowly maneuvering targets via searching for the minimum 3-D image entropy over six-dimensional motion parameters. Also, the searching procedures become more complicated as the number of motion parameters increases in many scenarios, such as targets with complex motion whose rotational motion may cause quadratic phase terms.

For 3-D InISAR imaging for maneuvering targets, the echo signals in a range cell at all receivers can be characterized as the same multicomponent LFM signals after uniform range alignment and phase adjustment. By using the MWVD algorithm without a searching procedure, the scatterer separation and phase extraction are simultaneously accomplished, and then 3-D images are achieved. Consequently, the 3-D InISAR imaging algorithm based on MWVD is illustrated in detail in the following, and the corresponding flowchart is shown in Fig. 3.

  • Complete the range compression of the echo signals received by the three antennas.
  • Choose antenna A as the reference channel to accomplish the uniform motion compensation with the existing methods in Refs. 17–19; the scatterers on the target then share a uniform position and autofocus center across the three antennas.
  • Utilize the function exp(j\pi L^2/(\lambda R_P)) to compensate the phase difference due to L.
  • For each range cell, separate each scatterer in CFCRD after joint cross MWVD between two antenna pairs AB and AC.
  • Apply the dechirping method to estimate the amplitude and subtract the estimated LFM from the original signal without loss of significant interferometric phase information. Meanwhile, extract the interferometric phase of the scatterer and regain the corresponding coordinate by using Eq. (12).
  • Repeat steps 4 and 5 until the residual energy of the signal is smaller than threshold T.
  • Repeat the aforementioned steps (4 to 6) until all the range cells have been finished.
  • Combine the range information along the Y axis and output 3-D images.

Fig. 3: Flowchart of the proposed InISAR imaging algorithm.

In realistic applications, the scatterers on the target may include some disturbance sources, and it is also possible that some scatterers may be sheltered by the body of the target. In this case, readers can refer to Ref. 32 for a detailed solution. However, because the target size is much larger than the radar wavelength, the assumption that the scatterers on a real target can be regarded as separated point-like ones is normally valid,8,9,12–16 and the obtained image may be depicted by the locations of strong scatterers. Therefore, the simulations in this paper are always under the condition that the scatterers on the target are point-like.

Example A

In this section, a simple turntable target shown in Fig. 4(a) is modeled as seven scatterers. The parameters used are set as follows: target distance R=10  km, baseline length L=1  m, effective velocity αx=0.08  rad/s, αz=0.04  rad/s, and effective acceleration βx=0.06  rad/s2, βz=0.06  rad/s2. The pulse repetition frequency is 256 Hz, and the number of effective pulses is 512.

Fig. 4: Simulation results: (a) ideal scatterer model, (b) the WVD of antenna pair AB, (c) the MWVD of antenna pair AB, and (d) 3-D reconstructed scatterers.

Figure 4(a) shows the ideal target model including seven ideal scatterer points. As is clearly seen in Fig. 4(b), in the joint cross WVD of the antenna pair AB in a certain range cell (the 225th range cell), though the three scatterers are presented as different lines, the signal energy is not well accumulated, and cross-terms do exist and should be considered in extreme cases. Based on this consideration, we introduce the joint cross MWVD to accomplish energy accumulation without loss of the interferometric phase information. In Fig. 4(c), each scatterer is distinctly separated in the CFCRD, and the coordinates of each scatterer can be obtained easily from the interferometric phase information at its peak. Consequently, the 3-D images of a target are achieved based on the proposed joint cross MWVD algorithm. Figure 4(d) shows the reconstructed 3-D InISAR image of the ideal model, where the real scatterer points are presented for comparison.

To quantitatively evaluate the performance of the proposed algorithm, the relative mean square error (MSE) of the 3-D reconstructed coordinates is computed, where the MSE is defined by33

\text{MSE} = \frac{\|R_{\text{est}} - R\|^2}{\|R\|^2}, \quad (34)
where R and Rest represent the original data and the reconstructed data obtained by the proposed algorithm in this paper. The MSEs of the 3-D reconstructed coordinates obtained by using the proposed method are shown in Table 1.

Table 1: Reconstruction performance of example A.
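The metric of Eq. (34) is straightforward to implement; a minimal helper (the name `relative_mse` is ours, not from the paper):

```python
import numpy as np

def relative_mse(R_est, R):
    """Relative MSE of Eq. (34): ||R_est - R||^2 / ||R||^2."""
    R_est = np.asarray(R_est, dtype=float)
    R = np.asarray(R, dtype=float)
    return float(np.sum((R_est - R)**2) / np.sum(R**2))
```

For example, reconstructed coordinates [1.1, 2.0] against truth [1.0, 2.0] give 0.01/5 = 0.002.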
Example B

In this simulation, we perform the proposed 3-D InISAR algorithm on a synthetic airplane model, which is a rigid object composed of 137 ideal scatterers. The corresponding parameters are shown in Table 2, and the target in the simulation moves along a straight line with respect to the radar LOS. Here, we assume that the uniform motion compensation has been completed and all scatterers have the same autofocus center for the three antennas.

Table 2: Simulation parameters.

The models are shown in Fig. 5, and the results of the 3-D InISAR imaging from three visual angles are given in Fig. 6, where the reconstructed results correspond to the projections on the XY, XZ, and YZ planes, respectively. As is clearly seen from those figures, though not all of the scatterers are reconstructed correctly, the proposed algorithm can perform high-quality 3-D InISAR imaging of the maneuvering target. The position errors of the scatterers shown in Fig. 6 result from the above approximations, noise, and cross-terms in the correlation algorithm. Owing to the interference of noise and cross-terms, spurious scatterers are reconstructed, which leads to a mismatch between the number of ideal scatterers and the number of reconstructed scatterers. In addition, in real ISAR imaging applications, because the number of scatterers on the target is usually unknown, the MSE of the 3-D reconstructed coordinates cannot be obtained. Fortunately, the reconstruction accuracy is closely related to the parameter estimation precision.

Fig. 6: Target's 3-D imaging result using the proposed algorithm.

Therefore, similar to the quantitative evaluation of the reconstruction performance of the 3-D target in Sec. 6.1, in order to characterize the parameter estimation precision of the proposed algorithm quantitatively, the MSEs of the IRV and RA are calculated by using Eq. (34). Here, the input signal-to-noise ratio is 20 dB, and the experiment is repeated for 50 trials. From Table 3, it can be found that the MSEs of the parameters estimated with the proposed algorithm are relatively small and within the acceptable range for a real application scenario,33 which indirectly demonstrates the effectiveness of the reconstruction via the joint cross MWVD algorithm. On the other hand, to improve the 3-D image quality in the future, more attention should be paid to acquiring higher antinoise performance and suppressing the cross-terms effectively.

Table 3: Reconstruction performance of example B.

This paper has presented a 3-D InISAR imaging algorithm for maneuvering targets based on the joint cross MWVD. The characteristics of this 3-D InISAR imaging algorithm include the following: (1) it is a nonsearching method in both cross-range resolution and interferometric phase extraction; (2) it can deal with multicomponent signals owing to its good performance in suppressing cross-terms via coherent integration; and (3) it can accurately retrieve the rotation parameters, which is essential for target recognition and target identification in ISAR imaging applications.

Derivation of Signal Phase

This appendix presents the simplification of the phase in Eq. (9). Rearranging Eqs. (3)–(5), we have

$$R_{AP}(t_m)\approx R_P+\frac{2y_p\Delta R_{yP}(t_m)}{2R_P}+\frac{2y_p\Delta R(t_m)+\left[\Delta R(t_m)\right]^2}{2R_P}+\frac{2x_p\Delta R_{xP}(t_m)+2z_p\Delta R_{zP}(t_m)+\left[\Delta R_{xP}(t_m)\right]^2+\left[\Delta R_{yP}(t_m)\right]^2+\left[\Delta R_{zP}(t_m)\right]^2+2\Delta R_{yP}(t_m)\Delta R(t_m)}{2R_P},\tag{35}$$
where $R_P=\sqrt{x_p^2+y_p^2+z_p^2}$ is the distance from the radar to scatterer P. It is worth noting that this paper assumes far-field conditions, that is, the distances from the scatterer to the three antennas are the same and much larger than the target size, so the approximations $y_p\approx R_P$ and $R_P\gg x_p,z_p$ hold. According to Eq. (35), the phase of the echo signal from scattering center P on the target takes the form

$$\Phi_{AP}(t_m)\approx\frac{4\pi}{\lambda}\left[R_P+\Delta R_{yP}(t_m)\right]+\frac{2\pi}{\lambda}\left\{2\Delta R(t_m)+\frac{\left[\Delta R(t_m)\right]^2}{R_O}\right\}+\frac{2\pi}{\lambda}\left\{\frac{2x_p\Delta R_{xP}(t_m)+2z_p\Delta R_{zP}(t_m)+\left[\Delta R_{xP}(t_m)\right]^2+\left[\Delta R_{yP}(t_m)\right]^2+\left[\Delta R_{zP}(t_m)\right]^2+2\Delta R_{yP}(t_m)\Delta R(t_m)}{R_P}\right\}.\tag{36}$$

Evidently, the second term in Eq. (36) is independent of the scatterer and corresponds to the translational motion; it should be estimated and removed from all scatterers on the target during motion compensation (autofocus). Moreover, because the rotation angle is small, the third term in Eq. (36) can be neglected, as demonstrated in Ref. 12.
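The far-field approximation underlying Eq. (35) can be checked numerically. The sketch below (all geometry values hypothetical) compares the exact radar-to-scatterer distance with the second-order expansion; for a target a few meters across at a range of about 10 km, the discrepancy is far below a range cell.

```python
import numpy as np

# Hypothetical far-field geometry: range ~10 km, target extent a few meters
xp, yp, zp = 3.0, 1.0e4, 2.0
R_P = np.sqrt(xp**2 + yp**2 + zp**2)

# Small rotational displacements per axis and translational displacement
# over the coherent time (magnitudes illustrative)
dRx, dRy, dRz, dR = 0.4, 0.6, 0.3, 0.5

# Exact radar-to-scatterer distance after the displacements
exact = np.sqrt((xp + dRx)**2 + (yp + dRy + dR)**2 + (zp + dRz)**2)

# Second-order expansion in the form of Eq. (35)
approx = (R_P
          + (2*yp*dRy) / (2*R_P)
          + (2*yp*dR + dR**2) / (2*R_P)
          + (2*xp*dRx + 2*zp*dRz
             + dRx**2 + dRy**2 + dRz**2 + 2*dRy*dR) / (2*R_P))

print(f"approximation error = {abs(exact - approx):.2e} m")  # well below 1 mm
```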

After the autofocus and the aforementioned approximation, the phase in Eq. (9) can be rewritten as

$$\Phi_{AP}(t_m)=\frac{4\pi}{\lambda}R_P+\frac{4\pi}{\lambda}\Delta R_{yP}(t_m),\tag{37}$$

$$\Phi_{BP}(t_m)=\Phi_{AP}(t_m)+\frac{2\pi}{\lambda}\frac{Lx_p}{R_P}+\frac{2\pi}{\lambda}\frac{L\Delta R_{xP}(t_m)}{R_P}+\frac{\pi}{\lambda}\frac{L^2}{R_P},\tag{38}$$

$$\Phi_{CP}(t_m)=\Phi_{AP}(t_m)+\frac{2\pi}{\lambda}\frac{Lz_p}{R_P}+\frac{2\pi}{\lambda}\frac{L\Delta R_{zP}(t_m)}{R_P}+\frac{\pi}{\lambda}\frac{L^2}{R_P}.\tag{39}$$

Similarly, because the baselines are short compared with the radar-target distance, the third terms in Eqs. (38) and (39) can also be neglected. Since the baseline length L is known, the fourth terms can be compensated easily. The phases then become

$$\Phi_{AP}(t_m)=\frac{4\pi}{\lambda}R_P+\frac{4\pi}{\lambda}\Delta R_{yP}(t_m),\tag{40}$$

$$\Phi_{BP}(t_m)=\frac{4\pi}{\lambda}R_P+\frac{2\pi}{\lambda}\frac{Lx_p}{R_P}+\frac{4\pi}{\lambda}\Delta R_{yP}(t_m),\tag{41}$$

$$\Phi_{CP}(t_m)=\frac{4\pi}{\lambda}R_P+\frac{2\pi}{\lambda}\frac{Lz_p}{R_P}+\frac{4\pi}{\lambda}\Delta R_{yP}(t_m).\tag{42}$$
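Equations (40)–(42) imply that, after the common terms cancel, the interferometric phase differences between antenna pairs AB and AC are proportional to $x_p$ and $z_p$, respectively, so the coordinates follow by inverting a linear phase-coordinate relation. A minimal sketch of this inversion (wavelength, baseline, distance, and coordinates are all hypothetical, and the phases are assumed to be unwrapped and unambiguous):

```python
import numpy as np

# Hypothetical system and target values
lam = 0.03                     # wavelength (m)
L = 1.0                        # baseline length (m)
R_P = 1.0e4                    # radar-to-scatterer distance (m)
xp_true, zp_true = 2.5, -1.8   # cross-range and height coordinates (m)

# Interferometric phase differences implied by Eqs. (40)-(42)
dphi_AB = (2*np.pi/lam) * L * xp_true / R_P
dphi_AC = (2*np.pi/lam) * L * zp_true / R_P

# Invert the linear phase-coordinate relation to recover the coordinates
xp_hat = lam * R_P * dphi_AB / (2*np.pi*L)
zp_hat = lam * R_P * dphi_AC / (2*np.pi*L)
print(xp_hat, zp_hat)   # recovers 2.5 and -1.8
```

In practice the phases are extracted from the peaks of the joint cross MWVD rather than computed analytically, and phase wrapping limits the unambiguous coordinate interval to $|x_p|<\lambda R_P/(2L)$.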

This work was supported by the National Natural Science Foundation of China under Grant Nos. 61271024 and 61201292. Qian Lv conceived the work that led to this submission, designed the experiments, and wrote the manuscript. Jibin Zheng was responsible for interpreting the results and drafting the manuscript. Jiancheng Zhang played an important role in revising the manuscript and providing English language support. Tao Su was mainly responsible for data analysis and approval of the final version.

References

1. F. Berizzi et al., “High-resolution ISAR imaging of maneuvering targets by means of the range instantaneous Doppler technique: modeling and performance analysis,” IEEE Trans. Image Process. 10(12), 1880–1890 (2001).
2. Y. Wang and B. Zhao, “Inverse synthetic aperture radar imaging of nonuniformly rotating target based on the parameters estimation of multicomponent quadratic frequency-modulated signals,” IEEE Sensors J. 15(7), 4053–4061 (2015).
3. Y. Li et al., “Inverse synthetic aperture radar imaging of targets with nonsevere maneuverability based on the centroid frequency chirp rate distribution,” J. Appl. Remote Sens. 9, 095065 (2015).
4. C. M. Ye et al., “Key parameter estimation for radar rotating object imaging with multi-aspect observations,” Sci. China Inf. Sci. 53(8), 1641–1652 (2010).
5. J. Zheng et al., “ISAR imaging of targets with complex motions based on the keystone time-chirp rate distribution,” IEEE Geosci. Remote Sens. Lett. 11(7), 1275–1279 (2014).
6. Y. Wang, “Inverse synthetic aperture radar imaging of manoeuvring target based on range-instantaneous-Doppler and range-instantaneous-chirp-rate algorithms,” IET Radar Sonar Navig. 6(9), 921–928 (2012).
7. Z. Li, S. Papson, and R. M. Narayanan, “Data-level fusion of multilook inverse synthetic aperture radar images,” IEEE Trans. Geosci. Remote Sens. 46(5), 1394–1406 (2008).
8. Y. Liu et al., “Achieving high-quality three-dimensional InISAR imageries of maneuvering target via super-resolution ISAR imaging by exploiting sparseness,” IEEE Geosci. Remote Sens. Lett. 11(4), 828–832 (2014).
9. M. Martorella et al., “3D interferometric ISAR imaging of noncooperative targets,” IEEE Trans. Aerosp. Electron. Syst. 50(4), 3102–3114 (2014).
10. C. Z. Ma et al., “Three-dimensional ISAR imaging based on antenna array,” IEEE Trans. Geosci. Remote Sens. 46(2), 504–515 (2008).
11. C. Z. Ma et al., “Three-dimensional ISAR imaging using a two-dimensional sparse antenna array,” IEEE Geosci. Remote Sens. Lett. 5(3), 378–382 (2008).
12. G. Wang, X. Xia, and V. C. Chen, “Three-dimensional ISAR imaging of maneuvering targets using three receivers,” IEEE Trans. Image Process. 10(3), 436–447 (2001).
13. X. Xu and R. M. Narayanan, “Three-dimensional interferometric ISAR imaging for target scattering diagnosis and modeling,” IEEE Trans. Image Process. 10(7), 1094–1102 (2001).
14. Q. Zhang, T. S. Yeo, and G. Du, “Estimation of three-dimensional motion parameters in interferometric ISAR imaging,” IEEE Trans. Geosci. Remote Sens. 42(2), 292–300 (2004).
15. D. Zhang et al., “A new interferometric ISAR image processing method for 3-D image reconstruction,” in Proc. of IEEE Conf. on Synthetic Aperture Radar, pp. 555–558, IEEE (2007).
16. Y. Liu et al., “High-quality 3-D InISAR imaging of maneuvering target based on a combined processing algorithm,” IEEE Geosci. Remote Sens. Lett. 10(5), 1036–1040 (2013).
17. J. Wang and D. Kasilingam, “Global range alignment for ISAR,” IEEE Trans. Aerosp. Electron. Syst. 39(1), 351–357 (2003).
18. M. Xing, R. Wu, and Z. Bao, “High resolution ISAR imaging of high speed moving targets,” IEE Proc. Radar Sonar Navig. 152(2), 58–67 (2005).
19. X. Li, G. Liu, and J. Ni, “Autofocusing of ISAR images based on entropy minimization,” IEEE Trans. Aerosp. Electron. Syst. 35(4), 1240–1252 (1999).
20. J. Xu et al., “Radon-Fourier transform for radar target detection, I: generalized Doppler filter bank,” IEEE Trans. Aerosp. Electron. Syst. 47(2), 1186–1202 (2011).
21. J. Xu et al., “Radon-Fourier transform for radar target detection (II): blind speed sidelobe suppression,” IEEE Trans. Aerosp. Electron. Syst. 47(4), 1186–1202 (2011).
22. J. Yu et al., “Radon-Fourier transform for radar target detection (III): optimality and fast implementations,” IEEE Trans. Aerosp. Electron. Syst. 48(2), 991–1004 (2012).
23. J. Xu et al., “Radar maneuvering target motion estimation based on generalized Radon-Fourier transform,” IEEE Trans. Signal Process. 60(12), 6190–6201 (2012).
24. M. Xing et al., “New ISAR imaging algorithm based on modified Wigner-Ville distribution,” IET Radar Sonar Navig. 3(1), 70–80 (2009).
25. C. M. Yeh et al., “Rotational motion estimation for ISAR via triangle pose difference on two range-Doppler images,” IET Radar Sonar Navig. 4(4), 528–536 (2010).
26. S. B. Peng et al., “Inverse synthetic aperture radar rotation velocity estimation based on phase slope difference of two prominent scatterers,” IET Radar Sonar Navig. 5(9), 1002–1009 (2011).
27. X. Lv et al., “Lv’s distribution: principle, implementation, properties, and performance,” IEEE Trans. Signal Process. 59(8), 3576–3591 (2011).
28. J. Zheng et al., “ISAR imaging of nonuniformly rotating target based on a fast parameter estimation algorithm of cubic phase signal,” IEEE Trans. Geosci. Remote Sens. 53(9), 4727–4740 (2015).
29. J. Xu et al., “Space-time Radon-Fourier transform and applications in radar target detection,” IET Radar Sonar Navig. 6(9), 846–857 (2012).
30. J. Xu et al., “Radar target imaging using three-dimensional space Radon-Fourier transform,” in Proc. of Int. Radar Conf., pp. 1–6 (2014).
31. L. C. Qian et al., “Fast implementation of generalised Radon-Fourier transform for manoeuvring radar target detection,” Electron. Lett. 48(22), 1427–1428 (2012).
32. X. Bai et al., “High-resolution three-dimensional imaging of spinning space debris,” IEEE Trans. Geosci. Remote Sens. 47(7), 2352–2362 (2009).
33. Y. Wu et al., “Fast marginalized sparse Bayesian learning for 3-D interferometric ISAR image formation via super-resolution ISAR imaging,” IEEE J. Sel. Topics Appl. Earth Obs. Remote Sens. 8(10), 4942–4951 (2015).

Qian Lv received her BS degree in measuring and control techniques and instruments from Xi’an Shiyou University, Shaanxi, China, in 2013. Currently, she is working toward her PhD with the National Laboratory of Radar Signal Processing, Xidian University, Xi’an, China. Her research interests include SAR and inverse SAR signal processing, time-frequency analysis, and interferometric ISAR (InISAR) imaging.

Tao Su received his BS degree in information theory, his MS degree in mobile communication, and his PhD in signal and information processing from Xidian University, Xi’an, China, in 1990, 1993, and 1999, respectively. He has been with the National Laboratory of Radar Signal Processing, School of Electronic Engineering, since 1993, where he is currently a professor. His research interests include high-speed real-time signal processing for radar, sonar, and telecommunications; digital signal processing; parallel processing system design; and FPGA IP design.

Jibin Zheng received his BS degree in electronic information science and technology from Shandong Normal University, Shandong, China, in 2009, and his PhD in signal and information processing from Xidian University, Xi’an, China, in 2015. From September 2012 to September 2014, he was a visiting PhD student at the Department of Electrical Engineering, Duke University, Durham, North Carolina. His research interests include SAR and inverse SAR signal processing, and cognitive radar.

Jiancheng Zhang received his BS degree in measurement and control technology and instrumentation from Xidian University, Shaanxi, China, in 2011. Currently, he is pursuing his PhD in the National Key Laboratory of Radar Signal Processing, Xidian University, Xi’an, China. His research interests include target detection, parameter estimation, time-frequency analysis, and radar imaging.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.



Figures

Fig. 1 Geometry in the InISAR system.

Fig. 2 Simulation results: (a) contour of WVD, (b) results after ST, (c) contour of MWVD, and (d) stereogram of MWVD.

Fig. 3 Flowchart of the proposed InISAR imaging algorithm.

Fig. 4 Simulation results: (a) ideal scatterer model, (b) WVD of antenna pair AB, (c) MWVD of antenna pair AB, and (d) 3-D reconstructed scatterers.


Tables

Table 1 Reconstruction performance of example A.

Table 2 Simulation parameters.

