Open Access
2 July 2021
Three-dimensional modeling of EUV photoresist using the multivariate Poisson propagation model
Luke T. Long, Andrew R. Neureuther, Patrick P. Naulleau
Abstract

Background: As target feature sizes for EUV lithography shrink, it is becoming ever more important to understand the intricate details of photoresist materials, including the role of the “third dimension”—the dimension perpendicular to the wafer. With resist thicknesses shrinking toward the single-digit nanometer scale alongside target linewidths, small changes in resist performance in this dimension will have a greater overall effect on pattern quality.

Aim: To use modeling to understand the effect that the third dimension has on resist performance, in particular the interplay between the third dimension and resist stochastics.

Approach: We developed a three-dimensional version of the multivariate Poisson propagation model, a stochastic resist simulator. As a test case for the model, we explore the role of acid diffusion in the so-called third dimension by simulating 10⁵ vias at a series of z-blur conditions.

Results: Our model suggests that increased z-blur yields an improvement in both dose to size and pattern uniformity without sacrificing resolution.

Conclusions: We have developed a 3D resist model that can simulate large numbers of contacts. Early results from the 3D model show improved patterning performance can be achieved by increasing the z-blur.

1. Introduction

As feature sizes shrink toward the single-digit nanometer regime, the demands made of photoresist materials become ever stricter. The omnipresent RLS triangle, illustrated at least 16 years ago by Gallatin,1 still represents a challenging hurdle for extreme ultraviolet (EUV) photoresist, particularly as resolution targets preclude the large resist blurs required to smooth and amplify the incoming EUV signal.

To overcome these challenges, resist manufacturers and researchers have turned to exploring new avenues for improving patterning performance, ranging from altogether new material systems to process-based solutions using underlayers2–4 and additives. Common among these efforts is a recognition that better resist materials are needed for device scaling to continue. One area of potential improvement, touched on by the underlayer work mentioned above, involves utilizing the third dimension of the patterning process, the dimension perpendicular to the wafer, as a means of improving two-dimensional (2D) pattern quality.

Driven by this idea, we expanded the 2D multivariate Poisson propagation model (MPPM)5 to a full three-dimensional (3D) model capable of simulating the role of 3D parameters to overall resist performance. In this paper, we present some of the challenges associated with this expansion and discuss the computational tools used to overcome them. Then using our new tool, we explore the role of the third dimension with respect to acid blur. We examine how, with increased acid blur in the z dimension, we can achieve improvement in pattern uniformity and dose sensitivity without sacrificing resolution. Furthermore, we evaluate how our assumptions about resist dissolution, metrology, and etch have a profound impact on the conclusions that we draw about photoresist performance.

2. Methods

This section is divided into two subsections. The first is comprised of the details of the model and the computational challenges of a full 3D implementation. The second details the methodology used to study the role of acid z-blur in pattern formation. We discuss how the full 3D model forced us to grapple with the role of resist dissolution and metrology/etch to analyze our simulated data.

2.1. Model Details and 3D Challenges

2.1.1. Model details

While the core stochastic details of the model mirror those of the 2D MPPM and are detailed elsewhere,5–7 a few of the key features will be summarized here. The MPPM is an "error" propagation model, whereby the initial distributions of resist components and photons are treated as random variables (RVs). While in this study the RVs are drawn from Poisson distributions, we note that this in general need not be the case; any sensible probability distribution could be used instead. These initial RVs are then propagated forward by accounting for the various resist processes: electron yield and blur, acid generation, and, ultimately, reaction-diffusion during post-exposure bake (PEB). A key feature of the MPPM is the ability to choose which variables are considered stochastic versus deterministic. The set of stochastic variables in this paper mirrors that of the fully stochastic simulations performed by Naulleau,5 with the change to tracking 3D voxel rather than 2D pixel counts. It is worth noting that, in the absence of the quenching reaction, the diffusion term in our simulation results in a Gaussian point spread function (PSF) for each acid. The standard deviation of this PSF is denoted as the blur and is the key parameter that is varied in our z-blur investigation. For the interested reader, more details on the numerics of the reaction-diffusion simulation can be found in Appendix A.
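The Poisson-sampled initial conditions can be sketched as follows. This is a minimal illustration of the idea, not the paper's implementation; the photon density, grid size, and variable names are our assumptions, with only the PAG concentration and voxel size taken from Table 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative per-voxel means (hypothetical calibration):
voxel_nm = 0.4
photons_per_nm2 = 10.0                  # assumed mean absorbed-photon density
pag_per_nm3 = 0.2                       # PAG concentration, Table 1

shape = (32, 32, 16)                    # small demo grid (x, y, z)

# Each voxel's initial counts are independent Poisson draws about the mean;
# this is the stochastic "input" that the model then propagates forward.
photons = rng.poisson(photons_per_nm2 * voxel_nm**2, size=shape)
pags = rng.poisson(pag_per_nm3 * voxel_nm**3, size=shape)

# Over many voxels the sample means approach the Poisson means.
print(photons.mean())   # ~1.6 photons per 0.4 nm x 0.4 nm column
print(pags.mean())      # ~0.0128 PAGs per voxel
```

Any other distribution could be substituted for `rng.poisson` without changing the propagation machinery, which is the flexibility the text describes.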

2.1.2. 2D versus 3D

In migrating from a 2D to a 3D model, the biggest challenge is associated with the large increase in data. Take, for example, the aerial image shown in Fig. 1 that was used for the simulation work presented in this paper. This aerial image is comprised of 1280×1280 pixels, 0.4 nm per side, to build a 15×15 array of 16 nm contacts. If our model is used to track three species—acid, base, and deprotection—then the model requires that we keep track of 1280×1280×3 ≈ 4.9×10⁶ floating point numbers, for a total of about 20 MB of data at single precision.

Fig. 1. Aerial image comprised of a 15×15 array of 16 nm contacts. Pixels are 0.4 nm per side.

In the 3D model, on the other hand, if our resist is 35 nm thick, then we need an additional factor of 88 voxels at a side length of 0.4 nm to represent the same domain, bringing our totals to 1280×1280×88×3 ≈ 430×10⁶ floating point numbers and about 1.7 GB worth of data. This does not count the memory required to store temporary arrays during the reaction-diffusion simulation. This large increase in data coincides with a large increase in the number of computations required on the 3D grid, an increase that poses a serious challenge when trying to simulate the hundreds of thousands or even millions of features required to get statistics on, for example, missing contacts.
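The memory figures above follow from simple arithmetic on the grid dimensions:

```python
# Back-of-envelope check of the 2D vs 3D memory figures quoted in the text.
nx = ny = 1280          # pixels, 0.4 nm per side
nz = 88                 # ~35 nm thickness / 0.4 nm voxels
species = 3             # acid, base, deprotection
bytes_per_float = 4     # single precision

vals_2d = nx * ny * species
vals_3d = nx * ny * nz * species

print(vals_2d / 1e6)                      # ~4.9 million values
print(vals_2d * bytes_per_float / 1e6)    # ~20 MB
print(vals_3d / 1e6)                      # ~430 million values
print(vals_3d * bytes_per_float / 1e9)    # ~1.7 GB
```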

Our solution to this problem is to create a GPU-compatible version of the model. While more details can be found in Appendix B, the basic idea is to take advantage of the parallel structure of the reaction-diffusion simulation by running this part of the code on an accelerator. The resulting code is able to simulate about one contact per 2.5 s per GPU. With 10 GPUs available at a given time on the cluster utilized, each data set of 10⁵ contacts required about 7 h of computation, representing about an order of magnitude improvement over the equivalent code using a multithreaded CPU implementation.
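The quoted wall-clock time is consistent with the per-contact throughput:

```python
# Throughput arithmetic for one data set of 10^5 contacts.
contacts = 10**5
seconds_per_contact_per_gpu = 2.5
gpus = 10

wall_hours = contacts * seconds_per_contact_per_gpu / gpus / 3600
print(wall_hours)   # ~6.9 h, i.e. the "about 7 h" quoted above
```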

2.2. z-Blur Experiment

As a test case for the newly developed model, we looked at the impact of acid z-blur, or acid diffusion perpendicular to the wafer, on contact uniformity. Previous work using analytic models suggests that increased z-blur should yield a resist performance improvement.8 While the present paper is mostly a numerical implementation of a thought experiment related to anisotropic blur, it is rooted in past studies in the deep UV regime. For example, Cheng et al.9 showed that pattern profiles could be improved by the application of low-frequency AC electric fields during PEB, and Yuan10 found that numerical resist models best matched experimental data when the deprotection chemistry locally altered acid and quencher diffusion rates. Although these effects have origins physically distinct from those studied here (namely, they represent an additional advection term in what is otherwise a reaction-diffusion process), the observation of these effects lends credence to the possible presence of anisotropies in the patterning process that could be exploited in future resist materials. One such chemistry that occurs to the authors is the use of polymer brush-like materials, in which the polymer structure itself contains anisotropies that may induce directional acid diffusion. We further note that advection can be explicitly included in the model and that directionally dependent diffusion merely represents a starting place for 3D investigations.

For our simulation, we used the resist model specified in Table 1. For our baseline study, we held the xy blur constant at 12 nm, while varying the z-blur across a range of values. For each z-blur, we simulated a minimum of 10⁵ contacts. A graphical understanding of the impact of the differing z-blur conditions can be gained from the first row of Fig. 4, which illustrates the acid PSF in the absence of quenching effects.

Table 1

Resist model parameters.

Parameter               Value
CD/pitch                16/32 nm
Resist thickness        35 nm
Voxel size              0.4×0.4×0.4 nm³
Dose (peak at wafer)    30 mJ/cm²
Absorptivity            4/μm
PAG concentration       0.2/nm³
Base concentration      0.085/nm³
QE                      3
Acid blur (x,y,z)       (12,12,[2,6,12,24]) nm
Base blur (x,y,z)       (5,5,5) nm
Deprotection rate       1 nm³/s
Quenching rate          10 nm³/s

At the x and y boundaries of the model, we assumed periodic conditions, while no-flux boundary conditions were used in the z dimension. With these conditions, acid and quencher counts in the model domain would be conserved if not for the quenching reaction. To avoid nonphysical correlations between exposure events at the xy extremes of the model domain, the half-contacts that comprise the boundary of the aerial image shown in Fig. 1 were discarded and not used in the final analysis. We note that this was performed out of an abundance of caution due to the rarity of the failed-contact phenomenon that we chose to study; correlations would be minimal regardless, as the xy acid blur is about one third of the pattern pitch.

To extract a critical dimension (CD) distribution from our 3D model, two methods were evaluated. In the first, which we call "volume-averaging," we sum the total volume of cleared voxels in the domain of the contact using a threshold deprotection model. The threshold was set to a deprotection fraction of 0.27. Then, assuming that the contact is a cylinder, we convert this volume to a CD using the equation CD = 2√(V/(πh)). This method is in essence performing a sort of z-averaging of the contact CD. In the second method, we instead consider the minimum area open from the top to the bottom of a contact. This analysis consists of counting "cleared" pixels, in which a pixel is clear if the contact is deprotected in a straight line in z from the top of the resist down to the substrate. We then compute an effective CD using CD = 2√(A/π). Simple cartoons of the two cases are shown in Fig. 2.

Fig. 2. Schematics of the analysis methods. (a) The volume-averaging method. Note that volume above and below the blockage is counted. (b) Minimum area method. The blockage sets the CD to zero.
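The two extraction methods can be sketched on a boolean deprotection grid. The toy cylindrical contact, array layout, and function names below are illustrative assumptions, not the paper's implementation; only the CD formulas and the 0.4 nm voxel size come from the text.

```python
import numpy as np

def cd_volume_average(cleared, voxel_nm=0.4):
    """Method 1: total cleared volume -> CD of an equivalent cylinder,
    CD = 2*sqrt(V / (pi*h)). `cleared` is a boolean (z, y, x) array."""
    h = cleared.shape[0] * voxel_nm
    vol = cleared.sum() * voxel_nm**3
    return 2.0 * np.sqrt(vol / (np.pi * h))

def cd_min_open_area(cleared, voxel_nm=0.4):
    """Method 2: only xy columns cleared over the FULL z stack count,
    CD = 2*sqrt(A / pi). One blocked slice closes the column."""
    open_cols = cleared.all(axis=0)
    area = open_cols.sum() * voxel_nm**2
    return 2.0 * np.sqrt(area / np.pi)

# Toy contact: a 16-nm-diameter cylinder with one fully blocked z slice.
z, n = 16, 64
yy, xx = np.mgrid[0:n, 0:n]
cyl = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 < 20**2   # radius 20 voxels = 8 nm
cleared = np.broadcast_to(cyl, (z, n, n)).copy()
cleared[8] = False                                     # a "road block" slice

print(cd_volume_average(cleared))   # ~15.5 nm: barely affected by the blockage
print(cd_min_open_area(cleared))    # 0.0: the blockage closes the contact
```

The contrast between the two printed values is exactly the discrepancy discussed below: method 1 averages the blockage away, while method 2 declares the contact closed.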

Although both analysis methods have some merit, we deem method 2 to be closer to what actually matters for lithographic processes, as only this method can account for the so-called road blocks preventing a contact from opening. However, it provides an overly pessimistic view of contact size and uniformity. In particular, we know that a real developer is capable of punching through small regions of undeveloped resist, as there is always some dark loss associated with the development process. To describe this phenomenon, we developed a dissolution model to accompany the MPPM.

2.3. Dissolution Model

As the discrepancy between the analysis methods above illustrates, our investigation forced us to engage with the subtleties of interpreting our model results. Part of our solution was to look at the impact of a realistic versus threshold dissolution model on our critical dimension analysis. To that end, we developed a dissolution model based on the fast marching level set algorithm.11 This model propagates the dissolution front through the photoresist material using a development front speed map. The speed map can be determined by combining experimentally measured parameters with the original Mack development model.12 The model was implemented in C++ and runs fast enough that it does not add significantly to the overall run time of the MPPM model.

In modeling the develop, the maximum and minimum develop rates were chosen in accordance with experimental results,13 and we chose a threshold in the Mack model such that the develop rate at a deprotection fraction of 0.27 was 35 nm per 30 s. Develop parameters are shown in Table 2, and the corresponding dissolution rate curve is given in Fig. 3. At this develop speed, an entire stack of voxels at the 0.27 threshold is cleared during the develop time. With this construct, the region below the red dashed line and to the left of the black dashed line in Fig. 3 represents the voxels that could change from blocked in the threshold model to cleared after applying the realistic develop model. Following the develop simulation, we performed the abovementioned analysis method 2 to get a new value for the open areas of the contacts.

Table 2

Develop model parameters.

Parameter    Value
r_max        200 nm/s
r_min        0.02 nm/s
n            10

Fig. 3. Develop speed versus deprotection corresponding to the Mack model with parameters given in Table 2.
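A rate curve of this shape can be sketched from the Table 2 parameters. Writing the Mack rate as a function of deprotection fraction d (inhibitor concentration m = 1 − d) and calibrating the model's dimensionless 'a' parameter by bisection against the 35 nm per 30 s point is our own construction; the paper does not specify its calibration procedure.

```python
R_MAX, R_MIN, N = 200.0, 0.02, 10       # Table 2 parameters
CAL_D, CAL_RATE = 0.27, 35.0 / 30.0     # calibration: 35 nm in 30 s at d = 0.27

def mack_rate(d, a):
    """Mack develop rate (nm/s) versus deprotection fraction d."""
    return R_MAX * (a + 1.0) * d**N / (a + d**N) + R_MIN

# The rate is strictly decreasing in 'a', so bisection recovers the value
# that hits the calibration point exactly.
lo, hi = 1e-12, 1.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mack_rate(CAL_D, mid) > CAL_RATE:
        lo = mid    # rate too high -> need larger a
    else:
        hi = mid
a = 0.5 * (lo + hi)

print(a)                   # ~3.6e-4 for these parameters
print(mack_rate(0.0, a))   # = R_MIN: unexposed resist still dissolves slowly
print(mack_rate(1.0, a))   # ~ R_MAX: fully deprotected resist clears quickly
```

The nonzero rate at d = 0 is the "dark loss" invoked above: even below threshold, the developer slowly eats resist, which is what lets it punch through small blockages.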

2.4. Etch/Metrology

As a final step, we looked at how etch or SEM metrology might change our conclusions about z acid diffusion. In particular, small regions of resist material may be effectively invisible to the etch or to the electron beam used to measure these materials. To simulate this effect, we performed a simple thresholding of our developed resist images by first creating a 2D xy map in which each pixel represents the number of undeveloped voxels in the z stack. The chosen threshold then corresponds to the thickness of resist removed by the etch or invisible to the SEM electron beam. This thresholding implicitly assumes that the etchant/electrons come from a single direction and thus that these are perfectly anisotropic processes in the z dimension. Note that a threshold of 0 corresponds to infinite etch selectivity or total electron beam opacity and is equivalent to analysis method 2.
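The thresholding step described above can be sketched directly; the toy scum-layer example and function name are illustrative assumptions.

```python
import numpy as np

def etch_open_map(undeveloped, threshold_nm, voxel_nm=0.4):
    """2D open-pixel map after an idealized, perfectly anisotropic etch.

    `undeveloped` is a boolean (z, y, x) array of remaining resist. A pixel
    is open if its column of undeveloped resist is no thicker than the
    etch/metrology tolerance; threshold_nm = 0 reproduces analysis method 2.
    """
    thickness = undeveloped.sum(axis=0) * voxel_nm   # nm of resist per column
    return thickness <= threshold_nm                 # True = open after etch

# Toy contact: open everywhere except a 0.8-nm scum layer over a 2x2 patch.
undeveloped = np.zeros((88, 8, 8), dtype=bool)
undeveloped[:2, 3:5, 3:5] = True

print(etch_open_map(undeveloped, 0.0).sum())   # 60: scum blocks 4 pixels
print(etch_open_map(undeveloped, 1.0).sum())   # 64: 1-nm tolerance clears it
```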

3. Results

3.1. Threshold Develop

Figure 4 shows the qualitative results for the same seed (same initial conditions) under three different acid blur conditions. The opaque surface illustrates the contour of the contact at a deprotection threshold of 0.27. As is apparent from these illustrations, the increased z-blur yields a contact that is smoother in z and perhaps somewhat narrower in x and y. When viewed from the top, the difference is even more stark as only the contact with the increased z-blur has a sizeable clear path from the top to the bottom of the contact.

Fig. 4. Acid PSF and corresponding contact contours. Rows correspond to the xz plane of the blur function, a side profile, and an aerial view of the resulting contact, respectively. Columns correspond to z-blur values of 6, 12, and 24 nm, respectively, with xy blur held constant at 12 nm. As z-blur increases, contact edges become smoother, and the path from top to bottom becomes clearer.

The entire set of contacts was analyzed via the two methods laid out in Sec. 2.2. Figure 5(a) shows the results obtained via the volume-average method, and Fig. 5(b) shows the results when analyzed via the minimum CD. It is clear that these two methods yield very different results. When we utilize the volume-average method, we see that, if anything, we get smaller contacts with a wider distribution as the z-blur is increased, contrary to the intuition that increased z-blur should yield a more uniform distribution. In contrast, the minimum clear CD method yields quite the opposite result, with the larger z-blur yielding a larger mean CD as well as a narrower distribution. The schematic shown in Fig. 2 is helpful for understanding the difference; in essence, the volume-average method is indifferent to the presence of closed regions of the contact. The lost volume from a few closed z slices can be made up for using volume from wider deprotected slices elsewhere in the feature. In contrast, the minimum CD method assumes that a z slice below the threshold, or a few slices that are open but misaligned in z, results in a closed contact. Since we believe that so-called "road blocks" do stop the develop process, the minimum-CD analysis is probably closer to reality than the volume-averaged method. That being said, this method provides an overly pessimistic view of missing contact counts; the develop is not a perfect switch that changes at a given threshold, but it has some minimum develop rate that eats at resist material even below the deprotection threshold. In the next section, we examine the impact of one set of develop parameters on our analysis of the role of z-blur.

Fig. 5. CD histograms produced by the (a) volume-average and (b) z-projected minimum CD methods.

3.2. Develop Model

The develop process was modeled using the method described in Sec. 2.3. For the sake of comparison, the resulting contact contour for the 6-nm blur case is shown in Fig. 6, which can be directly compared with the middle column of Fig. 4. We see that the develop has served to expand the contact in the xy plane relative to the threshold develop model, and from the aerial view, we can see that it has punched through small blockages at the bottom and on the sides of the contact. The impact of these changes is shown in Fig. 7, which shows that the develop has served to increase the mean CD of all of the z-blur cases examined, as well as to narrow the distributions. Although somewhat less dramatic than in the threshold develop case, the conclusion still holds that increased z-blur increases the CD and results in a narrower distribution of CD.

Fig. 6. Contour of the 6-nm z-blur contact after develop. This is directly comparable to the threshold develop method used to generate the profiles in the middle column of Fig. 4.

Fig. 7. Contact CD distribution using the Mack development model.

3.3. Exposure Latitude and CD Biasing

Figure 7 shows that changing z-blur has the effect of biasing the mean contact CD. To use the line of thinking presented by Naulleau et al.,6 one may wonder whether the improvement in CD distribution arises from an increase in acid amplification due to the increased z diffusion. This increased amplification could bias the mean CD away from the nonlinear dose response of the resist, which would in turn mitigate the skew of the CD distribution. To test this theory, we simulated the exposure latitude of the resist by running 225 contacts at a range of doses and then employing the develop model and analysis to extract the CD. It is important to note that, using this analysis, the stochastics play a key role in determining the average CD as the interaction between individual z-slices is nonlinear. The resulting dose versus CD plot is shown in Fig. 8(a). As expected based on the histograms, we do see an increase in the mean contact CD at a given dose due to increased z-blur, and the mean CD has been biased somewhat relative to the non-linear response of the resist.

Fig. 8. (a) CD versus dose response and (b) corresponding dose to size for the different z-blur resist models.

To test whether this bias is at the root of the improved contact uniformity with increased z-blur, we simulated an additional 10⁵ contacts at the smallest and largest z-blurs, rebiasing the mean contact CD to equal the isotropic case by adjusting the dose according to Fig. 8(b). The resulting histograms are shown in Fig. 9. It is clear that, despite requiring about 6% less dose, 24 nm of z-blur still yields a more uniform contact distribution than the isotropic case, and the opposite is true of the 2 nm z-blur case. Thus, the improvement in contact uniformity is not simply a result of additional acid amplification and resulting CD bias, but there is an additional smoothing effect as well.

Fig. 9. Contact CD distribution using the develop model after adjusting dose such that all contacts print to the same average size.

3.4. Larger Isotropic Blur

As a control, we wanted to ensure that the benefit of increasing z-blur could not be equivalently obtained using a larger isotropic blur. To test this, we used the fact that (12×12×24)^(1/3) ≈ 15.13 to generate an equivalent-volume isotropic acid blur condition. Figure 10(a) shows the CD distribution of 10⁵ contacts simulated under these new conditions. It is apparent from this plot that a larger isotropic blur has an overall detrimental effect on the resulting patterns, with the average CD subtly reduced and the CD distribution dramatically broadened. A quick look at the resulting exposure latitude in Fig. 10(b) shows why: relative to the smaller isotropic blur and increased z-blur conditions, the dose response of the larger isotropic blur has a higher-dose corner and overall steeper slope than the previous conditions, as might be expected based on the additional xy diffusion. Thus, the equivalent volume spread in the effective dose results in a wider and more skewed spread of CDs for the larger isotropic blur. We should note that this actually suggests that the larger isotropic blur performs worse in all of the R, L, and S of the RLS trade-off as compared with the original blur conditions.

Fig. 10. Comparison of (a) CD distribution and (b) dose response of the isotropic (12,12,12) nm, anisotropic (12,12,24) nm, and isotropic (15,15,15) nm blurs.

3.5. Etch/Metrology Analysis

As a final area of interest, we investigated how etch or metrology may impact our conclusions about z acid blur. Figure 11 shows the qualitative impact of the thresholding process on the 6-nm z-blur contact shown elsewhere in this paper, and Fig. 12 shows the resulting histograms assuming the etch/metrology can “see” through various resist thicknesses. Figure 12 shows that an increase in the etch/metrology threshold results in an increase in mean CD as well as a narrowing of the CD distribution as the amount of invisible resist increases. In addition, we see that the effect of acid z-blur decreases as the tolerance increases; in essence, the more resist the etch or metrology can see through, the more these processes play the z-averaging role. The narrowing of the CD distribution post etch has been reported previously,14 suggesting that the etch can indeed play this averaging role. Furthermore, pre-etch CD distributions are obtained via SEM metrology, which suggests that etch is more tolerant of small resist blockages than the resist metrology.

Fig. 11. Example of the impact of different etch thresholds on the resulting contact shape. The more resist that is etched, the larger the contact is.

Fig. 12. Histograms assuming different etch models. (a) Baseline, assuming perfect etch selectivity. (b) Etch 1 nm of resist. (c) Etch 2 nm of resist. (d) Etch 5 nm of resist.

To further investigate whether increased z-blur still provides benefit when a 5-nm etch is assumed, we performed a new batch of stochastic simulations. First, to counteract the increased average size of the contacts due to the increased etch assumption, we reduced the dose by about 10% to reduce the mean contact size to 18 nm. Second, we ran 3×10⁵ additional simulations at each of 2, 12, and 24 nm z-blurs. The resulting histogram is shown in Fig. 13. Counterintuitively, the increased z-blur case (shown in purple) now displays the widest CD distribution. However, although this would suggest that this blur would result in a higher rate of missing contacts, the opposite is true for this set of simulation data. Despite having the narrowest distribution, the 2-nm z-blur simulations yielded four missing contacts out of the 3×10⁵ simulated, and the isotropic blur simulations yielded two. The stretched z-blur, in spite of its wider CD distribution, yielded zero. While we acknowledge that it is difficult to draw firm conclusions based on such small numbers of failures, this result suggests that the CD distribution may not tell the whole story when it comes to predicting missing contacts; in spite of resulting in what appears to be a less favorable CD distribution, increased z-blur may help to mitigate the occurrence of missing contacts.

Fig. 13. Rebiased CD histograms with the 5 nm etch assumption. While the increased z-blur resist has a wider CD distribution, it also displays zero missing contacts, unlike the isotropic (two missing) and 2 nm (four missing) z-blur simulations.

A brief look at the initial acid distribution corresponding to a contact failure at 2 nm of z-blur reveals why increased acid diffusion in z may help to avoid missing contacts. As shown in Fig. 14(b), this particular contact failed to produce a cleared path from the top of the resist to the substrate, leaving about 8 nm of undeveloped material at the bottom that our simple etch model was unable to remedy. Figure 14(a) shows a cutline of the initial acid distribution that gave rise to this contact. In this particular failure, above and below the blockage, there was sufficient acid to clear the contact. Had the z-blur been greater, it may have been possible to join these two regions, clearing a path for the developer to open the contact.

Fig. 14. Profile of a failed contact. (a) Cutline of the initial acid distribution. Two unconnected regions of high acid concentration are separated by the blockage-causing region. (b) Resulting postdevelop profile. The bottom 8 nm of undeveloped resist cannot be cleared by a 5-nm etch.

4. Conclusions

We have developed a method for performing large-area 3D simulations of EUV photoresist. The model is useful for examining the impact of different resist parameters on the resulting statistical distributions of features printed using EUV lithography, in particular those resulting from the intrinsically 3D nature of the resist.

In this paper, we examined the role that the z component of acid diffusion plays in the pattern formation process. In general, we showed that increased z-blur allows for more uniform contacts to be printed at a lower dose, a win-win in terms of the RLS trade-off. We also showed that these conclusions are highly dependent on the develop and etch processes with which any resist must be integrated. The developer, having a nonzero minimum develop rate, is capable of removing small blockages within the contact, though at the price of increasing the contact size by eating slowly at the contact walls. Similarly, the etch can play a smoothing role by punching through small regions of undeveloped resist. This work suggests that it is the combination of all three, the resist, etch, and developer, that together must be optimized to achieve desired patterning performance. Furthermore, this required co-optimization highlights the need for improved understanding of the etch and dissolution processes to untangle and identify the failure-causing phenomena in the lithography process.

Moving forward, we plan to examine the role of other inherently 3D effects on the patterning process. In particular, we are interested in the role of the underlayer as a source or sink of photoelectrons. We think our model can provide some interesting insight as to the benefit of engineering these materials along with the resist to achieve better pattern performance.

5. Appendix A: Numerical Details

The propagation of RVs during the reaction-diffusion process requires solving a set of coupled partial differential equations (PDEs) for the acid, quencher, and protecting groups. To solve these PDEs, the spatial derivatives corresponding to the diffusive part of the equation are discretized using a second-order, seven-point finite difference stencil. This stencil is comprised of three one-dimensional stencils of the form (1, −2, 1), each scaled by the appropriate diffusion constant for that dimension. As a result of this procedure, the remaining equations are a set of coupled first-order ordinary differential equations in time only, which are then numerically integrated to simulate the post-exposure bake.
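The anisotropic seven-point stencil, together with the boundary conditions of Sec. 2.2 (periodic in xy, no flux in z), can be sketched as follows; the axis ordering and function name are our choices for illustration.

```python
import numpy as np

def laplacian_aniso(c, d_xy, d_z, h=0.4):
    """Seven-point stencil: D_xy*(d2/dx2 + d2/dy2) + D_z*(d2/dz2).

    Three (1, -2, 1) one-dimensional stencils, each scaled by its
    diffusion constant. Periodic in x and y, no-flux (reflecting) in z.
    Axis order is (z, y, x); h is the voxel side in nm.
    """
    # Periodic xy: the (1, -2, 1) stencil applied via np.roll.
    lap_xy = (np.roll(c, 1, axis=1) + np.roll(c, -1, axis=1)
              + np.roll(c, 1, axis=2) + np.roll(c, -1, axis=2) - 4.0 * c)
    # No-flux z: duplicate the boundary planes before differencing.
    cz = np.concatenate([c[:1], c, c[-1:]], axis=0)
    lap_z = cz[:-2] + cz[2:] - 2.0 * c
    return (d_xy * lap_xy + d_z * lap_z) / h**2

# Sanity checks: a uniform field has zero Laplacian, and with these
# boundary conditions diffusion alone conserves the total acid count.
c = np.random.default_rng(1).random((12, 16, 16))
dc = laplacian_aniso(c, d_xy=1.0, d_z=0.5)
print(abs(laplacian_aniso(np.ones((4, 4, 4)), 1.0, 1.0)).max())  # 0.0
print(abs(dc.sum()))                                             # ~0
```

The conservation check mirrors the statement in Sec. 2.2 that acid and quencher counts would be conserved if not for the quenching reaction.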

An adaptive time-stepping scheme utilizing overlapping Runge–Kutta methods of orders 2 and 3 is used for the time integration. The method is akin to Matlab's ODE23.15 The basic idea is to use two Runge–Kutta methods of differing orders of accuracy to evaluate the new value in a given model cell. The methods are chosen such that they share common midpoint evaluations to reduce the number of required calculations. Then, by taking the difference between the two solutions, an estimate of the integration error is obtained. If this error is above a specified threshold, the time step can be reduced accordingly, and the procedure repeated. Conversely, a time step resulting in an error well below the threshold can be increased. This adaptive control is important for our model because the initial conditions are essentially comprised of a series of delta functions of acid and quencher, giving rise to rapidly moving diffusion fronts and reactions. Thus, at short PEB times, it is numerically imperative to take small time steps to capture these phenomena, but the steps can be expanded as the model smooths. The adaptive time stepping thus helps to achieve a balance between numeric accuracy and model run time.
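The scheme just described can be sketched for a scalar ODE using the Bogacki–Shampine (2,3) pair, the pair underlying Matlab's ODE23. The specific step-controller constants below are illustrative, not the values used in the MPPM code.

```python
import math

def rk23_step(f, t, y, h):
    """One embedded Bogacki-Shampine step: shared stages feed both a
    3rd-order solution and a 2nd-order one; their difference estimates
    the local error."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + 3 * h / 4, y + 3 * h / 4 * k2)
    y3 = y + h * (2 * k1 / 9 + k2 / 3 + 4 * k3 / 9)   # 3rd-order result
    k4 = f(t + h, y3)
    y2 = y + h * (7 * k1 / 24 + k2 / 4 + k3 / 3 + k4 / 8)  # 2nd-order
    return y3, abs(y3 - y2)

def integrate(f, t0, y0, t_end, tol=1e-6, h=1e-2):
    """Adaptive integration: accept a step when the error estimate is
    below tol, and grow/shrink h toward the error target."""
    t, y = t0, y0
    while t < t_end:
        h = min(h, t_end - t)
        y_new, err = rk23_step(f, t, y, h)
        if err <= tol:
            t, y = t + h, y_new          # accept
        # 3rd-order controller, clamped to avoid wild step changes
        h *= min(2.0, max(0.2, 0.9 * (tol / max(err, 1e-16)) ** (1 / 3)))
    return y

# First-order decay test: dy/dt = -y over [0, 1]; exact answer is e^{-1}.
y = integrate(lambda t, y: -y, 0.0, 1.0, 1.0)
print(y, math.exp(-1.0))
```

As in the text, sharp initial transients force small accepted steps, while the controller lets the step grow once the solution smooths.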

6. Appendix B: GPU Acceleration

To accelerate the throughput of our simulations, GPUs were utilized. As is well known, these computing devices are well optimized for simultaneously performing multiple, independent computations in parallel. The original MPPM model code was adapted to 3D and written in C. Then, with the regular C code operational, GPU acceleration was achieved using OpenACC compiler directives. The directives are similar to the OpenMP framework, whereby lines are added to the code that are normally interpreted as simple comments unless the appropriate compiler flags are turned on. To optimize the compilation for the Nvidia GPUs utilized for these simulations, the Nvidia nvc compiler, part of the Nvidia HPC SDK, version 20.11, using CUDA 11.0 drivers, was used. Note that previous versions of this compiler suite were known as PGI. As noted in the main body of the paper, the end result was a code capable of simulating, on average, one contact every 2.5 s per NVIDIA 2080Ti GPU (a single GPU being used to simulate 225 contacts simultaneously). We note that performance can be further improved using newer GPUs given the ever improving computation rate as well as greater memory capacity, which allows for larger domains to be simulated at a given time.

Acknowledgments

This research was sponsored by C-DEN (Center for design-enabled nanofabrication). Member companies include ASML, Carl Zeiss Group, Intel, KLA, Mentor Graphics, and Samsung. This work was performed in part at Lawrence Berkeley National Laboratory, which is operated under the auspices of the Director, Office of Science, U.S. Department of Energy under Contract No. DE-AC02-05CH11231. This material was also based on work supported by the U.S. Department of Energy, Office of Science, Office of Workforce Development for Teachers and Scientists, Office of Science Graduate Student Research (SCGSR) program. The SCGSR program was administered by the Oak Ridge Institute for Science and Education for the DOE under contract number DE-SC0014664. This research used the Savio computational cluster resource provided by the Berkeley Research Computing program at the University of California, Berkeley (supported by the UC Berkeley Chancellor, Vice Chancellor for Research, and Chief Information Officer). This research used the Lawrencium computational cluster resource provided by the IT Division at the Lawrence Berkeley National Laboratory (supported by the Director, Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231). In addition, the corresponding author would like to specially thank Samuel Elliot, friend, colleague, and HPC expert, for his invaluable help getting the 3D model up and running. This work was originally published in the Proceedings of SPIE, volume 11609, doi 10.1117/12.2589532.

References

1. G. M. Gallatin, "Resist blur and line edge roughness," Proc. SPIE 5754, 38 (2005). https://doi.org/10.1117/12.607233

2. H. Xu et al., "Underlayer designs to enhance the performance of EUV resists," Proc. SPIE 7273, 72731J (2009). https://doi.org/10.1117/12.814223

3. A. De Silva et al., "Inorganic hardmask development for extreme ultraviolet patterning," J. Micro/Nanolithogr. MEMS MOEMS 18(1), 011004 (2018). https://doi.org/10.1117/1.JMM.18.1.011004

4. J. Ma, A. R. Neureuther and P. P. Naulleau, "Investigating EUV radiochemistry with condensed phase photoemission," Proc. SPIE 10957, 109571Y (2019). https://doi.org/10.1117/12.2520391

5. P. Naulleau and G. Gallatin, "Relative importance of various stochastic terms and EUV patterning," J. Micro/Nanolithogr. MEMS MOEMS 17(4), 041015 (2018). https://doi.org/10.1117/1.JMM.17.4.041015

6. P. Naulleau, S. Bhattarai and A. Neureuther, "Understanding extreme stochastic events in EUV resists," J. Photopolym. Sci. Technol. 30(6), 695–701 (2017). https://doi.org/10.2494/photopolymer.30.695

7. L. Long, A. R. Neureuther and P. P. Naulleau, "Modeling of novel resist technologies," Proc. SPIE 10960, 1096011 (2019). https://doi.org/10.1117/12.2515144

8. G. M. Gallatin, P. Naulleau and R. Brainard, "Fundamental limits to EUV photoresist," Proc. SPIE 6519, 651911 (2007). https://doi.org/10.1117/12.712346

9. M. Cheng et al., "Improving resist resolution and sensitivity via electric-field enhanced postexposure baking," J. Vac. Sci. Technol. B 20(2), 734 (2002). https://doi.org/10.1116/1.1464835

10. L. Yuan, "Modeling and calibration of resist processes in photolithography," PhD dissertation, University of California, Berkeley (2005).

11. J. A. Sethian, "A fast marching level set method for monotonically advancing fronts," Proc. Natl. Acad. Sci. U. S. A. 93(4), 1591–1595 (1996). https://doi.org/10.1073/pnas.93.4.1591

12. C. A. Mack, "Development of positive photoresists," J. Electrochem. Soc. 134(1), 148–152 (1987). https://doi.org/10.1149/1.2100396

13. Y. Vesters, D. De Simone and S. De Gendt, "Dissolution rate monitor tool to measure EUV photoresist dissolution," J. Photopolym. Sci. Technol. 30(6), 675–681 (2017). https://doi.org/10.2494/photopolymer.30.675

14. M. J. Maslow et al., "Impact of local variability on defect-aware process windows," Proc. SPIE 10957, 109570H (2019). https://doi.org/10.1117/12.2514719

15. P. Bogacki and L. F. Shampine, "A 3(2) pair of Runge–Kutta formulas," Appl. Math. Lett. 2(4), 321–325 (1989). https://doi.org/10.1016/0893-9659(89)90079-7

Biography

Luke T. Long received his BS degree in engineering physics from the University of Colorado, Boulder, in 2015. He began his PhD program in fall 2016 in the Physics Department at the University of California, Berkeley. In January 2018, he joined Professor Andrew Neureuther and Dr. Patrick Naulleau in EUV photoresist research. His work has been recognized with the 2020 Nick Cobb Memorial Scholarship, the Department of Energy SCGSR Fellowship, and several SPIE student paper awards.

Andrew R. Neureuther received his PhD in the Antenna Lab in electrical engineering from the University of Illinois, Urbana, in 1966, became an EECS Faculty member at UC Berkeley, and retired in 2007. His work in optical, electron-beam, EUV, and x-ray lithography came about through working at IBM Research in 1972. His awards include the National Academy of Engineering 1995, 2003 IEEE Cledo Brunetti Award, SIA Award 2007, 2011 SPIE Zernike Award, and SPIE Conference Awards.

Patrick P. Naulleau received his BS and MS degrees in electrical engineering from Rochester Institute of Technology, Rochester, and his PhD in electrical engineering from the University of Michigan, Ann Arbor, in 1997. He then joined Berkeley Lab working in EUV lithography and metrology. In April 2010, he became the director of the Center for X-ray Optics at Berkeley Lab. He has over 300 publications and 19 patents and is a fellow of OSA and SPIE.

© 2021 Society of Photo-Optical Instrumentation Engineers (SPIE) 1932-5150/2021/$28.00
Luke T. Long, Andrew R. Neureuther, and Patrick P. Naulleau "Three-dimensional modeling of EUV photoresist using the multivariate Poisson propagation model," Journal of Micro/Nanopatterning, Materials, and Metrology 20(3), 034601 (2 July 2021). https://doi.org/10.1117/1.JMM.20.3.034601
Received: 1 April 2021; Accepted: 9 June 2021; Published: 2 July 2021
KEYWORDS: 3D modeling, Etching, Photoresist materials, Extreme ultraviolet lithography, Diffusion, Photoresist developing, Data modeling


CHORUS Article. This article was made freely available starting 02 July 2022