With the continuous shrink in feature dimensions, the overlay tolerance for transistor fabrication is becoming more stringent. Achieving good overlay is critical for good yield in a high-volume manufacturing (HVM) environment. It is widely understood that good alignment during exposure is critical for better on-product overlay (OPO) [1]. Conventional methods to choose alignment marks on ASML scanners are based on comparing alignment key performance indicators (KPIs) such as signal quality and grid repeatability. However, OPO can still be impacted even when the alignment KPIs are good. In this paper, we propose aspects that need to be monitored to choose proper alignment marks. LIS (Litho In-Sight) alignment and ideal overlay/APC parameter signatures are used to determine and validate wafer alignment. The LIS alignment 'Target and Profile selection' analysis enables us to determine the best alignment strategy among multiple strategies/marks based on overlay measurements. The analysis includes examining wafer-to-wafer OPO variation, which is a key indicator of alignment robustness. Varying overlay parameters within a lot would indicate either large process instability or alignment mark signal instability. Depending on their segmentation, alignment marks can be impacted very differently by the process. Ideal overlay/APC signature stability indicates a healthy process and wafer alignment: similar APC signatures at corresponding layers mean that there is no major process or alignment issue.
Wafers at far back end of line (FBEOL) layers traditionally have higher stress and larger alignment signal variability. ASML's ATHENA sensor based scanners, commonly used to expose FBEOL layers, have a large spot size (~700 µm). Hence ATHENA captures the signal from a much larger area than the alignment marks, which are typically ~40 µm wide. This results in higher noise in the alignment signal, and if the surrounding areas contain periodic product structures, they interfere with the alignment signal, causing either alignment rejects or, in some cases, misalignment. SMASH alignment sensors with a smaller spot size (~40 µm) and two additional probe lasers have been used to improve alignment quality and hence reduce mark/wafer rejects. However, due to process variability, alignment issues still persist. For example, the aluminum grain size, alignment mark trench deposition uniformity, alignment mark asymmetry, and variation in stack thicknesses all contribute to alignment signal variability, even within a single wafer. Here, a solution using the SMASH sensor that involves designing new alignment marks to ensure conformal coating is proposed. In addition, new techniques and controls during coarse wafer alignment (COWA) and fine wafer alignment (FIWA) are presented, including extra controls over wafer shape parameters, longer scan lengths on alignment marks, and a weighted light source between the far infrared (FIR) and near infrared (NIR) lasers for alignment. All of the above techniques, when implemented, reduced the wafer alignment reject rate from around 25% to less than 0.1%. Future work includes mark validation based on the signal response from the various laser colors. Finally, process monitoring using alignment parameters is explored.
To further shrink the contact and trench dimensions, Negative Tone Development (NTD) has become the de facto process at these layers. The NTD process uses a positive tone resist and an organic solvent-based negative tone developer, which leads to improved image contrast, a larger process window, and a smaller Mask Error Enhancement Factor (MEEF) [1]. NTD masks have high transmission values, leading to lens heating and, as observed here, wafer heating as well. Both lens and wafer heating contribute to overlay error; however, the effects of lens heating can be mitigated by applying lens heating corrections, while no such corrections yet exist for wafer heating. Although the magnitude of overlay error due to wafer heating is low relative to lens heating, ever-tightening overlay requirements imply that the distortions due to wafer heating will quickly become a significant part of the overlay budget. In this work, the effects, analysis, and observations of wafer heating on contact and metal layers of the 14nm node are presented. On product wafers, wafer heating manifests as a difference in the scan-up and scan-down signatures between layers. An experiment to further understand wafer heating is performed with a test reticle that is used to monitor scanner performance.
In recent years, overlay (OVL) control schemes have become more complicated in order to meet the ever-shrinking margins of advanced technology nodes. As a result, new challenges must be addressed for effective run-to-run OVL control. This work addresses two of these challenges with new advanced analysis techniques: (1) sampling optimization for run-to-run control and (2) the bias-variance tradeoff in modeling. The first challenge in a high order OVL control strategy is to optimize the number of measurements and their locations on the wafer, so that the "sample plan" of measurements provides high quality information about the OVL signature on the wafer with acceptable metrology throughput. We solve this tradeoff between accuracy and throughput by using a smart sampling scheme which utilizes various design-based and data-based metrics to increase model accuracy and reduce model uncertainty while avoiding wafer-to-wafer and within-wafer measurement noise caused by metrology, scanner, or process. This sampling scheme, combined with an advanced field-by-field extrapolated modeling algorithm, helps to maximize model stability and minimize on-product overlay (OPO). Second, the use of higher order overlay models means more degrees of freedom, which increases the capability to correct for complicated overlay signatures but also increases sensitivity to process or metrology induced noise. This is known as the bias-variance trade-off. A high order model that minimizes the bias between the modeled and raw overlay signature on a single wafer will also have higher variation from wafer to wafer or lot to lot, unless an advanced modeling approach is used. In this paper, we characterize the bias-variance trade-off to find the optimal scheme. The sampling and modeling solutions proposed in this study are validated by advanced process control (APC) simulations to estimate run-to-run performance, by lot-to-lot and wafer-to-wafer model term monitoring to estimate stability, and ultimately by high volume manufacturing tests that monitor OPO with densely measured OVL data.
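The bias-variance behavior described above can be illustrated with a minimal Python sketch: polynomial fits of increasing order are made to a synthetic 1-D overlay signature across many simulated wafers, and the residual bias is traded against the wafer-to-wafer spread of the fitted terms. The signature, noise level, and least-squares fitting are illustrative assumptions; the paper's actual modeling algorithm is not disclosed here.

```python
# Minimal bias-variance sketch: higher model order reduces bias to the true
# signature but inflates wafer-to-wafer variation of the fitted terms.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)               # normalized wafer positions
true_signature = 3 * x**3 - 2 * x        # assumed "true" overlay signature (nm)

n_wafers = 50
for order in (1, 3, 5, 9):
    residual_bias, coef_sets = [], []
    for _ in range(n_wafers):
        y = true_signature + rng.normal(0, 0.5, x.size)   # metrology noise
        coefs = np.polyfit(x, y, order)
        coef_sets.append(coefs)
        residual_bias.append(np.mean(np.abs(np.polyval(coefs, x) - true_signature)))
    # Bias: distance of the fitted model from the true signature.
    # Variance: wafer-to-wafer spread of the fitted model terms.
    term_sigma = np.mean(np.std(np.array(coef_sets), axis=0))
    print(f"order {order}: mean |bias| = {np.mean(residual_bias):.3f} nm, "
          f"mean term sigma = {term_sigma:.3f}")
```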
We analyze the performance of different customized models on baseliner overlay data and demonstrate a reduction in overlay residuals of ~10%. Smart sampling sets were assessed and compared with full wafer measurements. We found that grid performance can be maintained with one-third of the total sampling points while reducing metrology time by 60%. We also demonstrate the feasibility of achieving time-to-time matching using the scanner fleet manager, and thus identify tool drifts even when the tool monitoring controls are within spec limits. Finally, we explore the variation of the scanner feedback constant with illumination source.
In this paper we present a comparison study of the two overlay metrology methods, image-based overlay (IBO) and diffraction-based overlay (DBO), on programmed errors at critical layers of the 14nm technology node. Programmed OVL errors were introduced in certain fields during exposure. Full coverage OVL measurements were performed using both IBO and DBO. Linear, HOPC, and iHOPC models were built from the non-programmed fields and then subtracted from the programmed fields; the reticle contribution was also calculated and subtracted. This study shows that metrology measurement accuracy and stability can be assessed in this way, and that more accurate OVL control is enabled by selecting the better OVL measurement technique.
We demonstrate a cost-effective, automated, rule-based sparse sampling method that can detect the spatial variation of overlay errors as well as the overlay signature of the fields. Our technique satisfies the following three rules: (i) homogeneous distribution of ~200 samples across the wafer, (ii) an equal number of samples in the scan-up and scan-down conditions, and (iii) an equal number of samples on each overlay mark per field. When the rule-based sampling is implemented on the two products, the differences between the full wafer map sampling and the rule-based sampling are within the 3.5 nm overlay spec, with residuals M+3σ of 2.4 nm (x) and 2.43 nm (y) for Product A and 2.98 nm (x) and 3.32 nm (y) for Product B.
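The three rules above can be instantiated in a short Python sketch: candidates are stratified by scan direction and mark index for rules (ii) and (iii), and a greedy farthest-point heuristic enforces the homogeneous coverage of rule (i). The candidate attributes and the spread heuristic are illustrative assumptions, not the production rule engine.

```python
# Rule-based sparse sampling sketch: ~200 sites, balanced scan-up/scan-down,
# equal samples per overlay mark, spread homogeneously across the wafer.
import numpy as np

rng = np.random.default_rng(1)
n = 2000                                          # candidate measurement sites
sites = {
    "x": rng.uniform(-150, 150, n),               # mm, wafer coordinates
    "y": rng.uniform(-150, 150, n),
    "scan_up": rng.integers(0, 2, n),             # rule (ii): scan direction
    "mark": rng.integers(0, 4, n),                # rule (iii): 4 marks per field
}

def farthest_point_subset(idx, k):
    """Greedy farthest-point selection for rule (i): homogeneous coverage."""
    pts = np.column_stack((sites["x"][idx], sites["y"][idx]))
    chosen = [0]
    d = np.linalg.norm(pts - pts[0], axis=1)
    while len(chosen) < k:
        nxt = int(np.argmax(d))                   # farthest from chosen set
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(pts - pts[nxt], axis=1))
    return idx[chosen]

target_total = 200
plan = []
for up in (0, 1):                                 # equal scan-up / scan-down
    for mark in range(4):                         # equal samples per mark
        stratum = np.where((sites["scan_up"] == up) & (sites["mark"] == mark))[0]
        plan.extend(farthest_point_subset(stratum, target_total // 8))

print(f"selected {len(plan)} sites, "
      f"scan-up share = {sites['scan_up'][plan].mean():.2f}")
```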
Immersion-based patterning at the 20nm technology node and below has become very challenging for chip design, process, and integration, because multiple patterning is required to integrate a single design layer. Negative tone development (NTD) processes have been well accepted by industry experts for enabling technologies at 20 nm and below. 193i double patterning is the technology solution for pitches down to 80 nm. This imposes tight control of critical dimension (CD) variation in double patterning, where design patterns are decomposed into two different masks, as in litho-etch-litho-etch (LELE). CD bimodality has been widely studied in LELE double patterning. A significant portion of the CD tolerance budget is consumed by CD variation in double patterning.
The objective of this work is to study the process variation challenges, and their resolution, in the negative tone develop process for the 20 nm and below technology nodes. This paper describes the effect of dose slope on CD variation in the negative tone develop LELE process. This effect becomes even more challenging with a standalone NTD developer process due to queue-time (q-time) driven CD variation. We studied the impact of different stacks in combination with binary and attenuated phase-shift masks and estimated the dose slope contributions from stack and mask type individually. Mask 3D simulation was carried out to understand the theoretical aspects. In order to meet the minimum insulator requirement for the worst case on wafer, the overlay and critical dimension uniformity (CDU) budget margins have narrowed. Besides litho process and tool control using enhanced metrology feedback, variation control has other dependencies too. Color balancing between the two masks in LELE helps counter effects such as iso-dense bias and pattern shifting. Dummy insertion and improved decomposition techniques [2] using multiple lower priority constraints can help to a great extent. Innovative color-aware routing techniques [3] can also help achieve more uniform density and color-balanced layouts.
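The dose slope named above is simply dCD/dDose, and its contribution to the CD budget follows directly once it is extracted. A minimal Python sketch, assuming an approximately linear CD-to-dose response near the process window center and illustrative numbers rather than the paper's data:

```python
# Extract a dose slope (dCD/dDose) from dose-meander data by linear
# regression, then budget the CD variation caused by a given dose error.
import numpy as np

dose = np.array([28.0, 29.0, 30.0, 31.0, 32.0])   # mJ/cm^2, programmed doses
cd = np.array([52.1, 50.8, 49.9, 48.7, 47.6])     # nm, measured trench CD

slope, intercept = np.polyfit(dose, cd, 1)
print(f"dose slope = {slope:.2f} nm per mJ/cm^2")

dose_sigma = 0.15                                 # mJ/cm^2, assumed 1-sigma dose error
print(f"expected 3-sigma CD variation = {abs(slope) * 3 * dose_sigma:.2f} nm")
```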
CD uniformity requirements at the 20nm node and beyond have challenged the precision limits of CD-SEM metrology, conventionally used for scanner qualification and in-line focus/dose monitoring on product wafers. Optical CD metrology has consequently gained adoption for these applications because of its superior precision, but its adoption has been limited by challenges with long time-to-results and robustness to process variation. Both of these challenges stem from the limitations imposed by geometric modeling of the photoresist (PR) profile, as required by conventional RCWA-based scatterometry. Signal Response Metrology (SRM) is a new technique that obviates the need for geometric modeling by directly correlating focus, dose, and CD to the spectral response of a scatterometry tool. Consequently, it promises superior accuracy and robustness to process variation for focus/dose monitoring, and it reduces the time to set up a new measurement recipe from days to hours. This work describes the fundamental concepts of SRM and the results of its application to lithography metrology and control. These results include time-to-results and measurement performance data for focus, dose, and CD measurements performed on real devices and on design rule metrology targets.
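The signal-response idea, regressing focus and dose directly against raw spectra with no geometric model, can be sketched in a few lines of Python. The SRM algorithm itself is not disclosed; partial least squares (scikit-learn's PLSRegression) and the synthetic spectra below are purely illustrative stand-ins.

```python
# Model-free focus/dose regression sketch: learn a direct mapping from
# scatterometry spectra to programmed focus/dose offsets.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n_train, n_channels = 120, 300           # training sites x spectral channels
focus = rng.uniform(-60, 60, n_train)    # nm, programmed focus offsets
dose = rng.uniform(-2, 2, n_train)       # %, programmed dose offsets

# Synthetic spectra: two spectral response directions plus channel noise.
basis = rng.normal(size=(2, n_channels))
spectra = (np.outer(focus, basis[0]) + np.outer(dose, basis[1])
           + rng.normal(0, 0.1, (n_train, n_channels)))

model = PLSRegression(n_components=4)
model.fit(spectra, np.column_stack((focus, dose)))

pred = model.predict(spectra[:5])        # recover (focus, dose) from spectra
print("predicted (focus, dose):", np.round(pred, 2))
```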
KEYWORDS: Overlay metrology, Semiconducting wafers, Back end of line, Etching, Scatterometry, Metrology, Diffractive optical elements, Front end of line, Optical properties, Inspection
Persistently shrinking design rules and increasing process complexity require tight overlay control, making it imperative to choose the most suitable overlay measurement technique and a complementary target design. In this paper we describe an assessment of various target designs from FEOL to BEOL on a 20-nm process. Both scatterometry- and imaging-based methodologies were reviewed for several key layers on the A500LCM tool, which enables the use of both technologies. Different sets of targets were carefully designed and printed, taking into consideration the process and optical properties of each layer. The optimal overlay target for a given layer was chosen based on its measurement performance.
With the introduction of the N2x and N1x process nodes, leading-edge factories are facing the challenging demands of shrinking design margins. Previously uncorrected high-order signatures, and uncompensated temporal changes of high-order signatures, carry an important potential for improvement of on-product overlay (OPO). Until recently, static corrections per exposure (CPE), applied separately from the main APC correction, have been the industry standard for critical layers [1], [2]. This static correction is set up once per device and layer and then updated periodically or when a machine change point generates a new overlay signature. This setup is non-ideal for two reasons. First, any drift or sudden shift in tool signature between two CPE update periods can cause worse OPO and a higher rework rate, or, even worse, lead to yield loss at end of line. Second, these corrections are made from full map measurements that can be in excess of 1,000 measurements per wafer [3].
Advanced overlay control algorithms utilizing run-to-run (R2R) CPE can be used to reduce the overlay signatures on product in High Volume Manufacturing (HVM) environments. In this paper, we demonstrate the results of an R2R CPE control scheme in HVM. The authors show an improvement of up to 20% in OPO Mean+3Sigma values on several critical immersion layers at the 28 nm and 14 nm technology nodes, and a reduction of out-of-spec residual points per wafer (validated on full maps). These results are attained by closely tracking process tool signature changes by means of APC, with an affordable metrology load that is significantly smaller than full wafer measurements.
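The R2R CPE loop can be sketched with an exponentially weighted moving average (EWMA) filter, a common R2R estimator; the actual control algorithm and filter settings used in the paper are not public, and a single scalar overlay term per field stands in for the full correction set.

```python
# Run-to-run CPE sketch: an EWMA filter folds each lot's measured per-field
# residual into the applied correction, tracking a stable tool signature.
import numpy as np

rng = np.random.default_rng(3)
n_fields, n_lots, lam = 20, 30, 0.3      # lam: EWMA smoothing weight (assumed)
true_signature = rng.normal(0, 2.0, n_fields)   # nm, stable per-field signature

cpe = np.zeros(n_fields)                 # currently applied per-field correction
for lot in range(n_lots):
    # Measured overlay = uncorrected signature - applied correction + noise.
    measured = true_signature - cpe + rng.normal(0, 0.5, n_fields)
    cpe += lam * measured                # EWMA update of the correction
    if lot % 10 == 0:
        print(f"lot {lot:2d}: residual |mean|+3sigma = "
              f"{np.abs(measured).mean() + 3 * measured.std():.2f} nm")
```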
Various data mining and analysis tools are in use by high-volume semiconductor manufacturers throughout the industry that seek to provide robust monitoring and analysis capabilities for the purpose of maintaining a stable lithography process. These tools exist in both online and offline formats and draw upon data from various sources. This paper explores several possible use cases of run-time scanner data to provide advanced overlay and imaging analysis for scanner lithography signatures, and demonstrates several use cases for analyzing and stabilizing lithography processes. Applications include high order wafer alignment simulations in which an optimal alignment strategy is determined by dynamic wafer selection, statistical reporting and analysis of the lot report and the sub-recipe as a non-standard lot report, visualization of key lithography process parameters, and scanner fleet management (SFM).
As leading edge lithography moves to advanced nodes in a high-mix, high-volume manufacturing environment, automated control of critical dimension (CD) within wafer has become a requirement. Current methods to improve CD uniformity (CDU) generally rely upon field-by-field exposure corrections applied via factory automation or through the scanner sub-recipe. Such CDU control methods are limited to the lithography step and cannot be extended to the etch step. In this paper, a new method to improve CDU at the post-etch step by optimizing exposure at the lithography step is introduced. This solution utilizes GLOBALFOUNDRIES' factory automation system and KLA-Tencor's K-T Analyzer as the infrastructure to calculate and feed the necessary field-by-field exposure corrections back to the scanner, so as to achieve optimal CDU at the post-etch step. CDs at the post-lithography and post-etch steps are measured by scatterometry metrology tools and used by K-T Analyzer as the input for correction calculations. This paper explains in detail the philosophy as well as the methodology behind this novel CDU control solution. In addition, applications and use cases are reviewed to demonstrate the capability and potential of the solution. The feasibility of adopting this solution in a high-mix, high-volume manufacturing environment is discussed as well.
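The core of such a feedback step is small: convert each field's post-etch CD error into a dose offset through a known dose sensitivity. The sketch below assumes a linear sensitivity and illustrative numbers; the K-T Analyzer correction engine and the factory-automation interface are not reproduced.

```python
# Field-by-field dose trimming sketch: drive each field's post-etch CD back
# to target using an assumed linear dose sensitivity (nm per mJ/cm^2).
import numpy as np

rng = np.random.default_rng(4)
target_cd = 45.0                          # nm, post-etch CD target
dose_sens = -1.2                          # nm per mJ/cm^2, from a dose meander

post_etch_cd = target_cd + rng.normal(0.0, 0.6, 20)   # per-field measurements
# Per-field dose offsets that cancel each field's CD error under the model.
dose_offsets = (target_cd - post_etch_cd) / dose_sens

# Predicted post-trim CD; the added noise stands in for actuation/metrology
# limits, since the linear model alone would correct perfectly.
post_trim_cd = post_etch_cd + dose_sens * dose_offsets + rng.normal(0.0, 0.15, 20)
print(f"CDU 3-sigma before: {3 * post_etch_cd.std():.2f} nm, "
      f"after: {3 * post_trim_cd.std():.2f} nm")
```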
As process nodes continue to shrink, overlay budgets are approaching the theoretical performance of the tools. It becomes even more imperative to improve overlay performance in order to maintain the roadmap for advanced integrated circuit manufacturing. One of the critical factors in 20nm manufacturing is the overlay performance between the Middle of Line (MOL) and the Poly layer. The margin between these two layers was a process limiter, so it was essential to maintain very tight overlay control between them. Due to various process and metrology related effects, maintaining good overlay control became a challenge. In this paper we describe the various factors affecting overlay performance and the measures taken to mitigate or eliminate those factors.
As photolithography continues with 193nm immersion multiple patterning technologies for leading edge HVM process nodes, the production overlay requirement for critical layers in logic devices has almost reached the scanner hardware performance limit. To meet the extreme overlay requirements of an HVM production environment, this study investigates a new integrated overlay control concept for leading edge technology nodes that combines the run-to-run (R2R) linear or high order control loop, the periodic field-by-field or correction per exposure (CPE) wafer process signature control loop, and the scanner baseline control loop into a single integrated overlay control path through the fab host APC system. The goal is to meet fab requirements for overlay performance, lower the cost of ownership, and provide freedom of control methodology. In this paper, a detailed implementation of this concept is discussed, along with some preliminary results.
The NTD (Negative Tone Developer) process has been embraced as a viable alternative to the more conventional positive tone develop processes. Advanced technology nodes have necessitated the adoption of NTD processes to achieve tight design specifications in critical dimensions. Dark field contact layers are prime candidates for NTD processing due to its high imaging contrast. However, reticles used in NTD processes are highly transparent: the transmission of these masks can exceed 85%. Consequently, lens heating effects have a non-trivial impact that can limit NTD usability in a high volume mass production environment. At the same time, source mask optimized (SMO) freeform pupils have become popular. These can also cause lens heating effects which are localized in the lens, resulting in a unique drift behavior of each Zernike throughout the exposure of the wafers. In this paper, we present our experience and lessons learned from lens heating with NTD processes. The results of this study indicate that lens heating impacts the drift behavior of each Zernike during exposure, while the source pupil shape impacts the amplitude of the Zernike drift. Existing lens models should be finely tuned to establish the correct compensation for drift; computational modeling of lens heating is one such opportunity. Pattern shapes, such as dense and iso patterns, can exhibit different drift behavior during lens heating.
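A common functional form for per-Zernike lens heating drift is a first-order saturating exponential, Z(t) = A(1 - exp(-t/tau)); the sketch below uses that form with illustrative amplitudes and time constants, not the tuned scanner model described in the paper.

```python
# First-order lens-heating drift sketch: each Zernike saturates toward an
# amplitude A (pupil/transmission dependent) with time constant tau.
import numpy as np

def zernike_drift(t, amplitude, tau):
    """Saturating drift of one Zernike coefficient during lot exposure."""
    return amplitude * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 600, 7)                # s, time into the lot
for name, amp, tau in [("Z5 (strong heating, dense pupil pole)", 4.0, 180.0),
                       ("Z9 (weaker heating)", 1.5, 240.0)]:
    print(name, np.round(zernike_drift(t, amp, tau), 2), "nm")
```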
Advanced thermal annealing processes used for transistor enhancement at state of the art process nodes induce wafer grid deformations; RTA (Rapid Thermal Anneal) and LSA (Laser Scanning Anneal) are two examples. The High Order Wafer Alignment (HOWA) method is an effective wafer alignment strategy for wafers with a distorted grid signature, especially when wafer-to-wafer grid distortion variations are also present. However, usage of HOWA in a high volume production environment requires 1) careful initial determination of the optimum polynomial order and alignment sampling to be implemented, and 2) matched tool monitoring and control strategies and infrastructure to avoid potential HOWA induced drawbacks (e.g., alignment walking).
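The order-selection question in point 1) can be illustrated with a minimal Python sketch: fit polynomial grid models of increasing order to noisy alignment residuals carrying an anneal-like distortion. A 1-D radial toy grid stands in for the full 2-D HOWA basis, which is scanner-internal; the distortion shape and noise level are assumptions.

```python
# HOWA order trade-off sketch: higher polynomial order absorbs more of the
# anneal-induced grid distortion, at the cost of chasing mark noise.
import numpy as np

rng = np.random.default_rng(5)
r = np.linspace(0.1, 1.0, 24)                    # normalized mark radius
anneal_distortion = 6 * r**3 - 4 * r             # nm, RTA/LSA-like signature
meas = anneal_distortion + rng.normal(0, 0.8, r.size)   # mark measurements

for order in (1, 3, 5):
    fit = np.polyval(np.polyfit(r, meas, order), r)
    resid = meas - fit
    print(f"HOWA order {order}: residual 3-sigma = {3 * resid.std():.2f} nm")
```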
KEYWORDS: Metrology, Semiconducting wafers, Critical dimension metrology, Transmission electron microscopy, Etching, Process control, Reactive ion etching, Data modeling, High volume manufacturing
Metrology tools are increasingly challenged by the continuing decrease in device dimensions, combined with complex disruptive materials and architectures. These demands are not being met by existing or forthcoming metrology techniques individually. Hybrid Metrology (HM), the practice of combining measurements from multiple toolset types in order to enable or improve the measurement of one or more critical parameters, is being incorporated by the industry to resolve these challenges. Continuing our previous work, we now take HM from the lab into the fab. This paper presents the first-in-industry implementation of HM within a High Volume Manufacturing (HVM) environment. Advanced 3D applications are the first to use HM: 20nm contact etch and 14nm FinFET poly etch. The concept and main components of this Phase-1 host-based implementation are discussed. We show examples of communication protocols/standards that have been specially constructed for HM to share data between the metrology tools and the fab host in GLOBALFOUNDRIES, as well as the HM recipe setup and HVM results. Finally, we discuss our vision and phased progression/roadmap for the Phase-2 HM implementation to fully reap the benefits of hybridization.
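The core hybridization step can be sketched as inverse-variance weighting, a standard data-fusion form used here for illustration only; the implemented HM algorithm and host protocols are described above at the concept level, and the numbers below are hypothetical.

```python
# Hybrid metrology sketch: fuse two toolsets' estimates of one critical
# parameter, weighting each by the inverse of its measurement variance.
def hybrid_combine(x1, var1, x2, var2):
    """Fuse two measurements of the same parameter with known variances."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)        # fused value and fused variance

# Example: OCD (precise but model-coupled) + CD-SEM (direct but noisier).
cd, var = hybrid_combine(x1=32.1, var1=0.04, x2=31.6, var2=0.25)
print(f"hybrid CD = {cd:.2f} nm, sigma = {var ** 0.5:.2f} nm")
```

The fused variance is always smaller than either input variance, which is the quantitative sense in which hybridization "enables or improves" a measurement.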
The objective of this work was to study the trench and contact hole shrink mechanism in negative tone develop resist processes and its associated manufacturability challenges for the 20nm technology node and beyond. Process delay from post-exposure to develop, or "queue time", is studied in detail. The impact of queue-time delay on the resolved critical dimension (CD) is fully characterized for patterned resist and etched geometries as a function of various process changes. In this study, we assembled a detailed theoretical model and performed experimental work to correlate queue-time delay with acid diffusion within the resist polymer matrix. Acid diffusion is determined using both a modulation transfer function for diffusion and a simple approximation based on Fick's law of diffusion.
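The simple Fick's-law approximation named above can be sketched directly: the acid diffusion length grows as L = sqrt(2Dt), so the queue-time-driven CD change can be bounded from the diffusivity and the delay. The diffusivity value and the assumption that the trench CD shifts by roughly twice the diffusion length (once per edge) are illustrative; the paper's fitted parameters are not reproduced here.

```python
# Fick's-law queue-time sketch: diffusion length vs. post-exposure delay.
import math

def diffusion_length_nm(D_nm2_per_s, t_s):
    """1-D Fick diffusion length L = sqrt(2*D*t) for acid in the resist."""
    return math.sqrt(2.0 * D_nm2_per_s * t_s)

D = 0.05                                   # nm^2/s, assumed acid diffusivity
for t_min in (0, 30, 60, 120):
    L = diffusion_length_nm(D, t_min * 60)
    # Acid diffusing across both feature edges shifts the resolved CD by ~2L.
    print(f"q-time {t_min:3d} min: L = {L:.1f} nm, ~CD shift = {2 * L:.1f} nm")
```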