The Stochastic Behavior of Optical Images and Its Impact on Resolution

The Picture of Light

When we consider an optical image, we usually think of it as a pure, whole entity, with no doubt as to whether it or any part of it is there or not. This is, in fact, part of the classical picture of light, which treats it most simply as a ray or, more rigorously, as an electromagnetic wave. Modern quantum physics has augmented this picture by treating light as consisting of particles (photons). Giving light a particulate nature makes it possible to quantify the amount of light: each photon carries a precise amount of energy, so the total amount, or dose, of light illuminating an object immediately gives the number of photons. Single photons can even be detected, e.g., here: https://phys.org/news/2017-12-single-photon-detector.html. It is easy to accept that a brighter image means more photons than a dimmer one; at the same time, it is harder to recognize fine details in a dim image.

Resolution becomes statistical

Resolution is, in the strictest sense, a metric for detection ability. Detection itself implies a threshold, which may be associated with neuron firing, photographic film or photoresist exposure, or a photocurrent. The classical expression for a resolution limit is the Rayleigh criterion: 0.61 wavelength/numerical aperture, where the numerical aperture is the sine of the largest angle with the optical axis that can be captured by the focusing lens. This expression does not involve the photon number at all. In actuality, though, photon number becomes critical at low levels, in a probabilistic and statistical sense. The sample size, i.e., the number of observations of the resolution, becomes key. Furthermore, since photon arrivals at the detector are random, a specific formalism, known as the Poisson distribution, is used to describe them. Already, at this level, light has taken on a stochastic nature.

The Poisson Distribution and Shot Noise

The Poisson distribution models the number of times an event (such as a photon arrival) occurs in a given interval of time or space. Repeated measurements give the average rate of occurrence. However, there is always a random variation in the actual number, quantified by the standard deviation, which for a Poisson distribution is the square root of the average value. When we have a large number of observations (which we usually do), the Poisson distribution is essentially a normal distribution. A normal distribution quantifies rare events by the small fraction of measurements lying more than a given number of standard deviations from the average. For example, 68% of the measurements fall within 1 standard deviation of the average, while only 2 out of a billion fall outside 6 standard deviations. We may expect that, in a billion measurements, one would be 6 standard deviations above the mean and another 6 standard deviations below it. Since the standard deviation is tied to the average value, we can calculate what % error is expected for points that are so many deviations away from the mean. For an average of 100 photons detected, the standard deviation is 10 photons, so 6 standard deviations entail an excess or deficiency of 60 photons, or 60% of the mean. On the other hand, if we had 10000 photons detected, 6 standard deviations would entail an excess or deficiency of 600 photons. While this is a greater number of photons, it is only 6% of the mean. So a larger average detected photon number means a smaller % error within the measurement population.
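These two examples can be checked with a few lines of Python (the photon counts here are the illustrative values from the text, not measurements):

```python
import math

def six_sigma_percent_error(mean_photons):
    """Relative error (in %) corresponding to a 6-standard-deviation
    fluctuation of a Poisson-distributed photon count."""
    sigma = math.sqrt(mean_photons)       # Poisson: std dev = sqrt(mean)
    return 100.0 * 6 * sigma / mean_photons

print(six_sigma_percent_error(100))       # 60.0 -> 60% of the mean
print(six_sigma_percent_error(10000))     # 6.0  -> only 6% of the mean
```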

This excess or deficiency of photons, described by Poisson statistics, is also known as shot noise.

The Impact on Lithographic Resolution

With the ruthless march of Moore's Law, the "7nm" node is now upon us. The smallest feature sizes used for 7nm are already too small for lithography tools using ArF lasers operating at 193 nm wavelength to print in a single exposure. A new wavelength, 13.5 nm (also known as extreme ultraviolet, or EUV), has finally emerged in tools installed at foundries such as TSMC and Samsung. By the Rayleigh criterion, the EUV tool, with numerical aperture = 0.33, has a better resolution (0.61 × 13.5/0.33 ≈ 25 nm) than the immersion tool, with numerical aperture = 1.35 (0.61 × 193/1.35 ≈ 87 nm). However, EUV photons are very difficult to produce. Consequently, the number of EUV photons producing images is much smaller than the number of 193 nm photons producing images.

To get an idea of the difference, consider a typical dose used in lithography, 30 mJ/cm2. Per square nanometer receiving the light, this corresponds to 292 ArF photons of 193 nm wavelength (energy = 6.425 eV) or a mere 20 EUV photons (13.5 nm wavelength, or 91.85 eV energy), almost a factor of 15 less! In lithography, it is also important to maintain proper dose control within a small area, say, 10% of the critical dimension (CD). Since 193 nm lithography is limited to feature sizes of ~40 nm, we can get a visual estimate of the impact of the photon shot noise. The image produced by the lithography is projected from a photomask. Due to the resolution limit of the optics, at the 40 nm level, the actual photon-based image (also known as the aerial image) is sinusoidal. The 30 mJ/cm2 here is the threshold dose level at which the photoresist either washes away or is retained.
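The photon counts quoted above follow directly from the dose and the photon energy E = hc/λ; a minimal sketch (unit conversions only, no lithography model):

```python
# Photons per nm^2 delivered by a given dose, from E_photon = hc/lambda.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def photons_per_nm2(dose_mj_per_cm2, wavelength_nm):
    dose_j_per_nm2 = dose_mj_per_cm2 * 1e-3 * 1e-14    # 1 cm^2 = 1e14 nm^2
    photon_energy_j = H * C / (wavelength_nm * 1e-9)
    return dose_j_per_nm2 / photon_energy_j

arf = photons_per_nm2(30, 193)     # ~292 photons/nm^2
euv = photons_per_nm2(30, 13.5)    # ~20 photons/nm^2
print(round(arf), round(euv), round(arf / euv, 1))   # ratio ~14.3
```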

The arrow indicates a region of the image (at x = -20 nm) which we choose to focus on, sampling the photon number within a local 4 nm x 4 nm area (4 nm is 10% of the 40 nm CD target). We can imagine that in a chip there are easily a billion such images, with an average of around 4700 photons. Such a large number results in very little shot noise, and so the stochastic impact is small. The blue curve gives the classic expected image, while the gray and orange curves give the bands for -6 and +6 standard deviations. The red dotted curve gives the extreme case of -6 standard deviations at our sampling point (x = -20 nm); we see it is quite small and hardly moves the feature edge after thresholding.
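The size of that ±6 standard deviation band can be estimated with the ~292 photons/nm^2 figure from the text (a sketch assuming a uniform count over the box, ignoring the sinusoidal image shape):

```python
import math

# 4 nm x 4 nm sampling box at ~292 photons/nm^2 (193 nm case).
mean = 292 * 4 * 4                 # ~4672 photons, close to the ~4700 cited
sigma = math.sqrt(mean)            # Poisson standard deviation, ~68 photons
swing_pct = 100 * 6 * sigma / mean
print(round(swing_pct, 1))         # ~8.8% at the 6-sigma extreme
```

A ±9% excursion around the mean is too small to push this region across the dose threshold, which is why the feature edge barely moves.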

The situation is very different for the same dose for an EUV application. For the 7nm node, the CD could be on the order of 20 nm, and our sampling area is also smaller, 2 nm x 2 nm. Consequently, we expect more significant photon shot noise, from the fewer photons for the same energy dose as well as from the smaller sampled area.

Here, we choose a sampling location within the above-threshold region, at x = -8 nm. We find that if this region is at the -6σ level, the photon number deviation causes it to fall below threshold. In other words, we now have locally unexposed regions where we would classically expect the exposure threshold to be cleared. Depending on the resist, this could lead to severe effects, including failure to print.
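The same back-of-envelope estimate shows why the EUV case crosses the threshold (again a sketch assuming a uniform ~20 photons/nm^2; near x = -8 nm the sinusoidal image gives fewer still):

```python
import math

# 2 nm x 2 nm sampling box at ~20 photons/nm^2 (EUV case).
mean = 20 * 2 * 2                  # ~80 photons on average
sigma = math.sqrt(mean)            # ~8.9 photons
swing_pct = 100 * 6 * sigma / mean
print(round(swing_pct))            # ~67% at the 6-sigma extreme
```

A ±67% excursion easily drops a region sitting modestly above threshold to below it.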

The dose may be increased to reduce the shot noise. We take it up to 80 mJ/cm2, which is at the high end of the range being tested; the tool gets too slow at higher doses. We see that although points sampled in the center are now safe at the 6σ (1 ppb) level, the drop-below-threshold effect persists near the edge.

We see that EUV would require significantly higher doses to make up enough photons in the smaller sampled area. In fact, we can estimate the required enhancement factor as (4 nm x 4 nm)/(2 nm x 2 nm) * (193/13.5) ~ 57. It would require a much higher EUV power source than presently available to enable the required dose of 1.7 J/cm2.
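The enhancement factor quoted above is just the product of the area ratio and the photon-energy ratio:

```python
area_ratio = (4 * 4) / (2 * 2)        # 4 nm sampling box -> 2 nm box
energy_ratio = 193 / 13.5             # each EUV photon carries ~14.3x the energy
enhancement = area_ratio * energy_ratio
required_dose = 30e-3 * enhancement   # J/cm^2, from the 30 mJ/cm^2 baseline
print(round(enhancement), round(required_dose, 2))   # 57, ~1.72 J/cm^2
```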

The alternative is to consider the practical resolution at the same dose. This can also be estimated. The original 193 nm sampling area, 4 nm x 4 nm, would have to be enlarged by 193/13.5 ~ 14.3 times, giving roughly 15 nm x 15 nm. In other words, at 30 mJ/cm2, the practical resolution for EUV is 150 nm if it is to match the same level of stochastic disturbance (6 standard deviations) from photon shot noise as 193 nm lithography at 40 nm. This falls far short of the classical, non-stochastic expectation for EUV.
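The matching-area estimate works the other way around: scale the 4 nm box area up by the photon-energy ratio and take the square root for the box side:

```python
import math

side_193 = 4.0                                # nm, 10% of the 40 nm CD
area_scale = 193 / 13.5                       # need ~14.3x more sampled area at EUV
side_euv = side_193 * math.sqrt(area_scale)   # ~15 nm box side
print(round(side_euv))   # ~15 nm, implying a CD of ~150 nm (box is 10% of CD)
```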

However, the stochastic nature of EUV lithography is already acknowledged, such as in work by IMEC (presented at 2018 Semicon Korea):

For such a critical application, the stochastic nature of EUV photon shot noise needs to be well understood.

Frederick Chen

Looking at tech differently


A 2D top view rendering with pixel-by-pixel stochastic calculations gives a more realistic picture: https://www.linkedin.com/posts/frederick-chen-7648bb7_lithography-euv-ler-activity-6817029351813607425-KEUg

Frederick Chen

A third article will attempt to explain the two-sided pattern cliff occurring from the existence of two types of stochastic defects. The high dose cliff may forestall the necessary dose increase to reduce the % variation.

Frederick Chen

A second article focused on the impact on the use of sub-resolution assist features, as these features are supposed to be too small to resolve, hence smaller area and lower dose, i.e., fewer photons. https://www.linkedin.com/pulse/stochastic-printing-sub-resolution-assist-features-frederick-chen/

Moshe Dolejsi
Integration Engineer @ Intel Corporation, formerly PhD in Directed Self Assembly @ UChicago

Thanks for the overview. Do you happen to have a link to the paper presented by imec at semicon 2018? I'd be interested to compare defectivity numbers of EUV vs DSA.  Also I'm sure you saw that Samsung seems to be using a dose of about 50 in order to hit 63 WPH on their NXE 3400s (1500 wpd/24 = 63 wph use the NXE 3XXX chart with a source power of 250 gives you a power to dose of ~ 5). Any thoughts on the losses they are taking running the tools like this? Is it just a matter of avoiding sunk costs from their buy in?

Miles Gehm
Process Engineering/Foundry Transfer/Engineering Management

Very thoughtful and well written. Thanks.
