Tennant’s Law

It’s hard to make things small. It’s even harder to make things small cheaply.

I was recently re-reading Tim Brunner’s wonderful paper from 2003, “Why optical lithography will live forever” [1], when I was reminded of Tennant’s Law [2,3]. Don Tennant spent 27 years working in lithography-related fields at Bell Labs, and has been running the Cornell NanoScale Science and Technology Facility (CNF) for the last five years. In 1999 he plotted an interesting trend for direct-write-like lithography technologies: there is a power-law relationship between areal throughput (the area of a wafer that can be printed per unit time) and the resolution that can be obtained. Putting resolution (R) in nm and areal throughput (At) in nm^2/s, his empirically observed relationship looks like this:

At = 4.3 R^5

Even though the proportionality constant (4.3) represents a snapshot of technology capability circa 1995, this is not a good trend. When cutting the resolution in half (at a given level of technology capability), the throughput decreases by a factor of 32. Yikes. That is not good for manufacturing.

What’s behind Tennant’s Law, and is there any way around it? The first and most obvious problem with direct-write lithography is the pixel problem. Defining one pixel element as the resolution squared, a constant rate of writing pixels will lead to a throughput that goes as R^2. In this scenario, we always get an areal throughput hit when improving resolution just because we are increasing the number of pixels we have to write. Dramatic increases in pixel writing speed must accompany resolution improvement just to keep the throughput constant.

But Tennant’s Law shows us that we don’t keep the pixel writing rate constant. In fact, the pixel throughput (At/R^2) goes as R^3. In other words, writing a small pixel takes much longer than writing a big pixel. Why? While the answer depends on the specific direct-write technology, there are two general reasons. First, the sensitivity of the photoresist goes down as the resolution improves. For electron-beam lithography, higher resolution comes from using a higher energy (at least to a point), since higher-energy electrons exhibit less forward scattering, and thus less blurring within the resist. But higher-energy electrons also transfer less energy to the resist, thus lowering resist sensitivity. The relationship is fundamental: scattering, the mechanism that allows an electron to impart energy to the photoresist, also causes a blurring of the image and a loss of resolution. Thus, reducing the blurring to improve resolution necessarily results in lower sensitivity and thus lower throughput.
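To put numbers on these scalings, here is a quick Python sketch using the formula above (the circa-1995 constant is just a snapshot of technology capability, so only the ratios matter):

```python
# Tennant's Law: areal throughput At = 4.3 * R^5 (R in nm, At in nm^2/s)
def areal_throughput(R):
    return 4.3 * R**5

# One pixel = R^2 of area, so the pixel rate At/R^2 scales as R^3
def pixel_rate(R):
    return areal_throughput(R) / R**2

for R in (100, 50, 25):
    print(f"R = {R:3d} nm: At = {areal_throughput(R):.2e} nm^2/s, "
          f"{pixel_rate(R):.2e} pixels/s")

# Halving the resolution cuts areal throughput by 2^5 = 32x
# and cuts the pixel-writing rate by 2^3 = 8x
print(round(areal_throughput(100) / areal_throughput(50)))  # 32
print(round(pixel_rate(100) / pixel_rate(50)))              # 8
```

Viewed this way, the factor of 32 from halving the resolution splits into a factor of 4 from having more pixels to write and a factor of 8 from writing each pixel more slowly.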

(As an aside, higher electron energy results in greater backscattering, so there is a limit to how far resolution can be improved by going to higher energy.)

Chemically amplified (CA) resists have their own throughput versus resolution trade-off. CA resists can be made more sensitive by increasing the amount of baking done after exposure. But this necessarily results in a longer diffusion length of the reactive species (the acid generated by exposure). The greater sensitivity comes from one acid (the result of exposure) diffusing around and finding multiple polymer sites to react with, thus “amplifying” the effects of exposure and improving sensitivity. But increased diffusion worsens resolution – the diffusion length must be kept smaller than the feature size in order to form a feature.

Charged particle beam systems have another throughput/resolution problem: like charges repel. Cranking up the current to get more electrons to the resist faster (that is, increasing the electron flux) crowds the electrons together, increasing the amount of electron-electron repulsion and blurring the resulting image. These space-charge effects ultimately doomed the otherwise intriguing SCALPEL projection e-beam lithography approach [4].

The second reason that smaller pixels require more write time has to do with the greater precision required when writing a small pixel. Since lithography control requirements scale as the feature size (a typical specification for linewidth control is ±10%), one can’t simply write a smaller pixel with the same level of care as a larger one. And it’s hard to be careful and fast at the same time.

One reason why smaller pixels are harder to control is the stochastic effects of exposure: as you decrease the number of electrons (or photons) per pixel, the statistical uncertainty in the number of electrons or photons actually used goes up. The uncertainty produces linewidth errors, most readily observed as linewidth roughness (LWR). To combat the growing uncertainty in smaller pixels, a higher dose is required.
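A quick sketch of the counting statistics behind this (the 1% and 2% noise levels here are illustrative numbers of my choosing, not measured values):

```python
# Shot noise: for a Poisson process, the relative uncertainty in the
# number of electrons (or photons) hitting a pixel is 1/sqrt(N)
def electrons_for_noise(sigma_rel):
    # Holding relative dose noise at sigma_rel requires N = 1/sigma_rel^2
    return 1.0 / sigma_rel**2

# Cutting the tolerable noise in half costs 4x the electrons per pixel
print(electrons_for_noise(0.02))   # ~2,500 electrons per pixel
print(electrons_for_noise(0.01))   # ~10,000 electrons per pixel

# Dose (electrons per unit area) for a fixed per-pixel count scales as
# 1/R^2, so smaller pixels need a higher dose just to hold noise constant
def dose(sigma_rel, R):
    return electrons_for_noise(sigma_rel) / R**2   # electrons per nm^2

print(dose(0.02, 25) / dose(0.02, 50))   # halving the pixel: 4x the dose
```

And a higher dose, of course, means a longer write time per pixel.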

Other throughput limiters can also come into play for direct-write lithography, such as the data rate (one must be able to supply the information as to which pixels are on or off at a rate at least as fast as the pixel writing rate), or stage motion speed. But assuming that these limiters can be swept away with good engineering, Tennant’s Law still leaves us with two important dilemmas: as we improve resolution we are forced to write more pixels, and the time to write each pixel increases.

For proponents of direct-write lithography, the solution to its throughput problems lies with multiple beams. Setting aside the immense engineering challenges involved with controlling hundreds or thousands of beams to a manufacturing level of precision and reliability, does a multiple-beam approach really get us around Tennant’s Law? Not easily. We still have the same two problems. Every IC technology node increases the number of pixels that need to be written by a factor of 2 over the previous node, necessitating a machine with at least twice the number of beams. But since each smaller pixel takes longer to write, the real increase in the number of beams is likely to be much larger (more likely a factor of 4 rather than 2). Even if the economics of multi-beam lithography can be made to work for one technology node, it will look very bad for the next technology node. In other words, writing one pixel at a time does not scale well, even when using multiple beams.
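Here is the back-of-the-envelope version of that compounding (the starting beam count and the factor-of-2 multipliers are hypothetical round numbers taken from the argument above):

```python
def beams_needed(beams_now, nodes_ahead, pixel_factor=2.0, time_factor=2.0):
    """Beams required to hold wafer throughput constant, assuming each node
    multiplies the pixel count by pixel_factor and the per-pixel write time
    by time_factor (round numbers from the argument above)."""
    return beams_now * (pixel_factor * time_factor) ** nodes_ahead

# A hypothetical tool with 1,000 beams today:
for n in range(4):
    print(f"{n} node(s) ahead: ~{beams_needed(1000, n):,.0f} beams")
```

At a factor of 4 per node, a 1,000-beam tool would need roughly 64,000 beams just three nodes later, which is why the economics look worse at every successive node.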

In a future post, I’ll talk about why Tennant’s Law has not been a factor in optical lithography – until now.

[1] T. A. Brunner, “Why optical lithography will live forever”, JVST B 21(6), p. 2632 (2003).
[2] Donald M. Tennant, Chapter 4, “Limits of Conventional Lithography”, in Nanotechnology, Gregory Timp Ed., Springer (1999) p. 164.
[3] Not to be confused with Roy Tennant’s Law of Library Science: “Only librarians like to search, everyone else likes to find.”
[4] J.A. Liddle, et al., “Space-charge effects in projection electron-beam lithography: Results from the SCALPEL proof-of-lithography system”, JVST B 19(2), p. 476 (2001).

Ritual Wedding Readings

Ah, the beauty of the “pick and choose” school of theology.

I can’t even begin to count all the weddings I have been to that included a reading of what is probably the most popular wedding Bible verse ever: 1 Corinthians 13. In part, it reads

“Love is patient, love is kind. It does not envy, it does not boast, it is not proud … Love never fails. … And now these three remain: faith, hope and love. But the greatest of these is love.” (NIV)

I agree, the sentiments of this passage are quite beautiful. But I wonder how many soon-to-be-married couples have read all of 1 Corinthians and know of Paul’s opinions on marriage. Paul is not exactly the guy I would propose to give a toast at my wedding. For example, 1 Corinthians 7 says

“Now to the unmarried and the widows I say: It is good for them to stay unmarried, as I do. But if they cannot control themselves, they should marry, for it is better to marry than to burn with passion. … Are you pledged to a woman? Do not seek to be released. Are you free from such a commitment? Do not look for a wife. But if you do marry, you have not sinned … But those who marry will face many troubles in this life, and I want to spare you this.”

You don’t hear that one read at too many weddings. I admit that I have faced a few troubles in this life, and the occasional passion did burn (when I was a bit younger). But all in all, I don’t look to Paul for advice in marriage. I think he could have benefited from a little therapy.

Quote of the Day

This morning my daughter Sarah said to me “‘Have’ is my favorite word.” Caught a bit off guard, but used to the non sequiturs that come with being six years old, I replied “Oh really, why is that?” “Well,” she answered, “it’s because ‘have’ breaks the rules.”

I’m so proud of my little girl.

A view from the top (20)

It is an article of faith among semiconductor industry watchers that the last 20 years have seen considerable consolidation among semiconductor makers, with further consolidation all but inevitable. Of course, we can all point to mergers (TI and National being the latest) and players exiting from the market (NEC was the #1 chipmaker in the world in 1991, but now is out of the business). But does the data support this view of rampant consolidation?

I’ve been looking over 24 years of annual top 20 semiconductor company revenue data compiled by Gartner Dataquest (1987 – 1999) and iSuppli (2000 – 2010), and the results show a more nuanced picture. As I noted in my last post on this topic, foundries are excluded from this accounting – their revenue is attributed to the companies placing the orders. Thus, this is a semiconductor product-based top-20 list, not a semiconductor fab-based top-20 list. With that in mind, let’s look at the trends.

Consider first the fraction of the total semiconductor market controlled by the top 20 semiconductor companies. The trendline shows a 15-point drop in market share over 24 years for the top 20, or about a 0.7-point decline on average each year. In other words, the rest of the semiconductor companies (those not in the top 20) saw their market share grow dramatically, from about 23% to 38%.

Semiconductor Top 20 Market Share

Likewise, the top 10 semiconductor companies saw their market share drop by ten points, from about 56% to 46% (about 0.45 points per year). The top five companies, on the other hand, have kept a roughly constant share of about one-third of the market since 1987 – the trendline slope is not significantly different from zero (-0.1 points per year).

Semiconductor Top 5 Market Share

But it’s the top two semiconductor makers that show the most interesting trend. The top 2 have seen a six-point rise in their market share, to 22% today, an increase of about 0.3 points per year. The top three makers have seen a more modest 0.15-point increase in market share per year since 1987. Thus, consolidation of market share has come only at the very top of the market – the top 2, to be specific. For the rest of the industry, the market has been spreading out among more players. Those top 2 players are now, of course, Intel and Samsung. But in 1987 they were NEC and Toshiba (Intel was #10 then, and Samsung wasn’t on the list).

Semiconductor Top 2 Market Share

So is the megatrend of semiconductor industry consolidation a myth? Yes and no. From a product perspective, the data is clear. The top two companies have grown in dominance, but for the remaining 80% of the market or so revenue is being spread over a wider array of companies over time. Foundries can be given some credit for the increased democratization of the market, but the trends were in place before foundries even came into existence. In fact, it is more accurate to say that foundries are a result rather than a cause of this democratization. It is the nature of the semiconductor product itself which has driven this increase in the long tail of the distribution of companies.

While there have always been a few blockbuster product categories (memory and microprocessors) where size matters, the vast majority of semiconductor revenue comes from niche (or at least small market share) products. Big companies don’t excel at making lots of niche products. Thus, small to medium-sized companies who stay close to their customers are able to compete well against their larger rivals. It is likely that this trend will continue so long as Moore’s Law continues.

Moore’s Law keeps the few big players still able to invest in new fabs quite busy, and they need big market categories to justify their big investments. There has been considerable consolidation in the industry if you consider fabs rather than products: there are now only about five companies likely to stay at the front of Moore’s Law over the next few years, and these top five manufacturers have seen growth in their share of fab output. But I doubt that a smaller number of fabs competing at the very high end of the market will somehow reverse the trend of dispersion for the other 80% of the market. That is, until Moore’s Law ends. Then these big companies, with their big fabs, are likely to turn their attention to markets that once seemed too diffuse to worry about. What happens then, in a post-Moore’s Law world, is anyone’s guess.

On the Road

I’m jealous. My friend Ben Woodard has decided to move from Austin back to his home state of California. Don’t get me wrong, I have no desire to leave Austin or move to California. I’m jealous of the way he has decided to go – in style. He is driving across the country, using no highways, in his DeLorean. With his friend Michael for company, the two of them are off on an adventure of unknown proportions. I’m looking forward to his modern version of On the Road, full of blog posts along the way, no doubt. Good luck, Ben!

Ben's DeLorean 1
Ben's DeLorean 2
Ben's DeLorean 3

Happy 7E9 Day!

In honor of Halloween, here is a fright for you: today is the day, according to one UN estimate, that the world population reached 7 billion. (Estimates vary – the U.S. Census Bureau puts that date at next March.) The population is growing by about a million people every five days. If current trends continue, we’ll see 8 billion people in 15 years and 9 billion by 2043. That means that, if I am lucky enough to live that long, I will see a tripling of the world population in my lifetime (it was 3 billion in 1960, when I was born).
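For the record, the arithmetic roughly checks out (this is a constant-rate extrapolation; actual growth is slowing, which is why the projections say 15 years rather than 14):

```python
# Back-of-the-envelope check on the growth numbers above
per_year = (1_000_000 / 5) * 365          # a million every 5 days -> 73M/year
years_to_8B = (8e9 - 7e9) / per_year      # constant-rate extrapolation
print(f"{per_year / 1e6:.0f} million people per year")
print(f"8 billion in about {years_to_8B:.0f} years at a constant rate")
```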

For this auspicious date I’ve updated my population essay and data spreadsheet with the latest numbers. Enjoy.

The top 20 ain’t what it used to be

Looking back on data of the annual top 20 semiconductor companies since 1987, it’s amazing how much has changed. In my last post on this topic, I looked at all the companies that went bankrupt, spun-out, or merged their way into or out of the top 20 list. Change is definitely a constant in this field. Now, let’s look at the makeup of the 2010 list of top semiconductor companies. Here is the list, as generated by iSuppli.

1 Intel Corporation
2 Samsung Electronics
3 Toshiba Semiconductor
4 Texas Instruments
5 Renesas Electronics
6 Hynix
7 STMicroelectronics
8 Micron Technology
9 Qualcomm
10 Broadcom
11 Elpida Memory
12 Advanced Micro Devices
13 Infineon Technologies
14 Sony
15 Panasonic Corporation
16 Freescale Semiconductor
17 NXP
18 Marvell Technology Group
19 MediaTek
20 NVIDIA

It’s important to note that foundries are excluded from this accounting – their revenue is attributed to the companies placing the orders. Thus, this is a semiconductor product-based top-20 list, not a semiconductor maker-based top-20 list.

And that distinction is obvious when looking at the make-up of the 2010 top-20. Six of the top 20 companies are fabless. Another seven are “fab-lite”, meaning they have stopped investing in new fabs or leading-edge manufacturing. That leaves just seven leading-edge semiconductor manufacturers in the top 20. Of those, four make mostly memory (80% of Samsung’s revenue came from memory), two make mostly logic, and one (Toshiba) makes a fair amount of both.

As a point of reference, if TSMC’s revenue were attributed to TSMC rather than their customers, they would be in fourth place, just barely behind Toshiba. The next two largest foundries, UMC and GlobalFoundries, would find themselves near the bottom of the top 20.

So, we have seven semiconductor manufacturers and three foundries that claim to still want to invest in leading-edge manufacturing capacity. That’s a far cry from just 10 years ago, when all 20 of the top 20 semiconductor companies were committed to building new leading-edge fabs. And even this list of 10 companies can’t really afford to play at the bleeding edge. Only five of them (Intel, Samsung, Toshiba, TSMC, and Hynix) have over $10B/year in semiconductor revenue, probably the minimum needed to build that next $5B mega fab. Add EUV and 450mm wafers into the mix, and you can see that there will be very few players at this ultra-high end of manufacturing.

It is conventional wisdom that the last decade has been one of extreme consolidation in the semiconductor business. Next, I’ll look at the numbers to see how well that conventional wisdom holds up.

A Summer in Austin to Remember

It has finally cooled down in Austin (it only got up to about 88F today), so it is probably safe to talk about the brutally hot summer that is now behind us. It was the hottest summer on record (using the Camp Mabry records, since 1897), and on September 29, we had our 90th day at or above 100F. Wow. For fun, I looked up some other statistics about the Austin summers:

Average number of 100 degree days each year in Austin: 12.3
Greatest number of 100 degree days in one year (before this year): 69 (1925). In 2009 we had 68.
Average date of the first 100 degree day: July 11th
Average date of the last 100 degree day: August 20th
Earliest 100 degree day: May 4th (1984)
Latest 100 degree day: October 2nd (1938)
Years without a 100 degree day: 10 (1987, 1979, 1975, 1973, 1968, 1919, 1908, 1907, 1906, 1904)
Highest temperature on record: 112 (September 5, 2000 and August 28, 2011)

Of course, our big fear is that this year is the new normal. A scary thought.

Quote of the Day

“This world is a strange madhouse. Currently, every coachman and every waiter is debating whether relativity theory is correct. Belief in this matter depends on political party affiliation.”

– Albert Einstein (1920)

(Amazing how some things never change.)

Musings of a Gentleman Scientist