Moore's Law is not Ending Soon and the Reason May Surprise You
Jim Keller recently gave a fascinating and far-ranging interview on the AI Podcast. You can find it at Moore's Law, Microprocessors, Abstractions, and First Principles.
One of the many topics of discussion was the often-predicted death of Moore's Law. In case you've never heard of Jim Keller before, this intro makes it immediately clear why he may have special insight on the topic:
Jim Keller is a legendary microprocessor engineer, having worked at AMD, Apple, Tesla, and now Intel. He's known for his work on the AMD K7, K8, K12, and Zen microarchitectures and the Apple A4 and A5 processors, and for co-authoring the specifications for the x86-64 instruction set and the HyperTransport interconnect.
Before we can understand why Moore's Law is not ending soon, we need to understand the idea of a diminishing returns curve (this is a gloss of the talk; any errors or omissions are mine, but I tried to capture the feel of it):
A project first goes up and then shows diminishing returns over time. To get to the next level you need to start a new project. The initial starting point of that new project will be lower than the return of the old project, but it will end higher. You have two kinds of fear: short-term disaster and long-term disaster. People with a quarter-by-quarter business objective are terrified of changing anything. People who are building for a long-term objective know that the short-term limitations block them from long-term success. You can do multiple projects at the same time, but you can't make everyone happy.
I wasn't sure where he could go with this idea, but he uses diminishing returns to give an insider's view of how microprocessors evolve over time and why people keep errantly declaring the death of Moore's Law.
Like Schrödinger’s Cat, the death of Moore's Law depends on the observer:
People think Moore's Law is one thing: transistors get smaller. But under the sheets there are literally thousands of innovations that each have their own diminishing returns curve. The result has been an exponential improvement. We keep inventing new innovations. If you're an expert on one of those diminishing returns curves and you can see its plateau, you'll probably tell people this is done. Meanwhile some other group of people are doing something different. That's just normal.
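To make the "thousands of overlapping curves" idea concrete, here's a minimal sketch (mine, not Keller's) that sums a handful of made-up S-curves. Each curve plateaus on its own, but because each new one plateaus higher than the last, the combined total keeps compounding roughly exponentially. All the parameters are illustrative assumptions, not real process data.

```python
import math

def s_curve(t, plateau, midpoint, rate=1.0):
    """One innovation's diminishing-returns curve: slow start, rapid middle, plateau."""
    return plateau / (1.0 + math.exp(-rate * (t - midpoint)))

# Each new "project" starts later but plateaus higher than the one before it.
# Plateaus and midpoints are made-up numbers chosen only to show the shape.
projects = [{"plateau": 2.0 ** i, "midpoint": 5.0 * i} for i in range(6)]

for t in range(0, 31, 5):
    combined = sum(s_curve(t, p["plateau"], p["midpoint"]) for p in projects)
    print(f"t={t:2d}  combined improvement ~ {combined:7.2f}")
```

Past its midpoint, each individual curve looks "done", which is exactly the vantage point from which an expert on one curve might declare the whole thing finished, even while the combined total keeps climbing.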
We still have a lot of runway on chip evolution:
A modern transistor is 1000 x 1000 x 1000 atoms. You get quantum effects down at 2-10 atoms. You can imagine a transistor down to 10 x 10 x 10 atoms. That's a million times smaller. There are techniques now to put down atoms at a single atomic layer. You can place atoms if you want to. It's just that, from a manufacturing process perspective, if placing an atom takes 10 minutes and you need to put 10^23 atoms together to make a computer, it would take a long time. The innovation stack is very broad.
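As a quick sanity check on those figures (my back-of-the-envelope arithmetic, not Keller's), treating a transistor as a simple cube of atoms:

```python
# Back-of-the-envelope arithmetic for the figures in the quote above.
atoms_today = 1000 ** 3           # ~1e9 atoms in a modern transistor (per the quote)
atoms_tiny = 10 ** 3              # ~1e3 atoms in a hypothetical 10 x 10 x 10 transistor
print(atoms_today // atoms_tiny)  # 1000000 -> "a million times smaller"

# Placing atoms one at a time at 10 minutes each, for 10^23 atoms:
minutes = 10 * 10 ** 23
years = minutes / (60 * 24 * 365)
print(f"{years:.1e} years")       # ~1.9e18 years -> "it would take a long time"
```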
The near future, at least, is still bright:
I'm expecting more transistors every 2-3 years by a number large enough that how you think of computer architecture has to change.
Bright and complex:
People say computers should be simple and clean, but it turns out the market for simple and clean slow computers is zero.
If you want to make a lot of progress in computer architecture you need to start from scratch every 5 years.
Jim Keller has a lot of wisdom to share, and not just on tech topics. I think you'll find the rest of the interview equally interesting.
Oh, and before you complain about the title, most of my titles are boring by design. I finally got a chance to do something different, so I took it.
Reader Comments (1)
Well, we sort of knew the future is still bright. We know we have 5nm > 3nm > 2nm > 1.4nm > 1nm, and research is already underway on sub-1nm (in terms of node names, not actual sub-1nm features).
What Jim Keller didn't comment on is whether the cost model of the "combined" Moore's Law is also on that diminishing returns curve. If it costs $2B for 2nm, and the cost continues to grow, at what point will companies pull out of the leading-edge node and simply follow the mainstream?
His theory is that every time the previous model of transistor approaches its diminishing returns curve (in terms of the economic model), there will be a next big thing to pick up the pace: before, it was the PC and then the GPU; for now, it is still the smartphone. Do we see anything bigger than the smartphone that requires as many leading-edge transistors?
1.2B annual smartphone unit shipments, the 5G upgrade cycle,