I've been following this topic for quite some time, probably because I always wanted to be an EE/HPC engineer but never was. When you listen to what the folks from IBM, Intel, Applied Materials, etc. have to say, it seems that 5nm will probably be the last silicon-based process. III-V materials (compounds of the boron and nitrogen groups) will likely come next, with the most likely candidate being GaAs (gallium arsenide), which is already used in niche production.
What is more interesting than the semiconductor base is the lithography process. The deep UV lithography currently used (immersion, exploiting the refraction of a medium such as water) is showing its limits for ever-smaller features. So they came up with extreme UV lithography, which uses mirrors to project features onto the surface, because at such a short wavelength everything is opaque. Applied Materials apparently has a machine with EUVL (there is one on their site, with a nice video and description), as does ASML (their machine reportedly costs $88 million).
There is a chance that with a new semiconductor base (group III-V) there will be a reset back to 22nm or so and a race to the bottom once again. We'll see.
GaN, SiC and Diamond are also very good candidate materials.
Also, don't forget multi-patterning, which is what allowed us to get to 14nm in the first place. I reckon Intel will ditch the immersion lithography they love so dearly and move on to other techniques, which is when I will actually get excited about them. X-ray lithography is also being investigated and seems to hold some promise.
The crazy thing about all these advancements is that no one knows what the future will hold; right now we are essentially living through a renaissance era, witnessing the death of silicon. I just hope the stuff I'm working on succeeds so I can make that cash money.
Intel has already said they are moving from immersion lithography with multi-patterning to EUV, but the tech won't be ready for them for another year or two. X-ray lithography has been used in the past, if I'm not mistaken; I didn't know there was interest in bringing it back. I know there is some research into electron-beam lithography, but its jitter and slowness are major hurdles.
What are you working on - graphene? In any case, very, very interesting times ahead. Projections are that 2020 will see the last of the cycles (5nm) for silicon. And that's only a few years away!
You don't know whether Intel's tech is ready. The machine manufacturers can't disclose whether Intel is their customer. I tried really hard to get an Intel guy at IEDM to say what they're using for 10nm research, and he wouldn't say anything.
E-beam lithography has been the workhorse of fabrication research for decades now. You can't get better resolution than e-beam lithography: the wavelength of an electron at 5kV acceleration is something like 0.017 nm. There are some crazy people trying to build e-beam systems for production processes (multiple beams, etc.), but I don't think e-beam will ever see use outside of research.
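For the curious, here's a quick back-of-the-envelope check of that 0.017 nm figure (my own sketch, not from the thread), using the non-relativistic de Broglie relation lambda = h / sqrt(2*m*e*V):

```python
import math

# Physical constants (SI units)
H = 6.626e-34     # Planck constant, J*s
M_E = 9.109e-31   # electron rest mass, kg
Q_E = 1.602e-19   # elementary charge, C

def electron_wavelength_nm(accel_voltage):
    """Non-relativistic de Broglie wavelength (nm) of an electron
    accelerated through accel_voltage volts."""
    momentum = math.sqrt(2 * M_E * Q_E * accel_voltage)
    return (H / momentum) * 1e9

print(electron_wavelength_nm(5000))  # ~0.017 nm, as quoted above
```

(Relativistic corrections are negligible at 5 kV, so the simple formula is fine here.)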
It might be due to my limited scope of knowledge, but how would one scale e-beam to production capacity anyway?
Thanks for the link. I'll have to cross-read it with a lot of other info, though, since it's 'a bit' over my head. I'm already lost at 1D and how it relates to, well, anything. I still can't wrap my head around 1D geometry.
Hah, I've heard that! It's a really interesting field that blends physics, maths, chemistry, electronic and electrical engineering, computer science, and other fields. Really amazing even to outsiders like me, and it's not too hard to follow.
One of the cooler research topics going on now is the use of block copolymers for even more precise lithography, at the nano-scale. Really interesting stuff!
There's so much hype over big data, analytics, machine learning, predictive analytics, etc. In Silicon Valley, we're predisposed to think this disruption is the result of startups, tech companies (Google/Yahoo certainly get a lot of credit), and top engineering universities. But the truth is that the only company effectively doing, and more importantly, selling this stuff at scale is IBM. Everyone in the data management space is hoping to get just a piece of IBM's pie. While we talk about the latest updates in Hadoop and Spark, IBM is closing $50m deals with telcos, banks, and governments all over the world. If you're a startup in analytics and you don't think of IBM as a competitor, you're probably naive or your market isn't big enough.
I think the strategy is not to compete with IBM at first and there are plenty of potential clients out there.
Banks and big government won't switch their trust to a startup after working with IBM for twenty years. A startup failing or getting bought out is a real risk. Even if you do make such a transition, it will take a long time; maybe for small components you can switch within months.
I like PG's advice to startup founders: go out there, sell your product, do the necessary customer support (demos, training users, troubleshooting, etc.), and stop the hype, because big companies like IBM and VMware can sell their products with ease. I think the right attitude is to start small, build a reputation, and make progress.
(oh, and, the pet theory also includes occult Illuminati-ish inauguration rituals for C-level executives. "swear to $DEITY that you will spend lots of dough on fun fundamental research or forever stay the meager mid-level manager you are". Candles, catacombs, robes, distant humming male background choir, that sort of stuff)
(they'll keep the shareholders in check by a mixture of nonsense and honesty about why IBM really needs to do so much fun fundamental research. if all C-level people talk the talk the same, and otherwise don't suck at what they do, then i bet they can keep the shareholders from getting all too grumpy about spending $3B on something that might pay off in 20 years, maybe)
No, their goal is to make lots of money with boring IT consultancy so they can plow it back into stock buybacks and other financial engineering tricks.
IBM has traditionally had a lot of very strong research labs, including IBM Almaden. That said, they also have a reputation for cutting ties with academics and local researchers in the name of "budget" -- very similar to the overall outsourcing regimen Cringely has outlined.
I'd like to know which of the labs are getting this money and what hiring they are doing contrasted with any downsizing they previously did.
Interesting. The article makes the following claim:
As the leader in advanced schemes that point beyond traditional silicon-based computing, IBM holds over 500 patents for technologies that will drive advancements at 7nm and beyond silicon — more than twice the nearest competitor.
I would have guessed that Intel was the big player here in the low-level hardware space. What has put IBM in that position, besides sheer age?
Fundamental research. For example, Karl Alex Müller discovered high-temperature superconductivity while working at IBM's Swiss research lab and later won the Nobel Prize for it (http://www.uzh.ch/about/portrait/nobelprize_en.html). They also invented the scanning tunneling microscope, along with other discoveries relevant at the nano-scale.
IBM is one of the very few companies that does fundamental research with a time-horizon of decades, and not just one product cycle.
(1) As the ROI on R&D falls, R&D will be drastically cut and the industry will become even more commoditized. Less R&D spending means chips get cheaper.
(2) Capital equipment will last longer because it won't need to be replaced with new versions every couple years. Fewer equipment purchases means chips get cheaper.
(3) As performance/$ wanes, demand will fall in the short term. At first prices may crash if capacity is overbuilt, but in the medium term, production will stabilize at a smaller level than before. Because of economies of scale, prices will be higher at this smaller scale. (I believe this is what happened to the RAM market a few years ago.) This force may cause chip prices to rise.
(4) As software continues to improve, the value of chips will rise, even as the chips themselves stagnate. This will cause the market to grow, and economies of scale will drive prices down.
(5) As has been happening for years, the industry will continue to consolidate. Less competition between producers will mean higher profit margins and higher prices.
Overall, (1), (2), and (4) will lower chip prices and (3) and (5) will raise chip prices. But I think the overall trend will be lower prices, though perhaps not at the drastic rate to which we've become accustomed.
The reason Moore's law worked so well is largely because of this. When we began creating transistors on integrated circuits, the only things preventing us from making them exactly as small as they are today were cost and the ability of fabs to make them reliable and stable. So ever since we started making them, we've stepped the size down by about half each generation, rather than jumping to something smaller that would have been more difficult and expensive to make. The entire manufacturing industry gradually became more efficient, increasing reliability and reducing the cost of fabricating the integrated circuits.
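To make that "about half each step" concrete, here's a toy sketch (my own numbers, reading "half" as half the area per transistor, i.e. a roughly 0.7x linear shrink per generation), which approximately reproduces the familiar node names:

```python
# Toy illustration: a ~0.7x linear shrink per generation halves the area per
# transistor, and the sequence roughly tracks the historical node names
# (90, 65, 45, 32, 22, 16 nm).
node_nm = 90.0
for generation in range(6):
    print(f"gen {generation}: ~{node_nm:.0f} nm")
    node_nm /= 2 ** 0.5  # halve the area => divide linear dimension by sqrt(2)
```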
The history of performance speedups hasn't been solely due to doubling the transistors. Other kinds of architectural improvements in memory, buses, parallelism, and multiple cores/threads have also yielded performance doubling or more. Memory access has always been a huge latency bottleneck, and there are lots of ways to improve that without just making the transistors smaller.
We have a lot of work to do still! The majority of our computing has been on the x86 architecture, which basically stems from the 8080, designed in the mid-70s. We can still discover new and better architectures and materials. It's just that the standard shrinkage of ICs that was the original basis of Moore's Law has reached its limits.
The most interesting piece of this puzzle to me is initiatives like MirageOS, which let you run your application very close to the metal—you give up a lot of convenience when you dump the traditional OS, but you give up a lot of inefficiency, too.
Even if you write particularly efficient code, the giant tower of abstractions we all operate on top of makes your code run dozens of times slower than it possibly could on bare metal.
A decrease in CMOS progress will diffuse as pressure into other areas: new materials, new techniques, new structures, new software paradigms... or maybe just different usage. Do we really need that much computing power?
It's slightly ranty but not entirely. The first spreadsheets fit in 64K.
For the average user, an old laptop would do fine; in that area, progress is a self-fulfilling marketing prophecy.
For enterprises... maybe (the few I worked at had very low-grade network and database infrastructure, so there's opportunity to evolve).
Hard science and medicine deserve smart and fast tech.
A reasonable question; I don't know why you were downvoted. Per the article, the spending will be in the US and Europe, but I don't know which European countries. If you're engaged in research then your best bet would be to contact them through your institution.
IBM just did massive layoffs in hardware. I used to visit the Essex Junction plant a bunch, and they had a lot of layoffs there. I don't know how much commitment management there has toward hardware.
There is probably more interest in developing the technology and then licensing it. It seems like IBM doesn't want to be in the hardware market unless it's high margin.
I don't know this in particular, but these days just about any analog circuit can be implemented in digital with better resolution, smaller die area, and lower power consumption.
(In the analog domain, your resolution is limited by SNR; in digital, it's limited by the number of bits.)
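As a minimal illustration of the "limited by number of bits" part (my own sketch, assuming an ideal quantizer and a full-scale sine input), the usual rule of thumb is that each extra bit buys about 6 dB of SNR:

```python
def ideal_quantizer_snr_db(bits):
    """Theoretical SNR (dB) of an ideal N-bit quantizer for a full-scale
    sine-wave input: SNR ~= 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for n in (8, 12, 16, 24):
    print(f"{n:2d} bits -> ~{ideal_quantizer_snr_db(n):.1f} dB")
```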
Except for digital-to-analog and analog-to-digital converters.
As far as I know, our neurons are believed to have analog interfaces, so they couldn't be interfaced directly by a digital circuit. If somebody found out that they are digital, that would be very newsworthy, at least to me.