It was 1965. The first computers based on integrated circuits had recently been launched. And Gordon Moore, a young executive in the fledgling semiconductor business, was asked by Electronics Magazine to jot down his thoughts on the future of the silicon chip.
More than 40 years on, the Intel co-founder’s observations, neatly repackaged as Moore’s Law, have become shorthand for technological change. His prediction that the number of transistors on a chip would double every two years has never stopped reflecting the pace of change in a business that has gone from start-up to a $200bn (£114bn) a year industry in half a century.
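Put as a formula (our back-of-envelope restatement, not Moore’s own wording), a two-year doubling time means the transistor count after $t$ years is

$$N(t) = N_0 \cdot 2^{t/2},$$

so the four decades since the 1965 article correspond to about 20 doublings: a factor of roughly a million.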
It is an astonishing record — which other industry churns out products that double in performance, halve in size and double in efficiency once every two years? Just imagine if the car industry could make a similar boast.
But today the computer business stands at a crossroads. As engineers and scientists strive for the breakthroughs that will keep Moore’s observations intact, they are pushing the technology to a point where its key components will become so tiny that they consist of just a few atoms. And, as anyone with half an eye on the quantum world will know, when devices get this small, strange things begin to happen.
So how long has the industry got before the laws of physics step in and prevent traditional technology from taking another step?
According to Prof Erol Gelenbe, an electronics engineer at Imperial College London, Moore’s Law will continue to hold in the short term, thanks to the type of breakthroughs in materials science, fabrication technology and chip design that have kept it going thus far.
He said: ‘Fabrication processes have become more and more accurate with much better materials. As the materials improve and become purer there are fewer possibilities of errors during fabrication.
‘And as things become more accurate you can go much smaller; you can do much finer etching because you are etching on a better material. Plus, the computer-aided design process for electronic circuits has also improved dramatically. There are far more accurate models, and computer technology also allows us to simulate much larger ones. Interestingly, the modelling and computer simulation itself depends upon Moore’s Law, with more and more powerful computers enabling scientists to simulate larger circuits.’
Although these improvements are incremental, they are no less mind-boggling. Last year Intel launched a new generation of chips featuring technology so tiny and efficient that it was hailed by Moore as the biggest transistor advance in 40 years.
These so-called high-k metal gate chips, made using a new 45nm lithography process (the figure refers to half the distance between identical components on a chip), have nearly twice the transistor density of chips built on the company’s earlier 65nm technology.
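The ‘nearly twice’ figure is simple geometry (our arithmetic, not an Intel number): if every linear feature shrinks from 65nm to 45nm, the area each transistor occupies falls by the square of the ratio,

$$\left(\frac{65\,\mathrm{nm}}{45\,\mathrm{nm}}\right)^2 \approx 2.1,$$

so the same silicon area holds roughly twice as many transistors.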
Intel’s silicon modulator under test, above left, and in close up, centre. Above right, the company’s single modulator chip
While the world’s first transistor radio contained just four transistors, the 1cm² chips built with the 45nm technology can contain up to 820 million, each measuring about 100 atoms across and featuring gate dielectrics only about five atoms thick.
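Those two numbers can be cross-checked with some rough arithmetic of our own: 1cm² is $10^{14}\,\mathrm{nm}^2$, so 820 million transistors leave

$$\frac{10^{14}\,\mathrm{nm}^2}{8.2\times10^{8}} \approx 1.2\times10^{5}\,\mathrm{nm}^2$$

per transistor, a footprint of roughly 350nm by 350nm once the surrounding wiring and spacing are counted.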
Indeed, the shrinking gate dielectric, a critical element of the transistor, has become a particular problem in recent years: at thicknesses of just a few atoms, current leaks through it, wasting power and reducing the effectiveness of chips.
Nick Knupffer, an Intel spokesman, said the new generation of chips counters this with a dielectric based on the element hafnium, whose high dielectric constant (the ‘k’ in high-k) reduces leakage and enables a smaller, more energy-efficient and faster chip.
But impressive as the new chip is, things move fast in Silicon Valley, and Intel is already applying the finishing touches to a chip that will be produced next year using a 32nm process. While the current chips are produced using dry lithography, in which there is an air gap between the lens and the surface of the wafer, these new chips will be made using the increasingly popular technique of immersion lithography, which uses a layer of water between the lens and the wafer to boost the resolution of the process.
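Why water helps is standard optics rather than anything proprietary to Intel: the Rayleigh criterion puts the smallest printable feature at

$$R = k_1\,\frac{\lambda}{\mathrm{NA}}, \qquad \mathrm{NA} = n\sin\theta,$$

where $\lambda$ is the exposure wavelength, $k_1$ a process-dependent constant and NA the numerical aperture of the projection optics. With an air gap, the refractive index $n = 1$ caps the NA below 1; water’s refractive index of about 1.44 at the 193nm wavelength used in these scanners raises that ceiling, so the same light source can print finer features.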
Beyond this, Intel is aiming for 22nm by 2011, and even hopes to develop an 11nm process by 2015, but these will require some highly unusual new technologies. Knupffer said: ‘Moving past 32 we’re looking at a host of more exotic materials and techniques, such as three-dimensional transistors, nanowire (5nm wide silicon wire), carbon nanotubes, and the 3D stacking of chips.’
One particularly exotic material under the spotlight is indium antimonide (InSb), which is at the heart of a joint Intel-QinetiQ project to develop so-called quantum well transistors. It is thought that such devices, which confine the movement of charge carriers to two dimensions, could help neutralise the strange quantum phenomena that will occur as transistors shrink further.
Knupffer added that Intel researchers are also excited at the possibilities offered by ‘silicon photonics’, a technology that could replace wires with a tiny silicon version of fibreoptics. ‘Currently data travels on a processor and in between processors using copper wires. Fibreoptic communications is currently only used for big, long-distance applications but we’re looking at making it a chip-to-chip product.’
Engineers working on this project have already developed a range of devices, including a hybrid silicon laser, light guides (waveguides in silicon that direct light much as an optical fibre does) and silicon modulators that switch light on and off to encode data.
‘We’ve got all we need to create silicon photonic interconnects between chips on the motherboard,’ said Knupffer, who believes the technique could revolutionise the way devices are made. ‘On a motherboard you have the RAM right next to the processors, because copper traces can’t run very far, but that issue doesn’t exist with light. You could have all your memory in a different rack or part of the building, and this could have a massive impact on server design.’ Intel is developing the technology into a product that should emerge in the next 12 months, he added.
The Raman laser is based on Intel’s silicon photonics technology (above left), and IBM’s approach to cooling is to use microscopic rivers of water
Intel is not the only company in the chip-shrinking business. Earlier this year, IBM also announced the development of a 45nm high-k metal gate chip and, like Intel, is investigating other methods of keeping Moore’s Law afloat.
One technology thought to hold promise for near-term improvements is the 3D chip, which allows transistors to be packed more tightly together. IBM is developing a 3D stacking technology that replaces traditional metal wires with so-called ‘through-silicon vias’: vertical connections etched through the silicon wafer and filled with metal. It allows chips and memory devices that traditionally sit side by side on a silicon wafer to be stacked on top of one another, reducing the size of the chip package and boosting the speed at which data flows among the functions on the chip.
IBM claims the technique reduces the distance that information on a chip travels by a factor of 1,000. The company is already running chips that use the technology through its manufacturing line, and plans to enter production with them this year.
Initial applications are expected to be in wireless communications chips, although IBM is also reported to be converting the chip that powers its Blue Gene supercomputer into a 3D stacked chip.
This approach is not without its problems, however. Stacking chips on top of each other makes it harder to get the heat out. So in parallel with its stacking project, IBM has joined researchers at Germany’s Fraunhofer Institute to develop a cooling technique that uses a network of 50-micron pipes to circulate rivers of water between each layer in the stack.
It is all highly impressive, but as Imperial’s Gelenbe explained, it is designed to eke out a core technology that is approaching a more fundamental physical obstacle than any in its history. ‘As we make smaller and smaller circuits that are more and more densely packed, the number of atoms that you use per component is being reduced. As you go down there is a fundamental physical process which changes: your circuits are not deterministic any more, they are probabilistic. They become random, and as they become random they become less reliable.’
Gelenbe believes this atomic hurdle is already beginning to impact on the semiconductor industry’s roadmap. ‘If you look at the industry conference themes, there’s been a shift toward probabilistic circuits.
‘The question is, despite the fact that these individual circuits are less predictable, can we still build systems that are predictable and reliable using them?’
And according to Gelenbe, there is at least one good piece of existing evidence that probabilistic computing works and works well: the human brain.
‘You and I are highly unreliable organisms, our brains make mistakes but we get most of it right — just think of the amount of computing we do as we walk down the street and we do it reliably, despite the fact that all of our underlying mechanisms are unreliable.’
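One classic answer to Gelenbe’s question is yes, at a price: John von Neumann showed in the 1950s that redundancy and majority voting can build reliable logic from unreliable parts. A minimal Python sketch (our illustration, with an invented 5 per cent error rate, not a model of any real fabrication process) makes the point:

```python
import random

def noisy_gate(x, p):
    # An idealised 'probabilistic' component: it computes the correct
    # bit x, but its output flips with probability p.
    return x ^ (random.random() < p)

def vote(a, b, c):
    # Majority vote over three redundant copies of the same component.
    return int(a + b + c >= 2)

p = 0.05            # illustrative per-component error rate (assumption)
trials = 100_000
random.seed(1)

# Error rate of a single unreliable component.
single = sum(noisy_gate(1, p) != 1 for _ in range(trials))

# Error rate of three copies combined by a majority vote.
tmr = sum(vote(noisy_gate(1, p), noisy_gate(1, p), noisy_gate(1, p)) != 1
          for _ in range(trials))

print(f"single component error rate:   {single / trials:.4f}")  # ~0.05
print(f"triplicated + vote error rate: {tmr / trials:.4f}")     # ~0.007
```

Triplication cuts the error rate from $p$ to roughly $3p^2 - 2p^3$, here from 5 per cent to under 1 per cent, at the cost of three times the hardware plus a voter; that kind of trade-off is exactly what designers of probabilistic circuits now have to weigh.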
Intel’s Knupffer agrees with Gelenbe’s analysis, but declined to make a Moore-style prediction about what the future might hold.
‘We are reaching the limits of the CMOS [complementary metal-oxide-semiconductor] process. We haven’t announced anything beyond 11nm; future options could include spintronics (a technique that exploits the spin of electrons), optical or photonic computing, or quantum computing. Work is ongoing in all these areas; it could be none of them or it could be all of them. The one thing we are saying is that we intend to use silicon as a building block.’
And though the challenges facing the semiconductor industry are huge, its optimism in a future that it admits it cannot predict might just keep Moore’s Law alive. Knupffer said: ‘Every time we build a fab, it’s a $3bn or $4bn investment. We’re taking a giant step, because we’re building a fab that will create processors that haven’t been designed yet on a process that hasn’t been invented yet for a market that doesn’t exist yet.
‘It’s a pretty big bet.’