The future of chip technology

It was an inevitable declaration, a revision that fundamental physics was going to extort from Gordon Moore sooner or later. Silicon-based computers can’t continue to double in speed every two years forever. Moore, who laid out that remarkably accurate roadmap 35 years ago, concedes the point.

“Chip doubling will go from every couple of years to five years,” he said without hesitation.

Moore has been quietly making that revision lately, from an Intel office in Santa Clara, a short highway drive from the Palo Alto office where he made the initial observation in 1965.

At the time, Moore worked at Fairchild Semiconductor, one of Silicon Valley’s first start-up companies. This was three years before he co-founded Intel with the late Bob Noyce.

“I extrapolated the idea that we would go from 60 components on a chip to 60,000 over 10 years,” Moore recalled. The initial law, named by Carver Mead, a professor at the California Institute of Technology, predicted a doubling every year, but 10 years later Moore revised the timeline to every two years.
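Moore’s original extrapolation is simple compound doubling. A minimal sketch, using only the figures quoted above, shows how 60 components doubling every year reaches roughly 60,000 in a decade, and how the later revision to doubling every two years flattens the curve:

```python
# Moore's Law as simple compound doubling (illustrative sketch using the figures above).
def components(start, years, doubling_period):
    """Project a component count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

print(round(components(60, 10, 1)))  # 61440 -- Moore's "60 to 60,000 over 10 years"
print(round(components(60, 10, 2)))  # 1920  -- the slower, revised doubling rate
```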

The revision is necessary. Experts in the semiconductor industry believe there are only about 10 to 15 years left before chipmakers run out of ways to pack more transistors onto a chip, the main technique for making chips faster.

“You can imagine there will be a great difficulty if you try to print a line (on a silicon wafer) that is smaller than the space between the atoms in silicon. Physics gets in the way after a while,” said Howard High, who for 21 years has watched Moore’s Law in action from inside Intel as the company’s strategic communications manager.

The 42 million transistors on the new Pentium 4 chip are each 0.18 microns, or 180 nanometers, wide. By comparison, a human hair is 100 microns wide.

By 2011, it will be possible to shrink line widths to 0.025 microns or 25 nanometers, estimates the International Technology Roadmap for Semiconductors, published in 1999.

More transistors mean faster chips. By then, a billion transistors on a chip should be possible. That should produce chips that will run well over 10 GHz. “Maybe you’re going to get 12, 15, 20 gigahertz,” said High.
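The arithmetic behind that estimate is roughly geometric: halving the line width quadruples how many transistors fit in the same area. A back-of-the-envelope sketch, using only the figures quoted above and ignoring design and manufacturing overheads, lands in the same ballpark as the billion-transistor projection:

```python
# Rough area scaling: transistor count grows as (old line width / new line width) squared.
current_width_nm = 180        # Pentium 4 process, 0.18 microns
future_width_nm = 25          # ITRS projection for 2011
current_transistors = 42e6    # Pentium 4 transistor count

scale = (current_width_nm / future_width_nm) ** 2    # about 52x more per unit area
print(round(current_transistors * scale / 1e9, 1), "billion transistors on the same die area")
```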

It could go even smaller. “We have proof of concept in the 20 nanometer range,” said Ralph Cavin, senior vice-president of research at the Semiconductor Research Corporation in Research Triangle Park, N.C.

Experts know the numbers they use are speculative. High pointed to a 1989 research project that looked ahead at what microprocessors might look like today. “At that time, they worked at 25 to 40 MHz,” he said. “The prediction was they would reach 250 MHz by 2000, but here we are, at the threshold of 2 GHz,” eight times faster than the vision of 11 years ago.

If scientists lose miniaturization as a tool to boost chip speeds, Moore believes new ways to speed up silicon chips will be found.

“We will have to push technology as far as we can by shrinking, then we will start making bigger chips,” he said.

In a quest for more transistors, chipmakers could also start sandwiching chips together, placing two chips face-to-face or stacking several in a single package.

They could also try to move data around the chip faster. Intel has been experimenting with ways of pulsing data across chips using super-circuits made of industrial diamonds.

Even if scientific barriers can be overcome, another force may halt progress: economics.

“Every time someone does a study on when we will hit those (scientific) limits, we don’t ever hit them. You hit other, more important, financial barriers,” said Russ Lange, chief technologist at IBM Microelectronics.

As chips get smaller, the cost to build them rises. A new, more expensive factory has to be built each time chip architecture changes.

There’s a lot of work to be done if Moore’s Law is going to stay on course over the next five years. Line widths need to shrink to 100 nanometers by 2005.

Chip engineers still face some difficult problems in getting there. As transistors get smaller, they need a higher concentration of chemical impurities, called dopants, added to the wafer to help them hold an electrical charge. At high concentrations, though, dopants clump together and become electrically inactive. Engineers have yet to solve this challenge.

Fluctuations in dopant concentration are also a factor. Chips with larger transistors are unaffected, but as transistors get smaller, they become more susceptible to these fluctuations and could behave unpredictably.
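The reason smaller transistors are more susceptible is statistical: each one contains only so many dopant atoms, and the relative variation in a random count of N atoms scales roughly as one over the square root of N. A quick illustration, with atom counts invented purely for demonstration:

```python
import math

# Relative fluctuation in a random count of N dopant atoms scales as ~1/sqrt(N).
# The atom counts below are invented purely to show the trend.
for n_dopants in (100_000, 1_000, 100):
    print(f"{n_dopants:>7} dopant atoms -> roughly {100 / math.sqrt(n_dopants):.1f}% variation")
```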

And then, there are “gates”, the tiny barriers, just one or two nanometers across, that control the flow of electrons in a transistor. A gate determines whether a transistor registers a one or a zero. An open gate lets an electron through, which counts as a zero. A closed gate blocks it, which counts as a one. This is the basis of binary math, the engine for all computer calculations.
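Read a row of such gates and you have a string of ones and zeroes, which is an ordinary binary number. A minimal sketch of that mapping, with gate states made up purely for illustration:

```python
# Map gate states to bits: a closed gate counts as one, an open gate as zero.
gates = ["closed", "open", "open", "closed"]            # made-up states for illustration
bits = "".join("1" if g == "closed" else "0" for g in gates)

print(bits)          # "1001" -- the pattern the gates encode
print(int(bits, 2))  # 9      -- the value of that binary number
```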

Electrons go rogue occasionally, bolting through a closed gate and appearing on the other side. It’s a quantum physics oddity called “tunneling”, and it causes errors in calculation. There is no known solution yet, but chip experts still believe these problems can be solved.

“There was a time when people swore we couldn’t get lower than one micron. That was around 1979,” said High.

New manufacturing techniques will drive the next big breakthrough in chip miniaturization.

The current method, optical lithography, uses ultraviolet light to record the image of a circuit on silicon. Intel intends to replace that with extreme ultraviolet lithography, a laser technology out of the Star Wars program, which was a U.S. government project to launch a network of laser-armed satellites that would destroy nuclear missiles fired at the U.S. and its allies.

As the Cold War ended and government funding dried up, an Intel-led consortium of semiconductor companies came together to fund extreme ultraviolet research. The first working extreme ultraviolet tool is now available. By 2005, commercial machines using the technology are expected to be building chips with 0.07-micron (and smaller) features.

IBM has a competing technology, Prevail, that it is developing with Nikon. Lucent Technologies also competes, with Scalpel. Yet another technology under consideration is ion beam projection.

After these problems are solved, there will still be some fundamental problems with chip design.

Since their birth, microprocessors have been built around a clock. With each tick, the entire state of the chip changes. The clock speed of a microprocessor determines how many instructions per second it can execute; a one-GHz clock speed represents a billion cycles per second.
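As a rough rule of thumb (a simplification that assumes one instruction per cycle, which real processors may beat or fall short of), the clock sets the ceiling on throughput:

```python
# Clock speed as cycles per second (simplified: one instruction per cycle assumed).
clock_hz = 1_000_000_000           # a 1 GHz clock
instructions_per_cycle = 1         # illustrative assumption
print(f"{clock_hz * instructions_per_cycle:,} instructions per second")  # 1,000,000,000
```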

“That is very cumbersome,” said Steven Hillenius, director of the silicon device research department at Lucent Technologies’ Bell Labs. “The clock signal has to be communicating to the whole chip at the same time.”

A new design could change that. “Instead of having individual binary logic, we will start looking at logic that’s not binary, like neural networks,” said Hillenius. A neural network is a type of artificial intelligence that imitates the brain. “If you could make a device that looks more like an animal’s brain, it would work better in silicon than in carbon,” Hillenius added.

Instead of counting ones and zeroes, a neural network is a series of interconnected processing elements, the computer equivalent of a brain’s neurons and synapses.
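A neural network’s “processing elements” are, in essence, weighted sums rather than clocked binary gates. A toy sketch of one such element and a tiny two-layer network, with inputs and weights chosen arbitrarily for illustration:

```python
# Toy neural network: interconnected processing elements instead of clocked binary logic.
# The inputs, weights and threshold are arbitrary, for illustration only.
def neuron(inputs, weights, threshold=1.0):
    """Fire (output 1.0) when the weighted sum of the inputs reaches the threshold."""
    return 1.0 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0.0

inputs = [0.9, 0.2, 0.7]
hidden = [
    neuron(inputs, [0.9, 0.8, 0.4]),   # one "synapse" pattern
    neuron(inputs, [0.2, 0.9, 0.9]),   # another "synapse" pattern
]
output = neuron(hidden, [1.2, 0.6])

print(hidden, output)   # [1.0, 0.0] 1.0
```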

“When you compare a super-computer recognizing a fly and producing a response, with a frog doing the same thing, it becomes very clear that the frog can do it better,” suggested Hillenius.

Bill Ditto, a professor of biomedical engineering at the Georgia Institute of Technology, calls biology-based computers “the next, next generation of computers”. Using neurons to perform computing is exciting, he said, because unlike silicon-based chips, an upgrade to the next level of processor isn’t necessary to get more speed. “It does computations through making more connections and adding more neurons,” Ditto said.

While scientists see evidence that biological computers will work, they still haven’t found a way to program them. That’s what Ditto is working on. His research team has succeeded in using leech neurons to do arithmetic. They hooked up the neurons to a personal computer. They stimulated the cells, using the principles of chaos theory. The PC then used the biochemical response to do simple addition.

“It’s a first step. Think back to the early days (of silicon computing) when you had giant transistors. You know the writing is on the wall but you don’t know if a use for the technology is five years out or 30,” Ditto said.

Two other promising developments include work at the California Institute of Technology, where neurons were successfully connected to a computer to control a simulated animal, and at Northwestern University, where researchers have been using a lamprey brain stem to detect light from an artificial eye to help steer a mobile robot. “My interest is not to create a cyborg,” said Sandro Mussa-Ivaldi, associate professor at Northwestern’s department of physiology, “but to use this behavior to understand connections in the brain.”

Such research will help build a functioning bio-computing device one day. Ditto hopes to have proof of concept for such a device within five years. He believes something useful will come within ten years. “We are shooting to have a box with something living inside it that can solve a problem a hell of a lot faster than a conventional computer by then,” Ditto said.

While fascinating strides are being made with living cells, some researchers are going even smaller in the field of molecular computing.

Current computers use switches etched in silicon, but future computers might use switches made of molecules, which are clusters of atoms. That would mean that molecular electronics – or moletronics – could replace transistors, diodes, and conductors in conventional microelectronic circuitry.

Mark Reed, chairman of electrical engineering at Yale University, and James Tour, an organic chemist at Rice University, are heading a team in this research. They have developed a one-molecule on-off switch that works at room temperature. Strings of molecules would be assembled together to form simple logic gates that function like today’s silicon transistors.
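The idea of stringing single-molecule switches into logic gates can be sketched in the abstract: treat each switch as an on-off value and compose switches into a universal gate such as NAND, from which any other logic function can be built. This mirrors standard digital logic, not the specific chemistry of the Yale and Rice work:

```python
# Treat each molecular on-off switch as a boolean and compose switches into gates.
# The composition below is ordinary digital logic, used here only as an analogy.
def nand(switch_a: bool, switch_b: bool) -> bool:
    return not (switch_a and switch_b)

# Every other gate can be built from NAND alone.
def not_gate(a):    return nand(a, a)
def and_gate(a, b): return not_gate(nand(a, b))
def or_gate(a, b):  return nand(not_gate(a), not_gate(b))

print(and_gate(True, True), or_gate(False, True))   # True True
```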

In the June 2000 issue of Scientific American magazine, Reed and Tour wrote: “If the conventional transistor were scaled up so that it occupied the printed page you are reading, a molecular device would be the period at the end of this sentence.

“Even in a dozen years, when industry projections suggest that silicon transistors will have shrunk to about 120 nanometers in length, they will still be more than 60,000 times larger in area than molecular electronic devices.”

The size advantage means a molecular computer would consume very little power. It also “has the potential of vaster computing power,” said Reed, though he cautioned, “it’s a field in initial stages of development,” adding estimates of when the technology could be commercialized are pure speculation.

Similar research is going on at Xerox in Mississauga, a Toronto suburb.

Chemists are engineering transistors made of molecules that will be strung together into nano-circuits by scientists at Xerox PARC, in Palo Alto, California. Dow Chemical and Motorola are also involved.

The resulting “plastic” circuits aren’t designed to replace silicon microprocessors, said Sophie Vandebroek, vice-president, Xerox Research and Technology, but they could provide new display technologies, control electronic paper, and work with silicon microprocessors.

One of the offshoots of molecular computing is DNA computing. DNA (deoxyribonucleic acid) refers to the double helix molecules that are the blueprints of an organism.

Researchers believe it is possible to build microscopic ultra-fast devices with awesome computing power out of DNA.

“DNA computing is the most manageable form of molecular computing that we know of,” said Nadrian Seeman, a chemist at New York University who has made cubes, rings, octahedrons, and other unusual shapes from DNA molecules.

What’s exciting about these building blocks is that they can be programmed to do nano-assembly, meaning they can be used in the construction of ultra-small devices. That includes computer circuits only nanometers in size that store information in a single molecule. It could also include miniature medical robots, reproducing themselves by the billions, that scour a patient’s body to assassinate viruses.

Then there’s quantum computing, where the sub-atomic world is used to do basic math using the bizarre and often counterintuitive principles in quantum physics.

A conventional computer does binary math using switches that are either on or off. Quantum computing uses switches that are not only on or off, but also on and off at the same time and every state in between.

This doesn’t seem possible in the real world, but it’s very real in quantum physics.

Researchers are using a variety of particles at the sub-atomic level. They’re looking at electrically charged atoms (ions) and photons (particles of light), as well as the nucleus of an atom. They are particularly keen on using an atomic nucleus to achieve quantum computing, using a machine not unlike a hospital’s MRI device to measure and manipulate the spin of a nucleus.

In conventional computing, a one or zero is called a bit. In quantum computing, states are called qubits.

Because a qubit is not just a one or a zero but can be both as well as all the states in between, it becomes an enormously powerful way to do parallel computing. Many parts of a computer all work at once on a problem instead of waiting for each other to finish before proceeding.
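The scale of that parallelism shows up in a toy state-vector picture: n qubits are described by 2 to the power n amplitudes, and putting every qubit into an equal superposition spreads the state across all possible bit patterns at once. A minimal classical simulation, which only illustrates the bookkeeping, not any quantum speed-up:

```python
import itertools, math

# Classical toy simulation of n qubits in an equal superposition.
# It shows the bookkeeping (2**n amplitudes), not any quantum advantage.
n_qubits = 3
amplitude = 1 / math.sqrt(2 ** n_qubits)   # equal weight on every bit pattern

state = {bits: amplitude for bits in itertools.product((0, 1), repeat=n_qubits)}

for bits, amp in state.items():
    print(bits, f"probability {amp ** 2:.3f}")   # each of the 8 patterns: 0.125
```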

Research in quantum computing is in its infancy.

A quantum computer wouldn’t necessarily replace silicon-based computers: “If it ever gets to the point of being useful, then it could coexist in a peaceful manner with conventional computers. Or it could be a special purpose computer that feeds data into a conventional computer,” said Nabil Amer, manager and strategist of physics of information, IBM Research Division.

Cryptology is an exciting and frightening application of quantum computing.

A quantum computer would have the power to break almost any code. In the wrong hands, that would put military secrets at risk, and the technology that scrambles credit card data in e-commerce would be easy to defeat. At the same time, that computational power could help scientists unlock the secrets of the universe.

Scientists say it’s important to understand that the new paradigms are theoretical technologies but, barring any unforeseen drawbacks, they will eventually make their way into some form of computer technology of the future. Meanwhile, Moore believes silicon will continue to be the basis of future computers.

“I will admit to being a skeptic,” he said. “The view that something will suddenly come along to replace silicon technology – that is very naïve. This is a technology that cumulatively is a $100-billion industry and to believe that something will come in and in one leap get ahead of that, I find pretty hard to swallow.”