Your independent source for Harvard news since 1898

Right Now | Hardware at the Nanoscale

Liquid Computing

November-December 2001

Imagine a computer, suspended in a flask of liquid, that assembles itself when the liquid is poured onto a desktop. Sound like science fiction? Hyman professor of chemistry Charles Lieber is making it happen in his laboratory, where researchers have already created logic circuits and memory--the two main components of a computer--in just this manner. And these circuits are minute: just a few atoms across.

Lieber and his team of chemists have done a kind of end-run around the silicon-based microelectronics industry, which for the last 35 years has been making transistors--tiny switches that can be either on or off--exponentially smaller, doubling the number that fit on a chip every 18 to 24 months. Intel chairman emeritus Gordon Moore observed this doubling as early as 1965, and his observation became codified as "Moore's Law." However, says Lieber, "continued shrinkage ultimately becomes problematic in terms of just how one achieves [it]." Scientists anticipate that we will reach the limits of our ability to create silicon chips using standard fabrication-line methods sometime between 2012 and 2017.
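The arithmetic behind that doubling can be sketched in a few lines. This is only an illustration: the 18-month doubling period is one end of the article's 18-to-24-month range, and the starting count of one (normalized) is chosen for convenience; neither figure comes from Lieber's work.

```python
# Sketch of Moore's Law arithmetic: a quantity that doubles at a fixed
# interval grows exponentially. The 1.5-year doubling period is an
# illustrative assumption from the article's 18-to-24-month range.

def projected_growth(start_count, years, doubling_period_years=1.5):
    """Project a transistor count forward, doubling once per period."""
    doublings = years / doubling_period_years
    return start_count * 2 ** doublings

# Twelve years at an 18-month doubling period is 8 doublings,
# a 256-fold increase in transistor count.
print(projected_growth(1, 12))  # 256.0
```

Run with a 24-month period instead (`doubling_period_years=2.0`) and the same twelve years yield only 6 doublings, a 64-fold increase, which is why the exact pace of "Moore's Law" matters for predicting when silicon runs out of room.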

That's because manufacturers today create microelectronic circuits either by depositing silicon on a surface or by etching it away (for example, with acid). But just as rusted metal "is sort of rough," says Lieber, current methods for working with silicon leave rough surfaces that, on the nanometer scale (a nanometer is one billionth of a meter, or about one hundred-thousandth the width of a human hair), constitute an ever greater proportion of the tiny wires that make up those circuits. "Ultimately, you can't keep using those methods," he says, "because things will be very non-uniform on a small scale. The smaller circuits become, the more imperfections in the manufacturing process begin to play a role in their performance."

Lieber has "philosophical differences" with the industry's "top-down" approach to nanotechnology--taking big things and making them smaller. "The way to truly revolutionize the future," he says, "is to take a completely different approach: build things from the bottom up." He has done that by starting with the smallest of building blocks--wires only three nanometers across that can be produced relatively cheaply on a bench top with a few thousand dollars' worth of equipment.

Lieber makes the building blocks using a catalyst that favors growth in only one direction. A key characteristic of the process he developed is that it enables nanowires to be prepared in virtually any "flavor" (i.e., with specific conductive properties). Mixing and matching flavors can then lead to different types of devices. The devices are made in an equally simple manner: an alcohol solution of a specific nanowire flavor is poured through a grooved channel in a polymer block to produce an array of parallel wires. Another set of wires can be laid perpendicular to the first simply by rotating the apparatus 90 degrees. Already, his lab has produced a transistor just 10 atoms across.

The potential application in microelectronics is obvious: the minute size of these building blocks allows for higher transistor densities, which could lead, at least in principle, to more highly integrated and powerful computers. In 10 or 20 years there might be no more need for hard disks, because solid-state memory could store so much data. The nanowire computers of the future will be quite different from those we use today because they will require new kinds of computer architecture and software. Ultimately, the most exciting thing about nanotechnologies is not the sheer power that such a computer could provide, says Lieber, but the fact that "you get fundamentally new properties that you can't even conceive of when dealing with conventional materials by scaling them down."

In very small objects, for example, the ratio of the surface area to the interior volume is much larger. "Things that happen at the surface can therefore affect the whole structure," says Lieber. While an electrical engineer might regard that as a problem, it is a property that can be used to advantage. "Normally a molecule binding to the surface of a transistor wouldn't have a big effect," he explains, "but imagine a protein with a charge on it coming up to something very small, where the surface is a big component. You bring this charged body up, and it biologically or chemically switches the transistor. In essence, you can electrically detect when you have a protein, a nucleic acid, or anything else." What you have created is a sensor.
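The scaling Lieber describes can be made concrete with a cube, a simplification chosen here for illustration (a nanowire is closer to a cylinder, but the principle is the same): surface area shrinks with the square of the side length while volume shrinks with its cube, so the ratio of surface to volume grows as objects get smaller.

```python
# Surface-area-to-volume ratio for a cube of side L: 6*L^2 / L^3 = 6/L.
# As L shrinks, the ratio grows, which is why events at the surface --
# such as a charged protein binding -- can dominate a nanoscale device.

def surface_to_volume(side):
    """Ratio of surface area to volume for a cube of the given side length."""
    return (6 * side ** 2) / (side ** 3)

# Halving the side length doubles the ratio:
print(surface_to_volume(10))  # 0.6
print(surface_to_volume(5))   # 1.2
```

At a side length of 3 units (the scale of Lieber's three-nanometer wires, if we read "units" as nanometers), the ratio is 2.0: an order of magnitude larger than for a 30-unit object, which is the sense in which "the surface is a big component."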

Hence, Lieber is now working on a "proof of concept" for the National Cancer Institute that will demonstrate the use of nanowire sensors for early detection of prostate cancer. In principle, he says, you could design a centimeter-square chip to detect a billion things simultaneously, even variations in an individual's DNA. An undergraduate student of his is taking this idea even further, and working to create a biological computing interface.

Another unusual property of Lieber's nanowires is ballistic conductivity--that is, when you introduce an electron into such a system, it travels through the conductor without losing energy. This property could help reduce the heating that occurs when electrons flow through normal wires--a serious problem in highly integrated electronics. One of Lieber's graduate students has combined nanowires to create light sources and detectors. This would allow optical circuits--"light is always much faster than electrons," says Lieber--to be integrated into a nanowire-based computer. "Who knows?" he says. "This may be a way of enabling the concept of quantum computing."

In classical computers, transistors or bits must be either on or off, set to one or to zero. But in a quantum computer, the bits are simultaneously both one and zero. This is called a superposition. Light exhibits this property in the sense that it is both a wave and a particle: it is a wave, a kind of superposition, until it is detected; at that moment, it becomes a particle, a single photon in a single place. Superposition theoretically allows quantum computers to tackle problems (such as those underlying cryptography) that would be intractable for a conventional computer. The time may be ripe for a new motto: Think small. Really small.
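The superposition-then-collapse behavior described above can be sketched as a toy simulation. This is a deliberately minimal illustration of a single quantum bit as a pair of amplitudes, not a model of how a nanowire quantum computer would actually work.

```python
# Toy sketch of a single qubit: a state is a pair of amplitudes
# (amp0, amp1) whose squared magnitudes give measurement probabilities.
# Measuring "collapses" the superposition to a definite 0 or 1,
# the way a detected light wave becomes a single photon.
import random

random.seed(0)  # deterministic run for illustration

def measure(state):
    """Collapse an (amp0, amp1) state; return 0 or 1 with Born-rule odds."""
    amp0, _amp1 = state
    p0 = abs(amp0) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition of |0> and |1>: each outcome has probability 1/2.
superposition = (2 ** -0.5, 2 ** -0.5)
samples = [measure(superposition) for _ in range(10000)]
print(0.4 < sum(samples) / len(samples) < 0.6)  # True: roughly half ones
```

Each individual measurement yields a definite classical bit; only the statistics over many runs reveal that the state was "both one and zero" beforehand.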

~Jonathan Shaw

 

Charles Lieber e-mail address: cml@cmliris.Harvard.edu

website: cmliris.Harvard.edu/