Friday, February 2, 2007

Running Out of Room at the Bottom

Let's write about something SFish.

We're running out of room at the bottom.

That's a heavy-handed allusion to Richard Feynman's paper, "There's Plenty of Room at the Bottom," which has assumed near-Biblical stature in the field (and cult) of nanotechnology. You can find it on the Web (unlike Hugh Everett's paper on the many-worlds interpretation of quantum mechanics, another much-cited paper that few people have actually read). You might want to go read the Feynman paper. I'll be here when you get back...

Er, hrm. Anyway....

A present day reader of Feynman's paper might notice a few things, not least being that we got to one ultimate goal quite a while back: the Scanning Tunneling Microscope can be used to observe -- and manipulate -- single atoms. This is seriously cool, even if the first thing the IBM guys who invented the technique did with it was to spell out "IBM."

Short aside: The original "IBM" was spelled out in xenon atoms, which are very unreactive, so they behaved themselves. Some while later, someone tried putting carbon atoms next to oxygen atoms using an STM. They were reasonably sure that the appropriate chemical reaction then took place, but they could never find the resultant carbon monoxide because the reaction produced so much energy that the molecule jumped somewhere beyond the vision of the STM.

But manipulation of things at the "nanoscale" has taken on a lot of baggage. I referred to it as both a field and a cult, and the cult is older than the field. The nanotech cult is the brainchild of Eric Drexler, and it was the source of one of the great "magic wands" in science fiction. SF magic wands are notions that can be used to generate practically any result (provided you don't pay too much attention to the science, and really, how many SF writers do that?).

Another short aside (worthy of an essay all its own): SF magic wands let SF writers do "hard SF" that is chemically indistinguishable from fantasy. Past and present examples include esp/psi, alternate worlds, virtual reality, and the Singularity. Discuss.

Anyway, the tension between the two kinds of nanotechnology was enshrined in a debate (actually an exchange of letters) between Drexler and Richard Smalley that was sponsored by the American Chemical Society and published as the Dec. 1, 2003 cover story of Chemical & Engineering News (C&EN). The debate was triggered by a critical article Smalley had published in a 2001 issue of Scientific American. I'm a member of the ACS, and I'm not coy about where my sympathies lie, but I'm going to talk about something different from what Drexler and Smalley debated (the debate can be found on the Web if you look hard enough; reviews can be found more easily).

What's the dividing line between nanotechnology and microtechnology? Let me note how interesting the micron (a millionth of a meter) is. A (very) few bacteria are as small as 0.2 microns, but most are about a micron in size. Most eukaryotic cells are larger still (organelles take up room), with the "typical" human cell being maybe 10 microns in size.

An air guy aside: Micron-sized particles are well suited for scattering light via "Mie scattering," which is different from Rayleigh scattering (also called molecular scattering, for the usual reasons). Mie-scattering particles are larger than a wavelength of visible light (larger than 400-800 nm, in other words), so 1 micron is near the Mie limit. Mie scattering is how smoke scatters light, because smoke particles tend to run from 1 to 10 microns in size. Particles much smaller than a micron tend to "coagulate," i.e. cling together when they bump into one another, so sub-micron particles don't last long in the air. Particles larger than 10 microns tend to fall out of the air pretty quickly.
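
If you want to put rough numbers on those regimes, here's a quick sketch of my own, using the standard dimensionless size parameter x = pi * diameter / wavelength. The cutoff values and the 550 nm default wavelength are conventional rules of thumb, not sharp physical boundaries.

import math

def scattering_regime(diameter_nm, wavelength_nm=550.0):
    """Rough light-scattering regime from the size parameter x = pi*d/lambda."""
    x = math.pi * diameter_nm / wavelength_nm
    if x < 0.1:
        return x, "Rayleigh (molecular) scattering"
    elif x <= 100.0:
        return x, "Mie scattering"
    else:
        return x, "geometric optics"

# A molecule, a 1 micron smoke particle, a 10 micron smoke particle, a 100 micron drizzle drop.
for d_nm in (0.3, 1000.0, 10000.0, 100000.0):
    x, regime = scattering_regime(d_nm)
    print(f"{d_nm:>9.1f} nm  x = {x:8.2f}  {regime}")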

Since the real "magic wand" part of Drexler's sort of nanotechnology is all about remaking biology, let's take 1 micron as the logical upper limit for a "nanomachine." Any bigger, and maybe you have to call it a micromachine, right? In any case, it certainly seems difficult to do things in the interiors of cells with something that is bigger than the cell. Maybe you can do it for a few cells, but you're not going to be able to fit a lot of larger-than-micron machines alongside your cells; there's just not that much room.

I published a story a few years back, "Flower in the Void," that basically had a nanomachine as the protagonist. I'll call it that, because it was the closest thing to a character in the story; there weren't any people in it, and that was sort of the point. I was also making a subtle argument (and when I get subtle, you can almost bet that I'm mostly talking to myself). Let me make that argument here, a lot more explicitly.

How big is an atom? That varies, but not by as much as you might think. Here's a table:

Substance    Density (gm/cc)    Mol. weight (gm/mole)    Size (nm)
Be           1.84               9                        0.2
He           1.249              4                        0.17
W            19.3               183.9                    0.25
Si           2.33               28.1                     0.27
graphite     2.26               12                       0.21
diamond      3.51               12                       0.18
U            18.95              238                      0.28
Li           0.53               6.9                      0.28
Fe           7.86               55.8                     0.23
B            2.34               10.8                     0.2
S            2.067              32.1                     0.3
Cl           3.12               35                       0.27
NaCl         2.165              58.4                     0.28
benzene      0.8786             78                       0.23
H2O          1                  18                       0.22
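
As far as I can tell from the numbers, the "size" column is just the cube root of the volume per atom implied by the bulk density (dividing by the number of atoms per molecule for the compounds). Here's a quick sketch of that arithmetic; the atoms-per-molecule counts are my own reading of the table, not something stated in it.

AVOGADRO = 6.022e23  # molecules per mole

def size_nm(density_g_cc, mol_weight, atoms_per_molecule=1):
    """Cube root of the volume per atom implied by the bulk density, in nm."""
    vol_per_atom_cc = mol_weight / (density_g_cc * AVOGADRO * atoms_per_molecule)
    return vol_per_atom_cc ** (1.0 / 3.0) * 1.0e7  # 1 cm = 1e7 nm

print(round(size_nm(19.3, 183.9), 2))       # W       -> 0.25
print(round(size_nm(2.165, 58.4, 2), 2))    # NaCl    -> 0.28
print(round(size_nm(0.8786, 78.0, 12), 2))  # benzene -> 0.23
print(round(size_nm(1.0, 18.0, 3), 2))      # water   -> 0.22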



I put in the last two to show that putting atoms into molecules doesn't let you pack them in much more tightly. Basically, we're talking about a quarter of a nanometer per atom. If we treat the atoms like stacked bricks (i.e. as if they were cubic) and put them into a cubic micron, we could get 4000*4000*4000 atoms into the cubic micron, or 6.4*10^10 atoms. If we wanted to fill only a sphere with a 1 micron diameter, we'd only get about half that, but we could get another 50% in if we packed spherical atoms into the space in a tight (face-centered cubic) structure. I don't care, really, since this is an order-of-magnitude thing here.
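
Here's that cubic-micron arithmetic as a quick sanity check; the quarter-nanometer atom, the roughly-half sphere factor, and the extra 50% for close packing are the same rough numbers as above.

import math

ATOM_NM = 0.25                      # rough "size" of an atom, from the table
MICRON_NM = 1000.0

per_side = MICRON_NM / ATOM_NM      # 4000 atoms along each edge
cube = per_side ** 3                # 6.4e10 atoms in a cubic micron
sphere = cube * math.pi / 6.0       # a 1 micron sphere holds about half the cube
packed = sphere * 1.5               # close (face-centered cubic) packing buys back ~50%

print(f"cube:   {cube:.2e} atoms")
print(f"sphere: {sphere:.2e} atoms   close-packed sphere: {packed:.2e} atoms")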

Now 64 billion is a pretty big number, but let's compare it to some things we deal with every day. Take the computer on your desktop. Does it have 1 gig of RAM and a 100 gig drive? If not, the next one you buy probably will. And those are gigabytes, not bits: 8 bits per byte. So that 100 gigabyte hard drive, the one that sells for somewhere around $100, has pretty close to a thousand billion bits to it, more than ten times the number of atoms we could cram into our cubic micron. Think we'll ever be able to encode 1 bit per atom? I doubt it; how much of the disk drive is actual recording medium and how much is control, power handling, protective shell -- overhead, in other words?

But even with zero overhead, 800 billion atoms at a quarter of a nanometer apiece take up about 12.5 cubic microns, a cube more than two microns on a side: larger than most bacteria, and larger than our 1 micron limit for a nanomachine.
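
For anyone checking that figure, here's the same arithmetic, assuming zero overhead and one bit per quarter-nanometer atom:

DRIVE_BYTES = 100e9                     # a 100 gigabyte drive
bits = DRIVE_BYTES * 8                  # 8e11 bits, one atom per bit
ATOM_NM = 0.25

volume_nm3 = bits * ATOM_NM ** 3        # total volume of the "recording" atoms
volume_um3 = volume_nm3 / 1.0e9         # 1 cubic micron = 1e9 cubic nanometers
side_um = volume_um3 ** (1.0 / 3.0)
print(f"{volume_um3:.1f} cubic microns, a cube about {side_um:.1f} microns on a side")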

Add in sensors; you can't do anything without I/O. How about a power supply? The biggest, heaviest thing in a PC is the power supply. Cooling system? Just how much computing can you do before the thing gets hot?

All in all, the standard desktop workstation is now probably well beyond the theoretical computing capabilities of a micron-sized molecular machine. That's what I mean by running out of room at the bottom.

You want to network the nanites? Go right ahead. Just remember how few computing problems lend themselves to distributed computing, not to mention the overhead of the network itself.

So, you want a nanomachine to go into your bloodstream and fix some part of your anatomy? How about first teaching your desktop to fix your car? And if you haven't managed that yet, tell me how you expect things to get easier when they are much, much smaller. Making things smaller usually makes them more difficult, and you don't make problems easier by making them harder.
