Tuesday, May 12, 2009

The Origins of Nanotechnology

Luis Gonzalez

Even though nanotechnology is a relatively new topic in today's world, its development dates back to long before most of us were born.
In 1965 Gordon Moore, one of the founders of Intel Corporation, predicted that the number of transistors on a chip would double roughly every two years (often quoted as every 18 months). Today this is called Moore's law, because the prediction has held, and it is still holding to this day. To see how fast the count is increasing, consider an example: in 1971 the Intel 4004 processor had about 2,300 transistors, while the Core 2 Duo processors that most computers run now contain hundreds of millions. In under four decades the number of transistors has doubled over and over; we can fairly say it has grown exponentially, decade after decade. And as transistor counts have increased, the size of individual electronic elements has decreased, going from millimeters in the 1960s to nanometers in modern circuitry.
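A quick back-of-the-envelope check of that doubling claim can be done in a few lines of Python (a minimal sketch; the 2,300-transistor starting count for the 1971 Intel 4004 and the two-year doubling period are the commonly cited round figures, used here only for illustration):

# Project transistor counts under Moore's law: the count doubles every
# `doubling_years` years, starting from the Intel 4004 (1971, ~2,300
# transistors -- the commonly cited figure, not exact data).
start_year, start_count = 1971, 2300
doubling_years = 2  # Moore's 1975 revision; the popular "18 months" is close

for year in range(1971, 2010, 4):
    doublings = (year - start_year) / doubling_years
    projected = start_count * 2 ** doublings
    print(f"{year}: ~{projected:,.0f} transistors")

By 2009 this projects on the order of a billion transistors, which is the right ballpark for the hundreds of millions found in late-2000s desktop processors.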
Historical Background of Nanotechnology:
Humans have unwittingly used nanotechnology for thousands of years. Making steel, creating paintings, and vulcanizing rubber are just some examples. Each of these processes relies on the properties of stochastically formed atomic ensembles mere nanometers in size, and they are distinguished from chemistry in that they don't rely on the properties of individual molecules. But the development of the body of concepts now subsumed under the term nanotechnology has been slower.
The first mention of some of the distinguishing concepts of nanotechnology came in 1867, when James Clerk Maxwell proposed as a thought experiment a tiny entity, known as Maxwell's demon, able to handle individual molecules. The first observations and size measurements of nanoparticles were made during the first decade of the 20th century. They are mostly associated with Richard Adolf Zsigmondy, who made detailed studies of gold sols and other nanomaterials with sizes down to 10 nm and less, and who published a book on the subject in 1914. He used an ultramicroscope, which employs a dark-field method to see particles far smaller than the wavelength of light. Zsigmondy was also the first to use the nanometer explicitly for characterizing particle size, defining it as one millionth (1/1,000,000) of a millimeter, and he developed the first system of classification based on particle size in the nanometer range.
Conceptual origins of Nanotechnology:
The topic of nanotechnology was taken up again in "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate a proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and Van der Waals attraction would become more important. This basic idea appears feasible, and exponential assembly enhances it with parallelism to produce a useful quantity of end products. At the meeting, Feynman announced two challenges, offering a prize of $1,000 to the first individual to solve each one. The first challenge involved the construction of a nanomotor, which, to Feynman's surprise, was achieved by November 1960 by William McLellan. The second challenge involved scaling down letters small enough to fit the entire Encyclopaedia Britannica on the head of a pin; this prize was claimed in 1985 by Tom Newman.
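Feynman's point about scaling can be made concrete with a short calculation (a minimal Python sketch; the cube and square exponents follow simply from how volume and area shrink, not from any particular measurement):

# Shrink an object by a linear scale factor s. Its weight (a volume
# effect) shrinks as s**3, while surface forces such as surface tension
# or Van der Waals adhesion (area effects) shrink only as s**2 --
# so at small scales the surface forces come to dominate gravity.
for s in (1.0, 1e-3, 1e-6, 1e-9):  # from ~1 m down to ~1 nm, relative scale
    weight = s ** 3    # gravity scales with volume
    surface = s ** 2   # surface forces scale (at most) with area
    print(f"scale {s:.0e}: surface force / weight ~ {surface / weight:.0e}")

The ratio grows as 1/s, so shrinking an object a billionfold makes surface forces roughly a billion times more important relative to its weight.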
Experimental advances:
Nanotechnology and nanoscience got a boost in the early 1980s from two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1985 and the structural assignment of carbon nanotubes a few years later. In a separate development, the synthesis and properties of semiconductor nanocrystals were studied, leading to a rapidly growing number of studies of semiconductor nanoparticles, or quantum dots.
As of 2007, the practice of nanotechnology embraces both stochastic approaches and deterministic ones, in which single molecules are manipulated on substrate surfaces by nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive. Since the mid-1990s, thousands of surface scientists and thin-film technologists have latched onto the nanotechnology bandwagon and redefined their disciplines as nanotechnology. This has caused much confusion in the field and has spawned thousands of "nano" papers in the peer-reviewed literature, most of them extensions of more ordinary research done in the parent fields.
For the future, some means must be found for MNT design evolution at the nanoscale that mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms, combined with culling of the less successful variants and reproduction of the more successful ones; macroscale engineering design likewise proceeds by design evolution from simplicity to complexity, as set forth somewhat satirically by John Gall: "A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a system that works." A breakthrough in MNT is needed that proceeds from the simple atomic ensembles which can be built with, e.g., an STM, to complex MNT systems, via a process of design evolution. A handicap in this process is the difficulty of seeing and manipulating at the nanoscale compared to the macroscale, which makes deterministic selection of successful trials difficult; biological evolution, in contrast, proceeds via the action of what Richard Dawkins has called the "blind watchmaker": random molecular variation combined with deterministic reproduction and extinction.
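The variation-and-selection loop Gall and Dawkins describe can be sketched in a few lines of Python. This is a toy version of Dawkins' own "weasel" demonstration, not an actual MNT design tool: random mutation plus deterministic culling evolves a "design" toward a target far faster than blind chance alone ever could.

import random

# Toy "blind watchmaker" loop: random variation of a candidate string,
# deterministic selection of the fittest, repeated until the target
# "design" is reached.
TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def mutate(parent, rate=0.05):
    # Each character has a small chance of being randomly replaced.
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in parent)

def fitness(candidate):
    # Count the characters that already match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

parent = "".join(random.choice(CHARS) for _ in TARGET)
generation = 0
while parent != TARGET:
    offspring = [mutate(parent) for _ in range(100)]  # random variation
    parent = max(offspring + [parent], key=fitness)   # cull the less fit
    generation += 1
print(f"reached target in {generation} generations")

A purely random search over 27-character alphabets would need on the order of 27^28 trials; keeping each generation's best variant reaches the target in a few hundred generations, which is the whole force of the variation-plus-selection argument.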

References:
1. "Indian craftsmen, artisans used nanotech 2,000 years ago."
2. Zsigmondy, R. Colloids and the Ultramicroscope. J. Wiley and Sons, NY (1914).
3. Derjaguin, B.V. Discuss. Faraday Soc., No. 18, 24-27, 182-187, 198, 211, 215-219 (1954).
4. Efremov, I.F. "Periodic Colloidal Structures," in Surface and Colloid Science, vol. 8, Wiley, NY (1975).
5. Lyklema, J. Fundamentals of Interface and Colloid Science, vols. 1-5, Academic Press (1995-2000).
6. Gribbin, John. Richard Feynman: A Life in Science. Dutton (1997), p. 170.
7. Taniguchi, Norio. "On the Basic Concept of 'Nano-Technology'," Proc. Intl. Conf. Prod. Eng. Tokyo, Part II, Japan Society of Precision Engineering (1974).
8. Gall, John. Systemantics: How Systems Really Work and How They Fail, 2nd ed. Ann Arbor, MI: The General Systemantics Press (1986).
9. Dawkins, Richard. The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe Without Design. W. W. Norton, reissue edition (1996).
