Nanotechnologies: Visions under the microscope

Nanometer (nm) = one billionth of a meter. To begin to grasp this order of magnitude, we must accept that it refers to objects invisible to the “naked” human eye. Popular analogies drive home just how terrifyingly small this size is: human fingernails grow by about one nanometer per second, a human hair is roughly 80,000 nanometers thick, while a gold atom is estimated at somewhat less than 0.2 nanometers.
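These analogies can be checked with a few lines of arithmetic. A minimal sketch, assuming a typical fingernail growth rate of about 3 mm per month (the figure itself is an assumption for illustration):

```python
# Unit conversions illustrating the nanometer scale.
NM_PER_M = 1e9  # one meter contains a billion nanometers

# Fingernail growth: assuming a typical rate of ~3 mm per month
mm_per_month = 3.0
seconds_per_month = 30 * 24 * 3600
growth_nm_per_s = (mm_per_month / 1000) * NM_PER_M / seconds_per_month
print(f"fingernail growth: {growth_nm_per_s:.1f} nm/s")  # ~1.2 nm per second

# Hair thickness: ~80 micrometers, expressed in nanometers
hair_m = 80e-6
print(f"hair thickness: {hair_m * NM_PER_M:.0f} nm")  # 80000 nm
```

The nanometer-per-second claim thus holds to within rounding, given the assumed growth rate.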

If something is invisible, and therefore not at all accessible to everyday experience, the technological applications arising from it can easily provoke fear and/or repulsion. At the same time, the intense transformations of contemporary capitalist reality, the generalized tertiarization of production in the “developed” world, and the new forms of labor can be approached superficially and dressed up in theories such as those trying to convince us that capitalism is now immaterial. Armed with such theoretical nonsense, one can happily ignore facts about the material/technological infrastructure of capitalism that have been evident for more than 70 years; taking as a starting point, at the very least, the mobilization of the smallest particles then known to be exploitable in the most destructive invention ever used on this planet: the atomic bomb.

Since then, the strategies of states and capital have consistently and continuously focused on the possibilities of colonizing and exploiting ever more microscopic particles and materials that can be set in motion. From the imaging of the shape of DNA, molecular biology and genetic engineering, to the most modern materials technologies, nanotechnologies, the even deeper analysis of atoms into subatomic particles, and quantum mechanics, this “setting in motion” translates into the ability of these profoundly small, elementary material units (molecules, atoms, subatomic particles) to “work” on behalf of capital; in other words, to be incorporated into exploitable technological applications.

In order to explore and “discover” ever more subatomic particles (such as the well-known boson), a massive 27-kilometer ring[1] is required, where particles are accelerated and collide with each other under controlled(?) yet dizzying conditions. On the other hand, certain atomic, molecular, and supramolecular structures have been studied for years as more “stable” systems, and many of their properties have been examined. Such structures are what nanotechnologists aspire to use as “raw materials.”

According to the National Nanotechnology Initiative, the official federal program for nanotechnology research and development in the U.S., a general and descriptive definition[2] includes the following:

The essence of nanotechnology is the ability to work at the molecular level, atom by atom, to create large structures with fundamentally new molecular organization. Compared to the behavior of isolated atoms at 1 nm or bulk materials, the behavior of structural features in the range of 1 to 100 nm exhibits significant changes. Nanotechnology involves materials and systems whose structures and components exhibit novel and significantly improved physical, chemical, and biological properties, phenomena, and processes due to their nanoscale size. The goal is to exploit these properties by gaining control of structures and arrangements at the atomic, molecular, and supramolecular level and to learn to construct and utilize these arrangements efficiently. Maintaining the stability of interfaces and integrating these “nanostructures” at the microscale and macroscale is another objective.

[…] As it becomes possible to control the size of individual characteristics, it also becomes possible to enhance the properties of materials and the functions of devices beyond what we currently know or still consider feasible. The reduction of structure dimensions leads to entities such as carbon nanotubes, quantum wires and dots, thin films, DNA-based structures, laser emitters, which have unique properties. Such new forms of materials herald a revolutionary era for science and technology, since we can discover and fully utilize the underlying principles.

Half wishful thinking, half reality (since it ultimately refers to a list of real technologies), this definition attempts, in its generality, to encompass anything that could conceivably be perceived as nanotechnology. For instance, from the perspective of materials technology and chemistry, it is known that transitioning from larger scales to the nanoscale, through the application of appropriate techniques, can lead to the formation of molecular structures with different properties. On a different approach, the essence of nanotechnology is bottom-up, atom-by-atom construction, otherwise known as molecular nanotechnology. In this version, molecular nano-machines could be programmed, or react in a controlled manner, to assemble new nano-structures and molecular nano-machines atom by atom. In any case, the control, exploitation, and efficient use of atoms and molecules define the intent of research at the nanoscale.

It is genuinely difficult to find a single definition for nanotechnology[3]. This is because several things intertwine and create confusion even in the discussions of technoscientists and engineers themselves: the expectations of scientists and state organizations for a new “technological revolution” and the corresponding visions; the contemporary technological reality of new materials and their commercial applications; the current production of electronic components at the nanoscale and the biochemical synthesis of new molecular structures; scientific imagination and the dystopia of a world occupied and consumed by uncontrollable nano-robots.

In fact, this confusion is not absent from the dominant narrative of the history of nanotechnology, nor from the research and development roadmaps of states, companies and academic institutions that actually invest in the relevant fields. We assume that such issues are equally confusing for those who have been or are in the corresponding laboratories of academic institutions and work intensively measuring, calculating and analysing the properties of matter at the nanoscale.

Graphene is one of the most famous achievements of nanotechnology in the field of new materials. With exceptional properties (an excellent conductor of heat and electricity, tens of times more resistant to impact than iron, transparent), this allotrope of carbon was first isolated in the laboratory only in 2004. Since then, its uses have been constantly expanding.

From visions to the microscope

It often happens that the visions of scientists and technologists about future developments in one field or another precede the actual theoretical and practical developments and transformations by several years. For this reason, texts written and made public in the past can remain in obscurity, or even insignificance, until they need to be mobilized within the framework of capitalist technological restructuring. In this way a continuity is constructed in the narrative of the history of the sciences and their technological achievements, with an obvious ideological role for the citizenry. The least that happens is the obscuring of the historical conditions and necessities of capital through which research and technology are promoted and produced.

Under these circumstances, a speech given in 1959 by the renowned theoretical physicist Richard Feynman[4], titled “There’s Plenty of Room at the Bottom” and concerning the possibility of manipulating individual atoms, was retrospectively designated as the theoretical starting point of nanotechnology. We quote here an excerpt showing the impact that the then-recent biological “discoveries” had on this 1959 speech.

The biological example of recording information on a small scale has inspired me to think of something that should be possible. Biology is not merely the recording of information; it also does something with it. A biological system can be extremely small. Many of the cells are very tiny, yet they are very active; they manufacture various substances; they move around; they wriggle; and they do all kinds of wonderful things – all of this on a very small scale. Moreover, they store information. Consider the possibility that we too could make something very small that does what we want – that we could construct an object that operates at this level!

Biology and information theory had already begun to converge, tending toward the new bioinformatic paradigm. And if the cells of living organisms can “encode information” and work toward this purpose, why couldn’t the same be true for any particle of the same or even smaller scale? The idea of machines that could mechanically construct, atom by atom or molecule by molecule, other machines and materials thus acquires its own metaphysics through the imitation of biological processes, once these have been interpreted and analyzed as bioinformatic mechanisms.

At another point in the same text, Feynman, referring to possible practical applications, presents the “wild ideas” (as he himself characterizes them) of his friend and former doctoral student Albert Hibbs:

It would be interesting for surgery, if we could swallow the surgeon. You place the mechanical surgeon inside a blood vessel and it goes into the heart to “take a look” […] it finds which valve is defective, takes a small knife and cuts it. Other small machines could be permanently embedded in the body to assist an organ that functions inadequately.
And now comes the interesting question: How can we create such a microscopic organism? […]

On the verge of science fiction for its time, it was logical and expected that this text would remain obscure for several years. It was enough, however, to later earn the already famous Professor Feynman yet another title: that of “father of nanotechnology.” In fact, the term “nanotechnology” was used quite a few years later, first by the Japanese engineer Norio Taniguchi in 1974, regarding techniques for manufacturing semiconductor materials at the nanometer scale. But the widespread adoption of the term dates to 1986 and is attributed to engineer Eric Drexler, who, inspired by Feynman’s visions, described his own version of nanotechnology in his book “Engines of Creation: The Coming Era of Nanotechnology”. Moving between the then-contemporary achievements of molecular biology, chemistry, and materials technology and science-fiction scenarios of mass destruction, Drexler envisions a new technological breakthrough: self-replicating nano-assemblers controlled by artificial intelligence systems, capable of constructing and repairing everything (spacecraft, skyscrapers, the natural environment, human cells, thus ensuring super-longevity, if not immortality); and naturally (!), these machines would be capable of rendering (once again in history!) human labor obsolete. Moreover, according to Drexler, the state that first achieves this technological leap will leave decades behind those states that fail to follow developments closely enough, not to mention the risk of new weapons of mass destruction, and so on…

This particular book of popularized futurist engineering could be compared with several earlier or later science fiction novels and come off worse. It might not have been worth mentioning had it not lent part of its title to a plethora of modern technological applications. Nevertheless, the ability to find efficient techniques for construction with atomic precision remains, in a way, the holy grail of the field, and funding for nanotechnology research and development is directed toward this end as well.

We started by saying that the nanoscale is invisible to the human eye. In fact, it is equally invisible through any optical microscope. This is because the spectrum of frequencies the human eye can perceive inherently limits its resolving power to fractions of a millimeter (mm, one thousandth of a meter), while with optical microscopes we can see down to the scale of fractions of a micrometer (μm, one millionth of a meter). The nanometer scale (1-100 nm) required, and still requires, indirect methods of imaging and characterizing the properties of matter. The X-ray diffraction used, among other things, for the “discovery” of the double-helix structure of the DNA molecule required, on the one hand, the rearrangement of molecular structures into a crystalline form repeating in three dimensions and, on the other, complex mathematical relationships so that the patterns resulting from the diffraction of the rays could be mapped into an intelligible image.
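The resolution ceiling of optical microscopes mentioned above follows from the diffraction limit of visible light. A rough sketch using the Abbe criterion, d = λ / (2·NA), with illustrative values assumed for the wavelength and the numerical aperture of a good objective:

```python
# Abbe diffraction limit: smallest resolvable distance d = wavelength / (2 * NA)
wavelength_nm = 550        # green light, middle of the visible spectrum (assumed)
numerical_aperture = 1.4   # a high-end oil-immersion objective (assumed)

d_nm = wavelength_nm / (2 * numerical_aperture)
print(f"best optical resolution: ~{d_nm:.0f} nm")  # ~196 nm, a fraction of a micrometer
```

The 1-100 nm range of nanostructures sits well below this limit, which is precisely why indirect methods such as X-ray diffraction were needed.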

As long as the nanoscale could not be represented in a way sufficiently convincing to the human eye, the possibility of handling matter at this level remained wishful thinking. Although the electron microscope allowed, under strict conditions (e.g. high vacuum and sample modifications), the tracing of nanostructures within cells and inorganic matter, and found widespread use in biology and medicine, its use requires high specialization, while the result is not always usable or successful.

In 1981, the Scanning Tunneling Microscope was invented in IBM’s laboratories in Zurich. Utilizing the quantum tunneling phenomenon and a tip sharpened to an atomic-scale point, this tool performs electro-mechanical scanning of surfaces atom by atom. It is not merely an evolution of previous techniques based on radiation or electron emission, but something radically different that should not even be called a “microscope”; a more fitting name would be an atomic-precision electro-mechanical quantum scanner. A few years later, in 1986, atomic force microscopes were invented, which likewise allow the mechanical scanning of surfaces and their imaging by computer. In 1989, in another IBM laboratory, a specially modified scanning tunneling microscope was used to place 35 individual xenon atoms on a nickel surface in such a way that the company’s initials could be read with the same tool, symbolizing the new possibilities of mechanical manipulation of matter at the nanoscale. We say “symbolizing” because, in reality, at such scales the factors affecting motion and the forces between particles differ significantly from anything we consider mechanical handling at the human scale.

Of course, it is not within our scope to give an analytical account of all the “microscopes” belonging to the aforementioned families of tools, nor of the achievements that emerged from their use. We mention them primarily to show which tools made it possible to place the visions of nanotechnology under the microscope, something that several years later evolved (and continues to evolve) into an intense race of investments and technological “innovations”.

In the early ’90s, some pioneering companies related to what was then considered nanotechnology began to appear. During that decade, a series of “innovations” began to be published from the laboratories of companies and academic institutions in several related sectors, mainly materials technology, nanoelectronic systems, and chemical engineering.

During the first decade of the 21st century, a plethora of consumer goods began to appear on the market as nanotechnology products: scratch-resistant car protectors, golf balls, tennis rackets, baseball bats, antibacterial socks, stain-resistant and wrinkle-free clothing, cosmetics, scratch-resistant glasses, faster-charging batteries, better screens, mobile phones, digital cameras… (Phew!! And life somehow became better this way!)

We don’t know if these products are miraculous thanks to the nanotechnologies they carry or more so because of the hype surrounding them. The story of strategic investments in nanotechnologies began somewhere in the early part of the previous decade and, of course, does not concern only trousers, shirts and screens.

Strategic investments in harsh weather conditions

It was exactly in that period, in the early 21st century, that forty-year-old visions and the advanced nanoscale imaging tools already in use since the 1980s were mobilized into national planning and systematically incorporated into state budgets for research and technology. It is the same period in which the outbreak of the current capitalist crisis had already begun to become visible to corporate specialists. Capital’s need for more intensive research, aimed at expanding into new markets through the development of materials and technologies that could radically reshape its technological infrastructure, thus emerged at that time, providing the initial impetus for the advancement, among other things, of nanotechnology. In the first phase of these national programs, the competition to develop a knowledge base could perhaps, somewhat loosely, be considered a process of primary accumulation of technological know-how.

The “national strategies” for funding and promoting new technologies are nothing new in the history of capitalism; rather, they are the rule. They become the rule all the more when the possibilities for exploiting the productive forces begin to narrow under the threat of a serious economic crisis. Under such conditions, research into new technologies and the need to expand into new markets could not be left “unchecked” to the “laissez faire” of the markets, but had to constitute yet another central pillar of the state and its plans for managing and overcoming the crisis.

In January 2000, the National Nanotechnology Initiative was established in the USA by then President Clinton with an initial annual funding of 495 million dollars. As stated in the official inaugural report of this organization:

“The President makes the National Nanotechnology Initiative a top priority. Nanotechnology builds on current advances in chemistry, physics, biology, engineering, medicine, and materials research and contributes to the interdisciplinary education of the 21st century’s science and technology workforce. The Administration believes that nanotechnology will have a profound impact on our economy and society, perhaps comparable to that of information technologies or cellular, genetic, and molecular biology.”

According to the same text, the impact of nanotechnology is expected to include the following sectors:

  • Materials and production methods
  • Nanoelectronics and computer technology
  • Medicine and health system
  • Aeronautics and space exploration
  • Environment and energy
  • Biotechnology and agriculture
  • National security and other government applications
  • Science and education
  • Global trade and competitiveness.

Four years later (2004), in another reference text by the same organization[5] regarding the progress of production methods at the nanoscale, the roadmap of expected results from the national research and development program is outlined:


The capabilities of nanotechnology for systematic control and production at the nanoscale have been categorized in its evolution into four overlapping generations of nanotechnology products:
– 1st Generation (starting around 2000): passive (stable-function) nanostructures, such as nanostructured coatings, nanoparticles, nanowires, and large-sized materials resulting from nanostructures;
– 2nd Generation (starting around 2005): active (evolving-function) nanostructures, such as transistors, amplifiers, targeted pharmaceuticals and chemicals, sensors, actuators, and adaptive structures;
– 3rd Generation (starting around 2010): Three-dimensional nanosystems and nanosystem-of-systems using diverse synthesis and fabrication techniques such as bio-fabrication, nanoscale robotics, nanoscale networking, and multi-scale size architecture;
– 4th Generation (starting around 2015): Heterogeneous molecular nanosystems, where each molecule in the nanosystem has a specific structure and plays a different role. Molecules will be used as devices and fundamentally new functions will emerge from the designed structures and architectures.

We cannot know whether American laboratories have adhered strictly to the roadmap. Most likely not. After all, it is extremely difficult to constantly monitor developments across a plethora of technological sectors and socio-political areas that aim to penetrate ever deeper into the nanoscale. The fact that annual U.S. government spending on nanotechnology gradually increased from the initial $500 million to $1 billion by 2006 is an indication that the first generation (that of nanomaterials) has fared well. Moreover, the majority of newly announced “innovations” related to nanotechnologies primarily concern the field of materials technology. At the same time, however, the publications coming out of research laboratories also pertain to the next generations of products. In a White House announcement of 20/05/2015 regarding the creation of initiatives for the commercialization of nanotechnology, the amount spent in the U.S. from 2000 until today is estimated at over $20 billion. The troubling difficulty with which the results of these costly research efforts leave the laboratories and enter production may mean that the subsequent generations (2nd, 3rd, 4th), despite any successes in the laboratories, have not yet (been able to) find widespread commercial applications.

On a similar wavelength, the European Commission in 2004, through the report “Towards a European Strategy for Nanotechnology”, set as a goal the cooperation of member states for competitive research and development in nanotechnology (with the USA and Japan as the main rivals). According to a European Commission report[6], the amount spent in the EU from 2000 to 2007/2008 is estimated at 6-7 billion euros, comparable to that spent by the main global competitors[7]. Through the 7th Framework Programme (FP7), during 2007-2011 alone (the program ran until 2013), the amount spent exceeded 2.5 billion euros, distributed across 1,400 different projects covering various sectors. This trend is expected to continue with the programs announced within the EU’s financial framework that began in 2014, named Horizon 2020.

In any case, and to summarize: the visions were not lost; and as for the investments, the billions of dollars and euros spent on research into all kinds of nanotechnologies, ways must be found so that they do not end up as wasted money. This is to be achieved through the commercialization of the research applications produced so far and their release as technological products.

The usual course by which various technological “innovations” enter widespread consumption should also be taken as a given, even if only schematically: they usually start from military and/or aerospace applications and surveillance technologies[8]; they are then advertised as serving “the good of humanity” through some medical applications; and finally they pass into the production of mass technological consumer products.

Within this framework, the promises of the visionaries have given way to leaks and announcements about the technologies of a not-so-distant future. The possibilities of integration, that is, of adding nanotechnological products inside the body (something like the swallowable surgeon of Hibbs mentioned above), straight out of the X lab of the well-known Google, were presented in October 2014 somewhat like this:


Essentially, the idea is simple; you just swallow a pill with nanoparticles, which carry antibodies or molecules that detect other molecules […] these travel through the body, and because the cores of these particles are magnetic, you can gather them somewhere and ask them what they saw. […] If you look at your wrist, you can see these superficial veins; just by placing a magnet there, you can trap the nanoparticles. We ask them: What did you see? Did you find cancer? Did you see something that looks like a fragile plaque, a warning of heart attack? Did you see excessive sodium?

An extremely simple idea, isn’t it?

Beyond the perpetually belated objections of every bio/nano-ethics regarding the potential toxicity of specific products or the (as yet hypothetical) risks of mass nanobot production, nanotechnologies will continue to occupy us. First and foremost as a fundamental postmodern technological supplement to the new bioinformatic capitalist model and its expansion.

Program Error
cyborg #03 – 06/2015

  1. We are referring to the Large Hadron Collider (LHC) of CERN. On the CERN website, the LHC is triumphantly referred to as ‘the largest machine in the world’, even though the particles it examines are the smallest possible… ↩︎
  2. This definition comes from the official founding report of this organization, titled “National Nanotechnology Initiative: Leading to the Next Industrial Revolution” and issued in February 2000. ↩︎
  3. Officially, the term “nanotechnology” is usually used in the singular. In the rest of the text as well as in the title, we use the term in the plural, since, as we want to show from the beginning, it does not refer to a single field of technology, but includes a variety of different technologies and application areas. ↩︎
  4. Besides the Nobel Prize in Physics for his work on quantum electrodynamics, Feynman’s biography also includes his participation in the Manhattan Project; that is, the effort to construct the first atomic bomb. ↩︎
  5. “Manufacturing at the Nanoscale, Report of the National Nanotechnology Initiative Workshops 2002-2004”, USA 2007. ↩︎
  6. Nanotechnology: the invisible giant tackling Europe’s future challenges, European Commission, 2013 ↩︎
  7. The total amount spent globally on nanotechnology research is estimated at $67 billion for the period 2000-2013. ↩︎
  8. An example of this is the “Institute for Soldier Nanotechnologies” at MIT. We quote from its homepage: “The Institute for Soldier Nanotechnologies is a collaborative group – MIT, the Army, and industry – working to discover and apply technologies that dramatically improve soldier protection and survivability”. University, army, and industry; not strange at all… ↩︎