Exploratory engineering

Exploratory engineering is a term coined by K. Eric Drexler to describe the process of designing and analyzing detailed hypothetical models of systems that are not feasible with current technologies or methods, but that do seem to lie clearly within the bounds of what science considers possible within the narrowly defined scope of operation of the hypothetical system model. It usually results in paper or video prototypes, or (more likely nowadays) computer simulations that are as convincing as possible, given the lack of experimental confirmation, to those who know the relevant science. By analogy with protoscience, it might be considered a form of protoengineering.

Usage

Due to the difficulty and necessity of anticipating results in such areas as genetic modification, climate change, molecular engineering, and megascale engineering, parallel fields such as bioethics, climate engineering and hypothetical molecular nanotechnology sometimes emerge to develop and examine hypotheses, define limits, and express potential solutions to the anticipated technological problems. Proponents of exploratory engineering contend that it is an appropriate initial approach to such problems.

Engineering is concerned with the design of a solution to a practical problem. A scientist may ask "why?" and proceed to research the answer to the question. By contrast, engineers want to know how to solve a problem, and how to implement that solution. Exploratory engineering often posits that a highly detailed solution exists, and explores the putative characteristics of such a solution, while holding in abeyance the question of how to implement that solution. If a point can be reached where the attempted implementation of the solution is addressed using the principles of engineering physics, the activity transitions from protoengineering to actual engineering, and results in success or failure to implement the design.

Requirements

Unlike the scientific method, which relies on peer-reviewed experiments that attempt to confirm or disprove a falsifiable hypothesis, exploratory engineering relies on peer review, simulation, and other methods employed by scientists, but applies them to a hypothetical artifact: a specific, detailed hypothesized design or process rather than an abstract model or theory. Because of this inherent lack of experimental falsifiability, its practitioners must take particular care to avoid falling into practices analogous to cargo cult science, pseudoscience, and pathological science.

Criticism

Exploratory engineering has its critics, who dismiss the activity as mere armchair speculation, albeit speculation with computer assistance. The boundary that would take exploratory engineering out of the realm of mere speculation and establish it as a realistic design activity is often indiscernible to such critics, and at the same time often inexpressible by its proponents. Both camps generally agree that much of the highly detailed simulation effort in the field may never result in a physical device. Their dichotomy is exemplified by molecular nanotechnology: proponents contend that many complicated molecular machinery designs will become realizable after an unspecified "assembler breakthrough" envisioned by K. Eric Drexler, while critics contend that this attitude embodies wishful thinking equivalent to that in the famous Sidney Harris cartoon "And then a miracle occurs", published in American Scientist (ISBN 0-913232-39-4). In summary, the critics contend that a hypothetical model which is both self-consistent and consistent with the laws of science concerning its operation provides, in the absence of a path to build the device modeled, no evidence that the desired device can be built. Proponents counter that there are so many potential ways to build the desired device that at least one of them will surely avoid a critical flaw preventing the device from being built.

Science fiction

Both proponents and critics often point to science fiction stories as the origin of exploratory engineering. On the positive side of the science fiction ledger, the ocean-going submarine, the telecommunications satellite, and other inventions were anticipated in such stories before they could be built.[1] On the negative side of the same ledger, other science fiction devices such as the space elevator may be forever impossible because of basic strength-of-materials limits or other difficulties, whether anticipated or unanticipated.
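The strength-of-materials point can be made quantitative, in the style of an exploratory-engineering feasibility check. As an illustrative sketch (the material figures below are rough, representative values, not drawn from this article), the following Python snippet computes the specific work needed to climb a rotating tether from Earth's surface to geostationary orbit, and the taper ratio a constant-stress space-elevator cable would need for several candidate materials:

```python
import math

# Physical constants for Earth (SI units).
GM = 3.986e14        # gravitational parameter, m^3 s^-2
R_EARTH = 6.378e6    # equatorial radius, m
R_GEO = 4.2164e7     # geostationary orbital radius, m
OMEGA = 7.292e-5     # sidereal rotation rate, rad/s

def specific_work():
    """Work per kilogram (J/kg) to move mass from the surface to GEO
    along a rotating tether: gravitational gain minus the centrifugal
    contribution of Earth's rotation. Comes out near 48 MJ/kg."""
    gravitational = GM * (1.0 / R_EARTH - 1.0 / R_GEO)
    centrifugal = 0.5 * OMEGA**2 * (R_GEO**2 - R_EARTH**2)
    return gravitational - centrifugal

def taper_ratio(strength_pa, density_kg_m3):
    """Cross-sectional area at GEO relative to the ground for a
    constant-stress cable: exp((rho / sigma) * specific work)."""
    return math.exp(density_kg_m3 / strength_pa * specific_work())

# Rough, representative material properties: (tensile strength Pa, density kg/m^3).
materials = {
    "steel":           (2.0e9, 7800.0),
    "Kevlar":          (3.6e9, 1440.0),
    "carbon nanotube": (5.0e10, 1300.0),   # theoretical strength estimate
}

for name, (sigma, rho) in materials.items():
    print(f"{name}: taper ratio ~ {taper_ratio(sigma, rho):.3g}")
```

Under these assumed figures, a steel cable would need an astronomically large taper ratio, Kevlar a ratio of order a hundred million, while a material near the theoretical strength of carbon nanotubes would need only a modest taper; the calculation says nothing about how such a cable could actually be manufactured, which is precisely the distinction the critics emphasize.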

Related Research Articles

K. Eric Drexler American engineer

Kim Eric Drexler is an American engineer best known for studies of the potential of molecular nanotechnology (MNT) in the 1970s and 1980s. His 1991 doctoral thesis at the Massachusetts Institute of Technology was revised and published as the book Nanosystems: Molecular Machinery, Manufacturing, and Computation (1992), which received the Association of American Publishers award for Best Computer Science Book of 1992.

Molecular nanotechnology Technology

Molecular nanotechnology (MNT) is a technology based on the ability to build structures to complex, atomic specifications by means of mechanosynthesis. This is distinct from nanoscale materials. Based on Richard Feynman's vision of miniature factories using nanomachines to build complex products, this advanced form of nanotechnology would make use of positionally-controlled mechanosynthesis guided by molecular machine systems. MNT would involve combining physical principles demonstrated by biophysics, chemistry, other nanotechnologies, and the molecular machinery of life with the systems engineering principles found in modern macroscale factories.

Nanotechnology Field of applied science whose theme is the control of matter on atomic and (supra)molecular scale

Nanotechnology, also shortened to nanotech, is the use of matter on an atomic, molecular, and supramolecular scale for industrial purposes. The earliest, widespread description of nanotechnology referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defined nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form "nanotechnologies" as well as "nanoscale technologies" to refer to the broad range of research and applications whose common trait is size.

Computer science is the study of the theoretical foundations of information and computation and their implementation and application in computer systems. One well known subject classification system for computer science is the ACM Computing Classification System devised by the Association for Computing Machinery.

Gray goo is a hypothetical global catastrophic scenario involving molecular nanotechnology in which out-of-control self-replicating machines consume all biomass on Earth while building more of themselves, a scenario that has been called ecophagy. The original idea assumed machines were designed to have this capability, while popularizations have assumed that machines might somehow gain this capability by accident.

Molecular engineering Field of study in molecular properties

Molecular engineering is an emerging field of study concerned with the design and testing of molecular properties, behavior and interactions in order to assemble better materials, systems, and processes for specific functions. This approach, in which observable properties of a macroscopic system are influenced by direct alteration of a molecular structure, falls into the broader category of “bottom-up” design.

Molecular assembler Proposed nanotechnological device

A molecular assembler, as defined by K. Eric Drexler, is a "proposed device able to guide chemical reactions by positioning reactive molecules with atomic precision". A molecular assembler is a kind of molecular machine. Some biological molecules such as ribosomes fit this definition. This is because they receive instructions from messenger RNA and then assemble specific sequences of amino acids to construct protein molecules. However, the term "molecular assembler" usually refers to theoretical human-made devices.

Megascale engineering is a form of exploratory engineering concerned with the construction of structures on an enormous scale. Typically these structures are at least 1,000 km (620 mi) in length—in other words, at least one megameter, hence the name. Such large-scale structures are termed megastructures.

Computational science, also known as scientific computing or scientific computation (SC), is a rapidly growing field that uses advanced computing capabilities to understand and solve complex problems. It is an area of science which spans many disciplines, but at its core, it involves the development of models and simulations to understand natural systems.

Outline of transhumanism Topical guide to transhumanism

The following outline provides an overview of and a topical guide to transhumanism, an international intellectual and cultural movement that affirms the possibility and desirability of fundamentally transforming the human condition by developing and making widely available technologies to eliminate aging and to greatly enhance human intellectual, physical and psychological capacities. Transhumanist thinkers study the potential benefits and dangers of emerging and hypothetical technologies that could overcome fundamental human limitations as well as study the ethical matters involved in developing and using such technologies. They predict that human beings may eventually be able to transform themselves into beings with such greatly expanded abilities as to merit the label posthuman.

The history of nanotechnology traces the development of the concepts and experimental work falling under the broad category of nanotechnology. Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time. The emergence of nanotechnology in the 1980s arose from the convergence of experimental advances, such as the invention of the scanning tunneling microscope in 1981 and the discovery of fullerenes in 1985, with the elucidation and popularization of a conceptual framework for the goals of nanotechnology, beginning with the 1986 publication of the book Engines of Creation. The field became the subject of growing public awareness and controversy in the early 2000s, with prominent debates about both its potential implications and the feasibility of the applications envisioned by advocates of molecular nanotechnology, and with governments moving to promote and fund nanotechnology research. The early 2000s also saw the beginnings of commercial applications of nanotechnology, although these were limited to bulk uses of nanomaterials rather than the transformative applications envisioned by the field.

The following outline is provided as an overview of and topical guide to nanotechnology:

The societal impact of nanotechnology comprises the potential benefits and challenges that the introduction of novel nanotechnological devices and materials may hold for society and human interaction. The term is sometimes expanded to include nanotechnology's health and environmental impact, but this article considers only its social and political impact.

Modeling and simulation (M&S) is the use of models as a basis for simulations to develop data utilized for managerial or technical decision making.

Wet nanotechnology involves building up large structures from small components, working in the watery environments characteristic of living systems.

Drexler–Smalley debate on molecular nanotechnology

The Drexler–Smalley debate on molecular nanotechnology was a public dispute between K. Eric Drexler, the originator of the conceptual basis of molecular nanotechnology, and Richard Smalley, a recipient of the 1996 Nobel Prize in Chemistry for the discovery of the nanomaterial buckminsterfullerene. The dispute was about the feasibility of constructing molecular assemblers, which are molecular machines that could robotically assemble molecular materials and devices by manipulating individual atoms or molecules. The concept of molecular assemblers was central to Drexler's conception of molecular nanotechnology, but Smalley argued that fundamental physical principles would prevent them from ever being possible. The two also traded accusations that the other's conception of nanotechnology was harmful to public perception of the field and threatened continued public support for nanotechnology research.

Alejandro Strachan

Alejandro Strachan is a scientist in the field of computational materials and a professor of materials engineering at Purdue University. Before joining Purdue University, he was a staff member at Los Alamos National Laboratory.

Andres Jaramillo-Botero is a Colombian-American scientist and professor, working in computational chemical physics, known for his contributions to first-principles-based modeling, design, and characterization of nanoscale materials and devices.

This glossary of nanotechnology is a list of definitions of terms and concepts relevant to nanotechnology, its sub-disciplines, and related fields.

References

2. Eric Drexler: "Physical Laws and the Future of Nanotechnology". Inaugural Lecture of the Oxford Martin Program, February 2012. https://www.youtube.com/watch?v=zQHA-UaUAe0