Telerobotics is the area of robotics concerned with the control of semi-autonomous robots from a distance, chiefly using television, wireless networks (like Wi-Fi, Bluetooth and the Deep Space Network) or tethered connections. It is a combination of two major subfields, which are teleoperation and telepresence.
Teleoperation indicates operation of a machine at a distance. It is similar in meaning to the phrase "remote control" but is usually encountered in research, academic and technical environments. It is most commonly associated with robotics and mobile robots but can be applied to a whole range of circumstances in which a device or machine is operated by a person from a distance. [1]
Teleoperation is the most standard term, used in both research and technical communities, for referring to operation at a distance. This is opposed to "telepresence", which refers to the subset of telerobotic systems configured with an immersive interface such that the operator feels present in the remote environment, projecting their presence through the remote robot. One of the first telepresence systems that enabled operators to feel present in a remote environment through all of the primary senses (sight, sound, and touch) was the Virtual Fixtures system developed at the U.S. Air Force Research Laboratory in the early 1990s. The system enabled operators to perform dexterous tasks (inserting pegs into holes) remotely such that the operator would feel as if they were inserting the pegs, when in fact a robot was remotely performing the task. [2] [3] [4]
A telemanipulator (or teleoperator) is a device that is controlled remotely by a human operator. In simple cases the controlling operator's command actions correspond directly to actions in the device controlled, as for example in a radio-controlled model aircraft or a tethered deep submergence vehicle. Where communications delays make direct control impractical (such as a remote planetary rover), or it is desired to reduce operator workload (as in a remotely controlled spy or attack aircraft), the device will not be controlled directly, instead being commanded to follow a specified path. At increasing levels of sophistication the device may operate somewhat independently in matters such as obstacle avoidance, also commonly employed in planetary rovers.
Devices designed to allow the operator to control a robot at a distance are sometimes called telecheric robotics.
Two major components of telerobotics and telepresence are the visual and control applications. A remote camera provides a visual representation of the view from the robot. Placing the robotic camera in a perspective that allows intuitive control is an idea rooted in science fiction (Robert A. Heinlein's 1942 short story "Waldo"), but only recently have the speed, resolution and bandwidth become adequate for controlling the robot camera in a meaningful way. Using a head-mounted display, control of the camera can be facilitated by tracking the operator's head movements.
This works only if the user is comfortable with the latency of the system, that is, the lag between a movement and its visual representation. Issues such as inadequate resolution, latency of the video image, lag in the mechanical and computer processing of movement and response, and optical distortion from the camera lens and head-mounted display lenses can cause 'simulator sickness', which is exacerbated by the absence of vestibular stimulation accompanying the visual representation of motion.
Mismatches between the user's motions and the system's response, such as registration errors, lag in movement response due to over-filtering, inadequate resolution for small movements, and slow speed, can contribute to these problems.
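As an illustration only (none of this code comes from a particular telerobotic product), the following Python sketch shows one common way a head tracker's yaw and pitch might be mapped to remote camera pan/tilt commands, using a dead-band and a rate limit to soften the jitter and over-filtering problems described above. The tracker and camera interfaces are hypothetical placeholders.

```python
# Minimal sketch (hypothetical interfaces): mapping head-tracker yaw/pitch to
# remote camera pan/tilt commands, with a dead-band and a rate limit.

import time

DEAD_BAND_DEG = 0.5      # ignore tiny head jitters
MAX_RATE_DEG_S = 60.0    # cap the commanded slew rate

def head_to_camera(head_yaw, head_pitch, cam_yaw, cam_pitch, dt):
    """Return new camera pan/tilt angles that follow the head pose."""
    new_angles = []
    for target, current in ((head_yaw, cam_yaw), (head_pitch, cam_pitch)):
        error = target - current
        if abs(error) < DEAD_BAND_DEG:          # dead-band: hold position
            new_angles.append(current)
            continue
        step = max(-MAX_RATE_DEG_S * dt, min(MAX_RATE_DEG_S * dt, error))
        new_angles.append(current + step)       # rate-limited slew
    return tuple(new_angles)

if __name__ == "__main__":
    cam_yaw = cam_pitch = 0.0
    # Simulated head poses in degrees; a real system would read a tracker.
    for head_yaw, head_pitch in [(10.0, -5.0), (10.2, -5.1), (30.0, 0.0)]:
        t0 = time.monotonic()
        cam_yaw, cam_pitch = head_to_camera(head_yaw, head_pitch,
                                            cam_yaw, cam_pitch, dt=0.05)
        latency_ms = (time.monotonic() - t0) * 1000  # local processing share of lag
        print(f"cam pan={cam_yaw:.1f} tilt={cam_pitch:.1f} "
              f"(local processing {latency_ms:.2f} ms)")
```

In a real system the dominant latency comes from video encoding and network transport rather than this local processing, which is why the dead-band and rate limit alone cannot eliminate simulator sickness.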
The same technology can control the robot itself, but then the eye–hand coordination issues become even more pervasive throughout the system, and user tension or frustration can make the system difficult to use.[ citation needed ]
The tendency in robot design has been to minimize the degrees of freedom, because that reduces the control problem. Recent improvements in computers have shifted the emphasis to more degrees of freedom, allowing robotic devices that seem more intelligent and more human in their motions. This also allows more direct teleoperation, as the user can control the robot with their own motions. [5]
A telerobotic interface can be as simple as a common MMK (monitor-mouse-keyboard) interface. While this is not immersive, it is inexpensive. Telerobotic systems driven by internet connections are often of this type. A valuable modification to MMK is a joystick, which provides a more intuitive navigation scheme for planar robot movement.
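The following minimal sketch (with invented parameter names and limits, not a specific product's API) illustrates how such a joystick interface is typically mapped to planar robot motion: two normalized stick axes become a forward speed and a turn rate, with a dead zone to reject stick noise.

```python
# Minimal sketch (assumed names and limits): mapping two joystick axes to
# planar velocity commands (forward speed and turn rate).

def joystick_to_velocity(axis_x, axis_y,
                         max_speed=0.5, max_turn=1.0, dead_zone=0.1):
    """Convert normalized joystick axes in [-1, 1] to (v, omega)."""
    def shape(value, limit):
        if abs(value) < dead_zone:       # ignore stick noise near center
            return 0.0
        return max(-limit, min(limit, value * limit))
    v = shape(axis_y, max_speed)         # push forward -> drive forward (m/s)
    omega = shape(-axis_x, max_turn)     # push right -> turn clockwise (rad/s)
    return v, omega

if __name__ == "__main__":
    print(joystick_to_velocity(0.0, 1.0))    # full forward: (0.5, 0.0)
    print(joystick_to_velocity(0.05, 0.05))  # inside dead zone: (0.0, 0.0)
```

A real interface would stream these (v, omega) pairs to the robot at a fixed rate over the network connection; the shaping and clamping shown here are what makes the joystick feel more intuitive than keyboard keys.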
Dedicated telepresence setups utilize a head-mounted display with either single or dual eye display, and an ergonomically matched interface with joystick and related button, slider, trigger controls.
Other interfaces merge fully immersive virtual reality interfaces and real-time video instead of computer-generated images. [6] Another example would be to use an omnidirectional treadmill with an immersive display system so that the robot is driven by the person walking or running. Additional modifications may include merged data displays such as Infrared thermal imaging, real-time threat assessment, or device schematics.[ citation needed ]
With the exception of the Apollo program, most space exploration has been conducted with telerobotic space probes. Most space-based astronomy, for example, has been conducted with telerobotic telescopes. The Russian Lunokhod-1 mission, for example, put a remotely driven rover on the Moon, which was driven in real time (with a 2.5-second lightspeed time delay) by human operators on the ground. Robotic planetary exploration programs use spacecraft that are programmed by humans at ground stations, essentially achieving a long-time-delay form of telerobotic operation. Recent noteworthy examples include the Mars exploration rovers (MER) and the Curiosity rover. In the case of the MER mission, the spacecraft and the rover operated on stored programs, with the rover drivers on the ground programming each day's operation. The International Space Station (ISS) uses a two-armed telemanipulator called Dextre. More recently, the humanoid robot Robonaut [8] has been added to the space station for telerobotic experiments.
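The following sketch (illustrative only; the command names and durations are invented and bear no relation to actual mission software) shows the general shape of such long-time-delay operation: operators assemble a full day's command sequence on the ground, and the rover later executes the stored program unattended.

```python
# Illustrative sketch of stored-sequence (long-time-delay) operation.
# Command names and durations are invented for illustration.

from dataclasses import dataclass

@dataclass
class Command:
    name: str
    duration_s: float

def build_daily_sequence():
    """Ground side: rover drivers plan one day of activity in advance."""
    return [
        Command("drive_forward_5m", 300.0),
        Command("image_panorama", 600.0),
        Command("deploy_spectrometer", 900.0),
    ]

def execute_sequence(sequence):
    """Rover side: run the stored program without real-time supervision."""
    elapsed = 0.0
    for cmd in sequence:
        elapsed += cmd.duration_s
        print(f"executing {cmd.name} ({cmd.duration_s:.0f} s)")
    print(f"sequence complete after {elapsed / 60:.0f} minutes")

if __name__ == "__main__":
    execute_sequence(build_daily_sequence())
```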
NASA has proposed the use of highly capable telerobotic systems [9] for future planetary exploration, with human explorers operating the robots from orbit. In a concept for Mars exploration proposed by Landis, a precursor mission could bring a crew to Mars but remain in orbit rather than landing on the surface, while a highly capable remote robot is operated in real time on the surface. [10] Such a system would go beyond simple long-time-delay robotics to a regime of virtual telepresence on the planet. One study of this concept, the Human Exploration using Real-time Robotic Operations (HERRO) concept, suggested that such a mission could be used to explore a wide variety of planetary destinations. [7]
The prevalence of high quality video conferencing using mobile devices, tablets and portable computers has enabled a drastic growth in telepresence robots to help give a better sense of remote physical presence for communication and collaboration in the office, home, school, etc. when one cannot be there in person. The robot avatar can move or look around at the command of the remote person. [11] [12]
There have been two primary approaches that both utilize videoconferencing on a display.
Traditional videoconferencing systems and telepresence rooms generally offer pan-tilt-zoom cameras with far-end control. The ability of the remote user to turn the device's head and look around naturally during a meeting is often seen as the strongest feature of a telepresence robot. For this reason, developers have created a new category of desktop telepresence robots that concentrate on this feature at a much lower cost. Desktop telepresence robots, also called "head-and-neck robots", [14] allow users to look around during a meeting and are small enough to be carried from location to location, eliminating the need for remote navigation. [15]
Telepresence robots can be particularly helpful for children with long-term illnesses who are unable to attend school regularly, allowing them to stay connected with their classmates and helping them overcome loneliness. [16]
Marine remotely operated vehicles (ROVs) are widely used to work in water too deep or too dangerous for divers. They repair offshore oil platforms and attach cables to sunken ships to hoist them. They are usually attached by a tether to a control center on a surface ship. The wreck of the Titanic was explored by an ROV, as well as by a crew-operated vessel.
Additionally, a lot of telerobotic research is being done in the field of medical devices, and minimally invasive surgical systems. With a robotic surgery system, a surgeon can work inside the body through tiny holes just big enough for the manipulator, with no need to open up the chest cavity to allow hands inside.
NIST maintains a set of test standards used for Emergency Response [17] and law enforcement telerobotic systems. [18] [19]
Remote manipulators are used to handle radioactive materials.
Telerobotics has been used in installation art pieces; the Telegarden is an example of a project in which a robot was operated by users through the Web.
Uncrewed spacecraft or robotic spacecraft are spacecraft without people on board. Uncrewed spacecraft may have varying levels of autonomy from human input, such as remote control or remote guidance. They may also be autonomous, in which case they follow a pre-programmed list of operations that will be executed unless otherwise instructed. A robotic spacecraft used for scientific measurements is often called a space probe or space observatory.
The Nomad rover is an uncrewed vehicle designed as a testbed for vehicles intended to operate on other planets.
Haptic technology is technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek: ἁπτικός (haptikos), means "tactile, pertaining to the sense of touch". Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.
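As a rough illustration of the telerobotic use of haptics (hypothetical interface, not a specific device's API), a force measured at the remote manipulator might be scaled and clipped before being rendered on the operator's haptic device, so that the device's actuator limits are respected.

```python
# Minimal sketch (hypothetical interface): scaling a remotely measured contact
# force into a feedback force within the operator device's actuator range.

def render_feedback_force(measured_force_n, scale=0.3, max_device_force_n=3.0):
    """Scale the remote contact force down to the haptic device's range."""
    commanded = measured_force_n * scale
    return max(-max_device_force_n, min(max_device_force_n, commanded))

if __name__ == "__main__":
    for f in (1.0, 8.0, 20.0, -15.0):
        print(f"remote {f:5.1f} N -> operator feels "
              f"{render_feedback_force(f):4.1f} N")
```

The scale factor trades fidelity for safety: a stronger remote contact is still felt, but it can never exceed what the operator-side device can safely render.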
Telepresence refers to a set of technologies which allow a person to feel as if they were present, to give the appearance or effect of being present via telerobotics, at a place other than their true location.
Remote surgery is the ability for a doctor to perform surgery on a patient even though they are not physically in the same location. It is a form of telepresence. A robot surgical system generally consists of one or more arms, a master controller (console), and a sensory system giving feedback to the user. Remote surgery combines elements of robotics, telecommunications such as high-speed data connections and elements of management information systems. While the field of robotic surgery is fairly well established, most of these robots are controlled by surgeons at the location of the surgery. Remote surgery is remote work for surgeons, where the physical distance between the surgeon and the patient is less relevant. It promises to allow the expertise of specialized surgeons to be available to patients worldwide, without the need for patients to travel beyond their local hospital.
A remote-control vehicle is any vehicle that is teleoperated from a point external to the vehicle by a means that does not restrict its motion. This is often a radio-control link, a cable between the controller and the vehicle, or an infrared controller.
A robonaut is a humanoid robot, part of a development project conducted by the Dexterous Robotics Laboratory at NASA's Lyndon B. Johnson Space Center (JSC) in Houston, Texas. Robonaut differs from other current space-faring robots in that, while most current space robotic systems are designed to move large objects, Robonaut's tasks require more dexterity.
Teleoperation indicates operation of a system or machine at a distance. It is similar in meaning to the phrase "remote control" but is usually encountered in research, academia and technology. It is most commonly associated with robotics and mobile robots but can be applied to a whole range of circumstances in which a device or machine is operated by a person from a distance.
A remote manipulator, also known as a telefactor, telemanipulator, or waldo, is a device which, through electronic, hydraulic, or mechanical linkages, allows a hand-like mechanism to be controlled by a human operator. The purpose of such a device is usually to move or manipulate hazardous materials for reasons of safety, similar to the operation and play of a claw crane game.
Eric Paulos is an American computer scientist, artist, and inventor, best known for his early work on internet robotic teleoperation and is considered a founder of the field of Urban Computing, coining the term "urban computing" in 2004. His current work is in the areas of emancipation fabrication, cosmetic computing, citizen science, New Making Renaissance, Critical Making, Robotics, DIY Biology, DIY culture, Micro-volunteering, and the cultural critique of such technologies through New Media strategies.
A virtual fixture is an overlay of augmented sensory information upon a user's perception of a real environment in order to improve human performance in both direct and remotely manipulated tasks. Developed in the early 1990s by Louis Rosenberg at the U.S. Air Force Research Laboratory (AFRL), Virtual Fixtures was a pioneering platform in virtual reality and augmented reality technologies.
The idea of sending humans to Mars has been the subject of aerospace engineering and scientific studies since the late 1940s as part of the broader exploration of Mars. Long-term proposals have included sending settlers and terraforming the planet. Currently, only robotic landers and rovers have been on Mars. The farthest humans have been beyond Earth is the Moon, under NASA's Apollo program.
Interplanetary contamination refers to biological contamination of a planetary body by a space probe or spacecraft, either deliberate or unintentional.
K10 are rovers used to explore planetary surfaces. Each third-generation K10 has four-wheel drive, all-wheel steering and a passive averaging suspension, which helps reduce the motion induced by travel over uneven ground. The K10 has mounting points on its front, back, and bottom that allow antennas, sensors, and other scientific instruments to be attached. The K10 controller runs on a Linux laptop and communicates via 802.11g wireless or a Tropos wireless mesh.
Adaptive collaborative control is the decision-making approach used in hybrid models consisting of finite-state machines with functional models as subcomponents to simulate the behavior of systems formed through the partnership of multiple agents for the execution of tasks and the development of work products. The term "collaborative control" originated from work developed in the late 1990s and early 2000s by Fong, Thorpe, and Baur (1999). According to Fong et al., for robots to function in collaborative control they must be self-reliant, aware, and adaptive. The adjective "adaptive" is not always shown in the literature, but it is an important element of collaborative control. The adaptation of traditional control theory to teleoperation sought initially to move away from the hierarchy of "humans as controllers/robots as tools" and instead have humans and robots work as peers, collaborating to perform tasks and achieve common goals. Early implementations of adaptive collaborative control centered on vehicle teleoperation. Recent uses cover training, analysis, and engineering applications in teleoperation between humans and multiple robots, multiple robots collaborating among themselves, unmanned vehicle control, and fault-tolerant controller design.
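A minimal sketch of the collaborative-control idea, in which the robot treats the human as a peer it can query rather than a controller it must obey, might look like the following; the states, query, and confidence threshold are invented for illustration and are far simpler than Fong et al.'s actual system.

```python
# Illustrative finite-state machine for collaborative control (invented states
# and threshold): the robot runs autonomously, queries the human when its
# perception is too uncertain, then applies the advice and resumes.

AUTONOMOUS, QUERY_HUMAN, EXECUTE_ADVICE = "autonomous", "query_human", "execute_advice"

def step(state, obstacle_confidence, human_reply=None, threshold=0.7):
    """One transition of a simple shared-control state machine."""
    if state == AUTONOMOUS:
        # Proceed alone unless perception confidence drops below threshold.
        return QUERY_HUMAN if obstacle_confidence < threshold else AUTONOMOUS
    if state == QUERY_HUMAN:
        # The robot has asked, e.g., "can I drive over this?"; wait for an answer.
        return EXECUTE_ADVICE if human_reply is not None else QUERY_HUMAN
    # EXECUTE_ADVICE: apply the advice, then resume autonomous operation.
    return AUTONOMOUS

if __name__ == "__main__":
    state = AUTONOMOUS
    for confidence, reply in [(0.9, None), (0.4, None), (0.4, "yes"), (0.9, None)]:
        state = step(state, confidence, reply)
        print(state)
```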
MiroSurge is a prototype robotic system designed mainly for research in minimally invasive telesurgery. In the described configuration, the system follows the master–slave principle and enables the operator to remotely control minimally invasive surgical instruments with force/torque feedback. The system is being developed at the Institute of Robotics and Mechatronics within the German Aerospace Center (DLR).
The Telenoid R1 is a remote-controlled telepresence android created by Japanese roboticist Hiroshi Ishiguro. The R1 model, released in August 2010, is approximately 80 cm tall, weighs 5 kg and is made of silicone rubber. The Telenoid R1 is used primarily as an audio and movement transmitter through which people can relay messages over long distances, the aim being for the user to feel as though they are communicating with a far-away acquaintance. Cameras and microphones capture the voice and movements of an operator, which are projected through the Telenoid R1 to the user.
Marsokhod is a Russian Mars rover project that was intended for use in the Mars-96 mission. Instead it was used for experiments into improving rover technology.
PrOP-M were two Soviet Mars rovers that were launched on the unsuccessful Mars 2 and Mars 3 missions in 1971. PrOP-M were the first rovers to be launched to Mars, 26 years before the first successful rover mission of NASA's Sojourner in 1997. Because the Mars 2 and Mars 3 missions failed, the existence of the rovers was kept secret for nearly 20 years.