The Johns Hopkins Beast was a mobile automaton, an early pre-robot, built in the 1960s at the Johns Hopkins University Applied Physics Laboratory. The machine had a rudimentary intelligence and the ability to survive on its own: as it wandered through the white halls of the laboratory, it would seek out black wall outlets, and when it found one it would plug itself in and recharge.
The robot was cybernetic and did not use a computer. Its control circuitry consisted of dozens of transistors controlling analog voltages, and it navigated using photocell optics and sonar. 2N404 transistors were wired as NOR logic gates implementing the Boolean logic that decided what to do when a specific sensor was activated, and as timing gates that determined how long to do it. 2N1040 power transistors controlled the power to the motion treads, the boom, and the charging mechanism.
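As a rough illustration of the idea (not a reconstruction of the Beast's actual circuit), the Python sketch below shows how NOR gates alone can implement Boolean sensor-to-action logic, since NOR is functionally complete. The sensor names and the example rule are hypothetical.

```python
# Minimal sketch: Boolean control logic built entirely from NOR gates,
# in the spirit of the Beast's 2N404 transistor logic. The sensor names
# and the example rule are hypothetical, not taken from the original design.

def nor(a: bool, b: bool) -> bool:
    """A two-input NOR gate: true only when both inputs are false."""
    return not (a or b)

# NOR is functionally complete, so NOT, OR and AND can be derived from it.
def not_(a: bool) -> bool:
    return nor(a, a)

def or_(a: bool, b: bool) -> bool:
    return not_(nor(a, b))

def and_(a: bool, b: bool) -> bool:
    return nor(not_(a), not_(b))

# Hypothetical rule: extend the charging prongs only when the wall-following
# switch reports a socket AND the drive motors are not running.
def extend_prongs(socket_switch: bool, motors_running: bool) -> bool:
    return and_(socket_switch, not_(motors_running))

if __name__ == "__main__":
    for socket in (False, True):
        for moving in (False, True):
            print(socket, moving, "->", extend_prongs(socket, moving))
```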
The original sensors in Mod I were limited to physical touch. The wall socket was detected by physical switches on the arm that followed the wall; once a socket was found, two electrical prongs were extended until they entered it and made the electrical connection to charge the vehicle. The stairway, doors, and pipes along the hall wall were also detected by physical switches and recognized by appropriate logic.
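The detect-then-charge behaviour can be pictured as a small state machine. The sketch below is an illustrative reconstruction under that assumption; the state and event names are invented, not the Beast's actual logic.

```python
# Illustrative state machine for the wall-following / charging sequence.
# States and transitions are hypothetical, inferred from the description above.

def step(state: str, event: str) -> str:
    """Advance the charging sequence by one event."""
    transitions = {
        ("FOLLOW_WALL", "socket_switch_closed"): "SOCKET_FOUND",
        ("SOCKET_FOUND", "prongs_seated"): "PRONGS_EXTENDED",
        ("PRONGS_EXTENDED", "contact_made"): "CHARGING",
        ("CHARGING", "battery_full"): "FOLLOW_WALL",
    }
    # Unknown events leave the state unchanged.
    return transitions.get((state, event), state)

if __name__ == "__main__":
    state = "FOLLOW_WALL"
    for ev in ["socket_switch_closed", "prongs_seated", "contact_made", "battery_full"]:
        state = step(state, ev)
        print(ev, "->", state)
```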
The sonar guidance system was developed for Mod I and improved for Mod II. It used two ultrasonic transducers to determine distance, location within the halls, and obstructions in its path, giving the Beast bat-like guidance. It could detect obstructions in the hallway, such as people; once an obstruction was detected, the Beast would slow down and then decide whether to stop or divert around it. It could also ultrasonically recognize the stairway and doorways and take appropriate action.
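A minimal sketch of the slow-then-decide behaviour described above, assuming a single sonar range reading in metres and invented thresholds; it is not the actual Mod II control law.

```python
# Illustrative slow/stop/divert decision from a sonar range reading.
# The thresholds and the divert rule are assumptions for illustration only.

SLOW_RANGE_M = 2.0    # begin slowing when an obstruction is closer than this
STOP_RANGE_M = 0.5    # stop (or divert) when an obstruction is closer than this

def decide(range_m: float, clear_to_side: bool) -> str:
    """Return a drive command given the sonar range and whether a side path is clear."""
    if range_m < STOP_RANGE_M:
        return "divert" if clear_to_side else "stop"
    if range_m < SLOW_RANGE_M:
        return "slow"
    return "cruise"

if __name__ == "__main__":
    for r in (3.0, 1.2, 0.3):
        print(r, decide(r, clear_to_side=True))
```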
An optical guidance system was added to Mod II. Among other capabilities, it allowed the Beast to optically identify the black wall sockets that contrasted with the white wall.
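As a rough illustration of picking out a dark socket against a light wall, the sketch below thresholds a one-dimensional row of brightness samples; the sample values and threshold are invented, and this is not the Mod II optical circuit.

```python
# Illustrative brightness thresholding: a dark wall socket stands out against
# a white wall as a run of low photocell readings. Data and threshold are invented.

def find_dark_runs(samples, threshold=0.3):
    """Return (start, end) index pairs of contiguous readings below the threshold."""
    runs, start = [], None
    for i, s in enumerate(samples):
        if s < threshold and start is None:
            start = i
        elif s >= threshold and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(samples) - 1))
    return runs

if __name__ == "__main__":
    wall_scan = [0.9, 0.9, 0.85, 0.2, 0.15, 0.2, 0.9, 0.95]  # dip = candidate socket
    print(find_dark_runs(wall_scan))
```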
The Hopkins Beast Autonomous Robot Mod II page linked below was written by Dr. Ronald McConnell, at that time a co-op student and one of the designers of Mod II.
An autonomous robot is a robot that acts without recourse to human control. Historic examples include space probes; modern examples include robotic vacuum cleaners and self-driving cars.
An autofocus (AF) optical system uses a sensor, a control system and a motor to focus on an automatically or manually selected point or area. An electronic rangefinder has a display instead of the motor; the optical system must then be adjusted manually until the display indicates correct focus. Autofocus methods are distinguished as active, passive or hybrid types.
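A minimal sketch of one passive method, contrast-detection autofocus: the lens is stepped through focus positions and the position with the highest image contrast is kept. The contrast function below is a synthetic stand-in for a real sensor measurement.

```python
# Minimal sketch of contrast-detection (passive) autofocus: step the lens
# through focus positions and keep the one with the highest contrast score.
# The contrast curve is synthetic, standing in for a real sensor reading.

def contrast_at(position: float, best_focus: float = 4.2) -> float:
    """Synthetic contrast metric: peaks at the (unknown to the loop) best focus."""
    return 1.0 / (1.0 + (position - best_focus) ** 2)

def autofocus(lo: float = 0.0, hi: float = 10.0, step: float = 0.1) -> float:
    best_pos, best_score = lo, float("-inf")
    pos = lo
    while pos <= hi:
        score = contrast_at(pos)
        if score > best_score:
            best_pos, best_score = pos, score
        pos += step
    return best_pos

if __name__ == "__main__":
    print(f"focus motor target: {autofocus():.2f}")
```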
Laser guidance directs a robotic system to a target position by means of a laser beam. It is accomplished by projecting laser light, image processing, and communication to improve the accuracy of guidance. The key idea is to show goal positions to the robot by projecting laser light rather than communicating them numerically. This intuitive interface simplifies directing the robot, while the visual feedback improves positioning accuracy and allows for implicit localization. The guidance system may also serve as a mediator for cooperative multiple robots. Proof-of-concept experiments have demonstrated directing a robot with a laser pointer. Laser guidance spans robotics, computer vision, user interfaces, video games, communication, and smart home technologies.
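As a hedged illustration of the image-processing step, the sketch below finds the most red-dominant pixel in a frame and treats it as the projected goal position. The image format (rows of RGB tuples) and the scoring rule are assumptions, not a description of any particular system.

```python
# Illustrative goal extraction for laser guidance: find the pixel where the
# red channel most exceeds the other channels and treat it as the projected
# goal position. The image format (rows of RGB tuples) is an assumption.

def find_laser_dot(image):
    """Return (row, col) of the most laser-like (red-dominant) pixel."""
    best, best_score = None, float("-inf")
    for r, row in enumerate(image):
        for c, (red, green, blue) in enumerate(row):
            score = red - max(green, blue)   # red dominance
            if score > best_score:
                best, best_score = (r, c), score
    return best

if __name__ == "__main__":
    img = [
        [(10, 10, 10), (12, 11, 10), (11, 10, 12)],
        [(10, 12, 11), (250, 40, 40), (12, 12, 10)],  # bright red dot
        [(11, 10, 10), (10, 11, 10), (10, 10, 11)],
    ]
    print("goal pixel:", find_laser_dot(img))
```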
An autonomous underwater vehicle (AUV) is a robot that travels underwater without requiring continuous input from an operator. AUVs constitute part of a larger group of undersea systems known as unmanned underwater vehicles, a classification that includes non-autonomous remotely operated underwater vehicles (ROVs) – controlled and powered from the surface by an operator/pilot via an umbilical or using remote control. In military applications an AUV is more often referred to as an unmanned undersea vehicle (UUV). Underwater gliders are a subclass of AUVs. Homing torpedoes can also be considered as a subclass of AUVs.
The Johns Hopkins University Applied Physics Laboratory is a not-for-profit university-affiliated research center (UARC) in Howard County, Maryland. It is affiliated with Johns Hopkins University and employs 8,700 people as of 2024. APL is the nation's largest UARC.
The Electrolux Trilobite is a robotic vacuum cleaner manufactured by the Swedish corporation Electrolux. It takes its name from the extinct arthropod, which scoured the ocean floor.
A motion detector is an electrical device that uses a sensor to detect nearby motion. Such a device is often integrated as a component of a system that automatically performs a task or alerts a user of motion in an area. Motion detectors form a vital component of security, automated lighting control, home control, energy efficiency, and other systems. Motion detection can be achieved by either mechanical or electronic methods; when it is done by natural organisms, it is called motion perception.
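One common electronic method compares successive camera frames and reports motion when enough pixels change. The sketch below assumes grayscale frames as lists of lists and invented thresholds; it is an illustration of the idea, not any particular detector.

```python
# Illustrative electronic motion detection by frame differencing: motion is
# reported when enough pixels change between two grayscale frames.
# Frame format and thresholds are assumptions for illustration.

def motion_detected(prev, curr, pixel_delta=25, changed_fraction=0.05) -> bool:
    total = changed = 0
    for row_prev, row_curr in zip(prev, curr):
        for a, b in zip(row_prev, row_curr):
            total += 1
            if abs(a - b) > pixel_delta:
                changed += 1
    return total > 0 and changed / total > changed_fraction

if __name__ == "__main__":
    frame1 = [[10] * 8 for _ in range(8)]
    frame2 = [row[:] for row in frame1]
    for r in range(2, 5):          # simulate something moving through the scene
        for c in range(2, 5):
            frame2[r][c] = 200
    print(motion_detected(frame1, frame2))
```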
A mobile robot is an automatic machine that is capable of locomotion. Mobile robotics is usually considered to be a subfield of robotics and information engineering.
The Mark 37 torpedo is a torpedo with electrical propulsion, developed for the US Navy after World War II. It entered service with the US Navy in the early 1950s, with over 3,300 produced. It was phased out of service with the US Navy during the 1970s, and the stockpiles were sold to foreign navies.
Lego Mindstorms NXT is a programmable robotics kit released by Lego on August 2, 2006. It replaced the Robotics Invention System, the first-generation Lego Mindstorms kit. The base kit ships in two versions: the retail version and the education base set. It comes with the NXT-G programming software or the optional LabVIEW for Lego Mindstorms. A variety of unofficial languages exist, such as NXC, NBC, leJOS NXJ, and RobotC. A second-generation set, Lego Mindstorms NXT 2.0, was released on August 1, 2009, with a color sensor and other upgrades. The third-generation EV3 was released in September 2013.
Robotics is the branch of technology that deals with the design, construction, operation, structural disposition, manufacture and application of robots. Robotics is related to the sciences of electronics, engineering, mechanics, and software. The word "robot" was introduced to the public by Czech writer Karel Čapek in his play R.U.R., published in 1920. The term "robotics" was coined by Isaac Asimov in his 1941 science fiction short story "Liar!"
Obstacle avoidance, in robotics, is a critical aspect of autonomous navigation and control systems. It is the capability of a robot or an autonomous system/machine to detect and circumvent obstacles in its path to reach a predefined destination. This technology plays a pivotal role in various fields, including industrial automation, self-driving cars, drones, and even space exploration. Obstacle avoidance enables robots to operate safely and efficiently in dynamic and complex environments, reducing the risk of collisions and damage.
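A hedged sketch of a simple reactive avoidance rule: steer away from whichever side reports the nearer obstacle, and stop if both sides are blocked. The sensor layout and thresholds are assumptions for illustration, not a standard algorithm specification.

```python
# Illustrative reactive obstacle avoidance: steer away from whichever side
# reports the nearer obstacle, and stop if both sides are too close.
# Sensor layout and thresholds are assumptions for illustration.

def avoid(left_m: float, right_m: float, stop_m: float = 0.3, react_m: float = 1.0) -> str:
    if left_m < stop_m and right_m < stop_m:
        return "stop"
    if left_m < react_m or right_m < react_m:
        return "turn_right" if left_m < right_m else "turn_left"
    return "forward"

if __name__ == "__main__":
    for left, right in [(2.0, 2.0), (0.8, 1.5), (1.5, 0.6), (0.2, 0.25)]:
        print(left, right, "->", avoid(left, right))
```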
Curb feelers or curb finders are springs or wires installed on a vehicle that act as "whiskers" to alert drivers when they are at the right distance from the curb while parking.
A proximity sensor is a sensor able to detect the presence of nearby objects without any physical contact.
Ultrasonic transducers and ultrasonic sensors are devices that generate or sense ultrasound energy. They can be divided into three broad categories: transmitters, receivers and transceivers. Transmitters convert electrical signals into ultrasound, receivers convert ultrasound into electrical signals, and transceivers can both transmit and receive ultrasound.
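A transceiver can measure distance by time of flight: it emits a pulse, listens for the echo, and the range is half the round trip at the speed of sound (about 343 m/s in air at 20 °C). The sketch below shows just that arithmetic.

```python
# Time-of-flight ranging with an ultrasonic transceiver: distance is half the
# echo's round-trip time multiplied by the speed of sound in air.

SPEED_OF_SOUND_M_S = 343.0   # approximate speed of sound in air at 20 degrees C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object from the echo's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

if __name__ == "__main__":
    # A 5.8 ms round trip corresponds to roughly one metre.
    print(f"{echo_distance_m(0.0058):.2f} m")
```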
Robot localization denotes the robot's ability to establish its own position and orientation within a frame of reference. Path planning is effectively an extension of localization, in that it requires determining both the robot's current position and the position of a goal location within the same frame of reference or coordinate system. Map building can take the form of a metric map or any notation describing locations in the robot's frame of reference.
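Once the robot's position and the goal are localized in a shared grid map, path planning can be illustrated with a breadth-first search over free cells. The map, obstacles, and cell convention below are invented for the sketch.

```python
# Illustrative path planning on a metric grid map: breadth-first search from
# the robot's localized cell to a goal cell. The map is invented; 1 = occupied.

from collections import deque

def plan(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

if __name__ == "__main__":
    grid = [
        [0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
    ]
    print(plan(grid, (0, 0), (2, 0)))
```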
Allen was a robot introduced by Rodney Brooks and his team in the late 1980s, and was their first robot based on subsumption architecture. It had sonar distance sensors and odometry on board, and used an offboard Lisp machine to simulate the subsumption architecture. It resembled a footstool on wheels.
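A minimal sketch of the subsumption idea: behaviours are arranged in layers, and a higher-priority layer subsumes (overrides) the output of lower ones. The layers and sensor reading below are illustrative, not Allen's actual design.

```python
# Minimal sketch of subsumption architecture: each behaviour layer may emit a
# command, and the highest-priority layer that does so wins. The layers and
# the single sonar reading are illustrative, not Allen's actual layers.

import random

def avoid_layer(sonar_m: float):
    """Highest priority: back away from nearby obstacles."""
    return "reverse_and_turn" if sonar_m < 0.5 else None

def wander_layer(sonar_m: float):
    """Lower priority: pick a heading at random."""
    return random.choice(["forward", "veer_left", "veer_right"])

LAYERS = [avoid_layer, wander_layer]   # ordered from highest to lowest priority

def arbitrate(sonar_m: float) -> str:
    for layer in LAYERS:
        command = layer(sonar_m)
        if command is not None:        # first non-None output wins
            return command
    return "idle"

if __name__ == "__main__":
    for reading in (2.0, 0.3):
        print(reading, "->", arbitrate(reading))
```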
In computing, an input device is a piece of equipment used to provide data and control signals to an information processing system, such as a computer or information appliance. Examples of input devices include keyboards, computer mice, scanners, cameras, joysticks, and microphones.
The following outline is provided as an overview of and topical guide to robotics:
Flakey the robot was a research robot created at SRI International's Artificial Intelligence Center and was the successor to Shakey the robot. It is featured in a Scientific American Frontiers episode.