Open-source robotics is a branch of robotics where robots are developed with open-source hardware and free and open-source software, publicly sharing blueprints, schematics, and source code. It is thus closely related to the open design movement, the maker movement [1] and open science.
Open source robotics means that information about the hardware is easily discerned, so that others can easily rebuild it. In turn, this requires the design to use only easily available standard subcomponents and tools, and the build process to be documented in detail, including a bill of materials and detailed ('Ikea style') step-by-step building and testing instructions. (A CAD file alone is not sufficient, as it does not show the steps for performing or testing the build.) These requirements are standard for open source hardware in general, and are formalised by various licences and certifications, especially those defined by the peer-reviewed journals HardwareX and Journal of Open Hardware.
Licensing requirements for software are the same as for any open source software. But in addition, for software components to be of practical use in real robot systems, they need to be compatible with other software, usually as defined by some robotics middleware community standard.
Applications to date include:
Most open source hardware definitions allow non-open subcomponents to be used in modular designs, as long as they are easily available. However, many designs try to push openness down into as many subcomponents as possible, with the aim of ultimately reaching fully open designs.
Open hardware manual-drive vehicles and their subcomponents, such as from Open Source Ecology, are often used as starting points and extended with automation systems.
Open subcomponents can include open-source computing hardware such as Arduino and RISC-V, as well as open source motors and drivers such as the Open Source Motor Controller and ODrive.
Open source robots are often used together with, and are therefore designed to interface with, the open source robotics middleware Robot Operating System (ROS) and various open source simulators such as Gazebo, typically running on the open source Linux operating system.
Robotics middleware is software which links multiple other software components together. In robotics, this specifically means real-time communication systems with standardized message passing protocols. The predominant open source middleware is ROS, the Robot Operating System. Other alternatives include YARP (used in the iCub), URBI, and Orca.
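As a concrete illustration, the following is a minimal sketch of a ROS 2 node that publishes messages over the middleware using the rclpy client library. The topic name, message type and publishing rate are chosen only for illustration.

    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class RangePublisher(Node):
        """Toy node that periodically publishes a text message on a topic."""
        def __init__(self):
            super().__init__('range_publisher')
            # 'range_readings' is an illustrative topic name; 10 is the queue depth.
            self.publisher = self.create_publisher(String, 'range_readings', 10)
            self.timer = self.create_timer(0.5, self.publish_reading)

        def publish_reading(self):
            msg = String()
            msg.data = 'range: 1.23 m'
            self.publisher.publish(msg)

    def main():
        rclpy.init()
        node = RangePublisher()
        rclpy.spin(node)
        node.destroy_node()
        rclpy.shutdown()

    if __name__ == '__main__':
        main()

Any other node that subscribes to the same topic receives these messages through the middleware, without either node needing to know how the other is implemented.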
Most robot sensors and actuators require software drivers. There is little standardization of open source software at this level, because each hardware device is different. Creating open drivers for closed hardware is difficult as it requires both low level programming and reverse engineering.
Open source robotics simulators include Gazebo, MuJoCo and Webots. Open source 3D game engines such as Godot are also sometimes used as simulators, when equipped with suitable middleware interfaces.[citation needed]
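To give a flavour of how such a simulator is driven, the sketch below is a minimal robot controller written against Webots' Python controller API; the motor device names are illustrative and depend on the robot model loaded in the simulation.

    from controller import Robot  # Webots controller API

    robot = Robot()
    timestep = int(robot.getBasicTimeStep())

    # Device names depend on the robot model; these are illustrative.
    left_motor = robot.getDevice('left wheel motor')
    right_motor = robot.getDevice('right wheel motor')
    left_motor.setPosition(float('inf'))   # switch the motors to velocity control
    right_motor.setPosition(float('inf'))

    # Drive forward at a constant speed for as long as the simulation runs.
    while robot.step(timestep) != -1:
        left_motor.setVelocity(2.0)
        right_motor.setVelocity(2.0)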
At the level of AI, many standard algorithms have open source software implementations, mostly distributed as ROS packages.
The first signs of the increasing popularity of building robots yourself appeared in the DIY community. What began with small competitions for remotely operated vehicles (e.g. robot combat) soon developed into the building of autonomous telepresence robots such as Sparky, and then of true robots (able to take decisions themselves) such as the Open Automaton Project. Several commercial companies now also produce kits for making simple robots.
The community has adopted open source hardware licences, certifications, and peer-reviewed publications, which verify that source material has been made correctly and permanently available under community definitions. These processes have become critically important because many historical projects claimed to be open source but then reverted on the promise due to commercialisation or other pressures.
As with other forms of open source hardware, the community continues to debate precise criteria for 'ease of build'. A common standard is that designs should be buildable by a technical university student, in a few days, using typical fablab tools, though the definitions of these subterms are themselves debated.
Compared to other forms of open source hardware, open source robotics typically includes a large software element, so it involves software as well as hardware engineers. Open source concepts are more established in software than in hardware, so robotics is a field in which those concepts can be shared and transferred from software to hardware.
While the open source robotics community is multi-faceted, with members from a wide range of backgrounds, a sizable sub-community uses the ROS middleware and meets annually at the ROSCon [21] conference to discuss the development of ROS itself and of automation components built on it.
Open-source hardware consists of physical artifacts of technology designed and offered by the open-design movement. Both free and open-source software (FOSS) and open-source hardware are created by this open-source culture movement, which applies a like concept to a variety of components; open-source hardware is thus sometimes referred to as FOSH (free and open-source hardware). The term usually means that information about the hardware is easily discerned so that others can make it, coupling it closely to the maker movement. The hardware design, in addition to the software that drives the hardware, is released under free/libre terms. The original sharer gains feedback and potentially improvements on the design from the FOSH community. There is now significant evidence that such sharing can drive a high return on investment for the scientific community.
Quantum programming is the process of designing or assembling sequences of instructions, called quantum circuits, using gates, switches, and operators to manipulate a quantum system for a desired outcome or results of a given experiment. Quantum circuit algorithms can be implemented on integrated circuits, conducted with instrumentation, or written in a programming language for use with a quantum computer or a quantum processor.
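For illustration, a small quantum circuit can be assembled gate by gate with an open source toolkit such as Qiskit (assuming it is installed); the sketch below prepares and measures a two-qubit entangled (Bell) state.

    from qiskit import QuantumCircuit

    # Build a two-qubit circuit with two classical bits for the measurement results.
    circuit = QuantumCircuit(2, 2)
    circuit.h(0)          # Hadamard gate puts qubit 0 into superposition
    circuit.cx(0, 1)      # CNOT gate entangles qubit 1 with qubit 0
    circuit.measure([0, 1], [0, 1])

    print(circuit.draw())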
Physical computing involves interactive systems that can sense and respond to the world around them. While this definition is broad enough to encompass systems such as smart automotive traffic control systems or factory automation processes, it is not commonly used to describe them. In a broader sense, physical computing is a creative framework for understanding human beings' relationship to the digital world. In practical use, the term most often describes handmade art, design or DIY hobby projects that use sensors and microcontrollers to translate analog input to a software system, and/or control electro-mechanical devices such as motors, servos, lighting or other hardware.
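A typical pattern, sketched below for a MicroPython board such as a Raspberry Pi Pico (the pin numbers and attached devices are illustrative assumptions), is to read an analog sensor and use the value to drive an actuator.

    from machine import ADC, PWM, Pin
    import time

    # Analog sensor (e.g. a potentiometer or light sensor) on ADC pin 26.
    sensor = ADC(Pin(26))
    # PWM output on pin 15, e.g. driving an LED or a motor driver input.
    actuator = PWM(Pin(15))
    actuator.freq(1000)

    while True:
        reading = sensor.read_u16()      # raw value in the range 0..65535
        actuator.duty_u16(reading)       # map the input directly to the output duty cycle
        time.sleep_ms(20)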
Laboratory automation is a multi-disciplinary strategy to research, develop, optimize and capitalize on technologies in the laboratory that enable new and improved processes. Laboratory automation professionals are academic, commercial and government researchers, scientists and engineers who conduct research and develop new technologies to increase productivity, elevate experimental data quality, reduce lab process cycle times, or enable experimentation that otherwise would be impossible.
Modular self-reconfiguring robotic systems or self-reconfigurable modular robots are autonomous kinematic machines with variable morphology. Beyond conventional actuation, sensing and control typically found in fixed-morphology robots, self-reconfiguring robots are also able to deliberately change their own shape by rearranging the connectivity of their parts, in order to adapt to new circumstances, perform new tasks, or recover from damage.
The Player Project creates free and open-source software for research into robotics and sensor systems. Its components include the Player network server and the Stage robot platform simulator. Although accurate statistics are hard to obtain, Player is one of the most popular open-source robot interfaces in research and post-secondary education.
Adaptable robotics refers to a field of robotics focused on creating robotic systems capable of adjusting their hardware and software components to perform a wide range of tasks while adapting to varying environments. The 1960s introduced robotics into the industrial field. Since then, the need for robots with new forms of actuation, adaptability, sensing and perception, and even the ability to learn has driven the field of adaptable robotics. Significant developments such as the PUMA robot, manipulation research, soft robotics, swarm robotics, AI, cobots, bio-inspired approaches, and other ongoing research have advanced the field tremendously. Adaptable robots are usually associated with their development kit, typically used to create autonomous mobile robots. In some cases, an adaptable kit will still be functional even when certain components break.
Webots is a free and open-source 3D robot simulator used in industry, education and research.
Robot Operating System is an open-source robotics middleware suite. Although ROS is not an operating system (OS) but a set of software frameworks for robot software development, it provides services designed for a heterogeneous computer cluster such as hardware abstraction, low-level device control, implementation of commonly used functionality, message-passing between processes, and package management. Running sets of ROS-based processes are represented in a graph architecture where processing takes place in nodes that may receive, post, and multiplex sensor data, control, state, planning, actuator, and other messages. Despite the importance of reactivity and low latency in robot control, ROS is not a real-time operating system (RTOS). However, it is possible to integrate ROS with real-time computing code. The lack of support for real-time systems has been addressed in the creation of ROS 2, a major revision of the ROS API which will take advantage of modern libraries and technologies for core ROS functions and add support for real-time code and embedded system hardware.
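Complementing the publisher sketched earlier, the following rclpy node subscribes to the same illustrative topic; together the two processes form a tiny ROS graph in which messages flow from one node to the other.

    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class RangeListener(Node):
        """Toy node that logs every message received on the 'range_readings' topic."""
        def __init__(self):
            super().__init__('range_listener')
            self.subscription = self.create_subscription(
                String, 'range_readings', self.on_message, 10)

        def on_message(self, msg):
            self.get_logger().info('received: %s' % msg.data)

    def main():
        rclpy.init()
        node = RangeListener()
        rclpy.spin(node)
        node.destroy_node()
        rclpy.shutdown()

    if __name__ == '__main__':
        main()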
Robotics middleware is middleware to be used in complex robot control software systems.
Willow Garage was a robotics research lab and technology incubator devoted to developing hardware and open source software for personal robotics applications. The company was best known for its open source software suite Robot Operating System (ROS), which rapidly became a common, standard tool among robotics researchers upon its initial release in 2010. It was founded in late 2006 by Scott Hassan, who had worked with Larry Page and Sergey Brin to develop the technology that became the Google Search engine. Steve Cousins was the president and CEO. Willow Garage was located in Menlo Park, California.
Tendon-driven robots (TDR) are robots whose limbs mimic biological musculoskeletal systems. They use plastic straps to mimic muscles and tendons. Such robots are claimed to move in a "more natural" way than traditional robots that use rigid metal or plastic limbs controlled by geared actuators. TDRs can also help understand how biomechanics relates to embodied intelligence and cognition.
OMPL is a software package for computing motion plans using sampling-based algorithms. The content of the library is limited to motion planning algorithms, which means there is no environment specification, no collision detection or visualization. This is intentional as the library is designed to be easily integrated into systems that already provide the additional needed components. For example, OMPL is integrated with ROS and MoveIt!. In 2012 OMPL won the Grand Prize at the Open Source Software World Challenge.
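As an illustration of OMPL's scope, the sketch below (assuming OMPL was built with its Python bindings) plans in a bare 2D state space: the user supplies the state validity check, here a made-up circular obstacle, because OMPL itself performs no collision detection.

    from ompl import base as ob
    from ompl import geometric as og

    def is_state_valid(state):
        # Illustrative obstacle: reject states inside a disc of radius 1 at the origin.
        return state[0] ** 2 + state[1] ** 2 > 1.0

    space = ob.RealVectorStateSpace(2)
    space.setBounds(-5.0, 5.0)

    ss = og.SimpleSetup(space)
    ss.setStateValidityChecker(ob.StateValidityCheckerFn(is_state_valid))

    start = ob.State(space)
    start[0], start[1] = -4.0, -4.0
    goal = ob.State(space)
    goal[0], goal[1] = 4.0, 4.0
    ss.setStartAndGoalStates(start, goal)

    ss.setPlanner(og.RRTConnect(ss.getSpaceInformation()))
    if ss.solve(1.0):            # allow up to one second of planning time
        ss.simplifySolution()
        print(ss.getSolutionPath())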
Cloud robotics is a field of robotics that attempts to invoke cloud technologies such as cloud computing, cloud storage, and other Internet technologies centered on the benefits of converged infrastructure and shared services for robotics. When connected to the cloud, robots can benefit from the powerful computation, storage, and communication resources of modern data centers, which can process and share information from various robots or agents. Humans can also delegate tasks to robots remotely through networks. Cloud computing technologies enable robot systems to be endowed with powerful capabilities while reducing costs. Thus, it is possible to build lightweight, low-cost, smarter robots with an intelligent "brain" in the cloud. The "brain" consists of data centers, knowledge bases, task planners, deep learning, information processing, environment models, communication support, etc.
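A minimal sketch of this pattern is a robot posting a summary of its sensor data to a remote service and acting on the returned plan; the endpoint URL and message fields here are entirely hypothetical, and a real system would typically use a robotics-specific framework rather than raw HTTP.

    import requests  # third-party HTTP client

    # Hypothetical cloud service; in practice this would be a real planning or inference endpoint.
    CLOUD_ENDPOINT = 'https://example.com/api/plan'

    def request_plan(sensor_summary):
        """Offload planning to the cloud and return the suggested action."""
        response = requests.post(CLOUD_ENDPOINT, json={'sensors': sensor_summary}, timeout=5.0)
        response.raise_for_status()
        return response.json().get('action', 'stop')  # fall back to a safe default

    if __name__ == '__main__':
        action = request_plan({'front_range_m': 1.2, 'battery_pct': 87})
        print('cloud suggested action:', action)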
Fog robotics can be defined as an architecture that places storage, networking functions, and control closer to the robots by means of fog computing.
AirSim is an open-source, cross-platform simulator for drones, ground vehicles such as cars, and various other objects, built on Epic Games’ proprietary Unreal Engine 4 as a platform for AI research. It is developed by Microsoft and can be used to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. This allows testing of autonomous solutions without worrying about real-world damage.
Open Robotics is a nonprofit corporation headquartered in Mountain View, California. It is the primary maintainer of the Robot Operating System and the Gazebo simulator. Its stated mission is to support "the development, distribution and adoption of open source software for use in robotics research, education, and product development".
Vladlen Koltun is an Israeli-American computer scientist and intelligent systems researcher. He currently serves as distinguished scientist at Apple Inc. His main areas of research are artificial intelligence, computer vision, machine learning, and pattern recognition. He also made a significant contribution to robotics and autonomous driving.