Air-Cobot

Country: France
Type: Cobot
Website: aircobot.akka.eu

Air-Cobot (Aircraft Inspection enhanced by smaRt & Collaborative rOBOT) is a French research and development project of a wheeled collaborative mobile robot able to inspect aircraft during maintenance operations. This multi-partner project involves research laboratories and industry. Research around this prototype was developed in three domains: autonomous navigation, human-robot collaboration and nondestructive testing.

Air-Cobot is presented as the first wheeled robot able to perform visual inspections of aircraft. Inspection robots using other types of sensors have been considered before, such as the European project Robair. Since the launch of the project, other solutions based on image processing have been developed, such as EasyJet's drone, the drone swarm from the Toulouse company Donecle and the Aircam project of the aerospace manufacturer Airbus.

Since the beginning of the project in 2013, the Air-Cobot robot has been dedicated to inspecting the lower parts of an aircraft. A planned continuation of the project would couple the robot with a drone to inspect an aircraft's upper parts. In October 2016, Airbus Group launched its research project on the hangar of the future in Singapore. The robots from the Air-Cobot and Aircam projects are included in it.

Project description

Objectives

Launched in January 2013, [1] the project is part of the Interministerial Fund program of Aerospace Valley, a business cluster in southwestern France. [2] With a budget of over one million euros, [3] Air-Cobot aims to develop an innovative collaborative mobile robot, autonomous in its movements and able to perform the inspection of an aircraft with nondestructive testing sensors during preflight or during maintenance operations in a hangar. [2] [4] Testing has been performed at the premises of Airbus and Air France Industries. [5]

Partners

Air-Cobot has been tested on Airbus A320s in the premises of Airbus and Air France Industries.

The project leader is Akka Technologies. There are two academic partners, while Akka Technologies and four other companies make up the five industrial partners. [6]

Academic partners
Industrial partners

Project finance

Project finance is provided by the Banque publique d'investissement (Bpifrance), the Aquitaine Regional Council, the Pyrénées-Atlantiques Departmental Council, the Midi-Pyrénées Regional Council and the European Union. [12]

Expected benefits

Aircraft are inspected during maintenance operations either outdoors at an airport between flights, or in a hangar for longer-duration inspections. These inspections are conducted mainly by human operators, visually and sometimes with tools to assess defects. [A 1] The project aims to improve aircraft inspections and their traceability. A database dedicated to each aircraft type, containing images and three-dimensional scans, will be updated after each maintenance operation. This makes it possible, for example, to track the propagation of a crack. [4] [13]

A human operator's eyes fatigue over time, while an automatic solution ensures reliable and repeatable inspections. Reducing inspection time is a major objective for aircraft manufacturers and airlines: faster maintenance operations increase aircraft availability and reduce maintenance costs. [4] [13]

Robot equipment

Air-Cobot in a hangar of Air France Industries.

All electronic equipment is carried by the 4MOB mobile platform manufactured by Sterela. The off-road platform, equipped with four-wheel drive, can move at 2 metres per second (7.2 km/h; 4.5 mph). [11] Its lithium-ion battery allows an operating time of eight hours. Obstacle-detection bumpers at the front and rear stop the platform if they are compressed. [11]

The cobot weighs 230 kilograms (507 lb). It has two computers, one running Linux for the autonomous navigation module and the other Windows for the non-destructive testing module. The robot is equipped with several sensors. The pan-tilt-zoom camera manufactured by Axis Communications and Eva 3D scanner manufactured by Artec 3D are dedicated to inspection. The sensors for navigation are an inertial measurement unit; two benches, each equipped with two PointGrey cameras; two Hokuyo laser range finders; and a GPS unit developed by M3 Systems that allows for geofencing tasks in outdoor environments. [3] [7]

Autonomous navigation

The autonomous navigation of the Air-Cobot robot proceeds in two phases. The first, navigation in the airport or the factory, brings the robot close to the aircraft. The second, navigation around the aircraft, positions the robot at control points referenced in the aircraft's virtual model. In addition, the robot must operate in a dynamic environment where humans and vehicles are moving; to address this, it has an obstacle avoidance module. Many navigation algorithms run constantly on the robot under real-time constraints, and research has been conducted on optimizing their computing time.

In an outdoor environment, the robot is able to go to the inspection site by localizing through Global Positioning System (GPS) data. The GPS device developed by M3 Systems allows geofencing. At the airport, the robot operates in dedicated navigation corridors respecting speed limits. Alerts are sent to the operator if the robot enters a prohibited area or exceeds a given speed. [10] [A 2]
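A geofencing check of this kind can be sketched as follows. This is an illustrative sketch, not the M3 Systems implementation: the corridor polygon, speed limit and alert strings are all hypothetical.

```python
# Illustrative geofencing: alert when the robot leaves its navigation
# corridor or exceeds a speed limit. Coordinates in metres, speed in m/s.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def geofence_alerts(position, speed, corridor, speed_limit=2.0):
    """Return the operator alerts triggered by one GPS fix."""
    alerts = []
    if not point_in_polygon(position[0], position[1], corridor):
        alerts.append("prohibited area")
    if speed > speed_limit:
        alerts.append("speed limit exceeded")
    return alerts

corridor = [(0, 0), (100, 0), (100, 10), (0, 10)]  # a 100 m x 10 m lane
print(geofence_alerts((50, 5), 1.5, corridor))   # []
print(geofence_alerts((50, 20), 2.5, corridor))  # ['prohibited area', 'speed limit exceeded']
```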

Another algorithm based on computer vision provides real-time lane marking detection. When visible, painted lanes on the ground provide data complementary to the positioning system for safer trajectories. [A 3] In an indoor environment, or an outdoor environment where GPS information is not available, the cobot can be switched to follower mode to move behind a human operator and follow him or her to the aircraft to be inspected. [14] [A 2]
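As a toy illustration of marking detection (not the project's actual vision algorithm), a single grayscale image row can be scanned for contiguous bright runs of paint; the intensity threshold and minimum stripe width here are assumptions.

```python
def lane_marking_centres(row, threshold=200, min_width=3):
    """Find the centre columns of bright runs (painted markings) in one image row."""
    centres, start = [], None
    for i, v in enumerate(row + [0]):  # trailing sentinel closes a final run
        if v >= threshold and start is None:
            start = i                  # a bright run begins
        elif v < threshold and start is not None:
            if i - start >= min_width: # ignore specks narrower than a marking
                centres.append((start + i - 1) / 2)
            start = None
    return centres

row = [0] * 10 + [255] * 5 + [0] * 10   # one bright stripe at columns 10-14
print(lane_marking_centres(row))        # [12.0]
```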

To perform the inspection, the robot has to navigate around the aircraft and reach the checkpoints referenced in the aircraft's virtual model. The position of the aircraft in the airport or factory is not known precisely, so the cobot needs to detect the aircraft to determine its own position and orientation relative to it. To do this, the robot is able to locate itself either with the data from its laser range finders [A 4] or with image data from its cameras. [A 1] [A 5]

Near the aircraft, a three-dimensional point cloud is acquired by changing the orientation of the laser scanning sensors mounted on pan-tilt units. After the data are filtered to remove the floor and insufficiently large point clusters, a registration technique against the aircraft model is used to estimate the static orientation of the robot. The robot then moves and maintains this orientation estimate using its wheel odometry, its inertial unit and visual odometry. [A 4]
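The registration step can be illustrated by the rigid alignment at the core of iterative-closest-point methods. This sketch assumes known point correspondences and 2D data, unlike the project's full 3D pipeline; it solves for the least-squares rotation and translation (the Kabsch algorithm).

```python
import numpy as np

def rigid_align_2d(P, Q):
    """Least-squares rotation R and translation t mapping points P onto Q."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: keep det(R) = +1
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Recover a known pose from transformed points
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([0.5, -1.0])
Q = P @ R_true.T + t_true
R, t = rigid_align_2d(P, Q)
```

A full ICP loop would alternate this solve with a nearest-neighbour search for correspondences until the alignment converges.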

Air-Cobot can estimate its position relative to an aircraft by using visual landmarks on the fuselage.

Laser data are also used horizontally in two dimensions. An algorithm provides a real-time position estimation of the robot when enough elements from the landing gears and engines are visible. A confidence index is calculated based on the number of items collected by lasers. If good data confidence is achieved, the position is updated. This mode is particularly used when the robot moves beneath the aircraft. [A 4]
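A confidence index of this sort might be sketched as the fraction of expected landmark positions (landing gears, engines) actually seen in the laser scan; the matching tolerance and acceptance threshold below are illustrative assumptions, not the project's values.

```python
def laser_confidence(detected, expected, tol=0.5):
    """Fraction of expected landmark positions matched by detected laser points."""
    matched = 0
    for ex, ey in expected:
        if any(abs(dx - ex) <= tol and abs(dy - ey) <= tol
               for dx, dy in detected):
            matched += 1
    return matched / len(expected)

def maybe_update_pose(pose, new_pose, confidence, threshold=0.7):
    """Accept the laser-based pose estimate only when confidence is high enough."""
    return new_pose if confidence >= threshold else pose

detected = [(1.0, 0.0), (3.0, 0.1)]                 # points seen by the lasers
expected = [(1.1, 0.1), (3.0, 0.0), (5.0, 5.0)]     # landmarks from the model
c = laser_confidence(detected, expected)            # 2 of 3 matched
pose = maybe_update_pose((0.0, 0.0), (1.0, 1.0), c) # below threshold: keep old pose
```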

For visual localization, the robot estimates its position relative to the aircraft using visual elements (doors, windows, tires, static ports, etc.) of the aircraft. As the robot moves, these visual elements are extracted from a three-dimensional virtual model of the aircraft and projected into the image plane of the cameras. The projected shapes are used for pattern recognition to detect those visual elements. [A 5] The other detection method is based on feature extraction with a Speeded Up Robust Features (SURF) approach: a pairing is performed between images of each element to be detected and the actual scene. [A 1]

By detecting and tracking visual landmarks, the robot can perform visual servoing in addition to estimating its position relative to the aircraft. [A 6] Research in vision is also conducted on simultaneous localization and mapping (SLAM). [A 7] [A 8] A fusion of information between the two acquisition methods, laser and vision, is being considered, as is an artificial intelligence that arbitrates between the various localization estimates. [A 4] [A 1]

Obstacle avoidance

In both navigation modes, Air-Cobot is also able to detect, track, identify and avoid obstacles in its way. Both the laser data from the range finders and the visual data from the cameras can be used for detection, tracking and identification of obstacles. Detection and tracking work better in the two-dimensional laser data, while identification is easier in the camera images; the two methods are complementary. For example, information from the laser data can be used to delimit regions of interest in the image. [A 6] [A 9] [A 10]

The robot has several possible responses to an obstacle, depending on its environment (navigation corridor, tarmac area with few obstacles, cluttered indoor environment, etc.) at the time of the encounter. It can stop and wait for a gap in traffic, avoid the obstacle using a spiral-based technique, or re-plan its trajectory. [A 6] [A 10]
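The choice of response might be sketched as a simple decision rule over the context; the environment labels, clearance threshold and response names are hypothetical, standing in for the project's actual planner.

```python
def obstacle_response(environment, obstacle_moving, clearance_m):
    """Pick a reaction to a detected obstacle given the context (toy policy)."""
    if environment == "corridor":
        return "stop_and_wait"      # no room to leave the navigation lane
    if obstacle_moving:
        return "stop_and_wait"      # let the traffic pass
    if clearance_m > 2.0:
        return "spiral_avoidance"   # enough free space for a local detour
    return "replan_path"            # cluttered scene: plan a new trajectory

print(obstacle_response("tarmac", False, 5.0))    # spiral_avoidance
print(obstacle_response("indoor", False, 0.5))    # replan_path
print(obstacle_response("corridor", False, 5.0))  # stop_and_wait
```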

Computing time optimization

Given the number of navigation algorithms running simultaneously to provide all the information in real time, research has been conducted on improving the computation time of some numerical methods using field-programmable gate arrays. [A 11] [A 12] [A 13] The research focused on visual perception. One part addressed simultaneous localization and mapping with an extended Kalman filter, which estimates the state of a dynamic system from a series of noisy or incomplete measurements. [A 11] [A 13] The other addressed localization and obstacle detection. [A 12]
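An extended Kalman filter alternates a predict step (propagating the state through the motion model) with an update step (correcting it with a measurement). The scalar sketch below shows that structure only; the FPGA implementations in the project operate on full state vectors and covariance matrices.

```python
def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One predict/update cycle of a (scalar) extended Kalman filter.
    f, h: motion and measurement models; F, H: their Jacobians (here constants);
    Q, R: process and measurement noise variances."""
    # Predict: propagate state and covariance through the motion model
    x_pred = f(x, u)
    P_pred = F * P * F + Q
    # Update: correct with the measurement via the Kalman gain
    y = z - h(x_pred)        # innovation
    S = H * P_pred * H + R   # innovation covariance
    K = P_pred * H / S       # Kalman gain
    x_new = x_pred + K * y
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

# A robot commanded to move 1 m measures its position as 1.2 m
x, P = ekf_step(0.0, 1.0, 1.0, 1.2,
                f=lambda x, u: x + u, F=1.0,
                h=lambda x: x, H=1.0, Q=0.01, R=1.0)
```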

Non-destructive testing

Air-Cobot can inspect the blades of a turbofan engine.

Image analysis

After positioning itself for a visual inspection, the robot performs an acquisition with the pan-tilt-zoom camera. Several steps take place: pointing the camera, detecting the element to be inspected, repointing and zooming if needed, image acquisition and inspection. Image analysis is used to determine whether doors are open or closed, whether protective covers are present on certain equipment, the condition of turbofan blades and the wear of landing gear tires. [A 14] [A 15] [A 16] [A 17]

Detection uses pattern recognition of regular shapes (rectangles, circles, ellipses). For more complex shapes, the 3D model of the element to be inspected can be projected into the image plane. The evaluation is based on indices such as the uniformity of segmented regions, the convexity of their shapes, or the periodicity of image pixel intensities. [A 14]
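A uniformity index of the kind mentioned can be sketched as one minus the coefficient of variation of a region's pixel intensities; the acceptance threshold and the toy latch_closed criterion are assumptions for illustration, not the project's actual indices.

```python
import statistics

def uniformity(pixels):
    """Uniformity index of a segmented region: 1.0 means perfectly uniform."""
    mean = statistics.fmean(pixels)
    if mean == 0:
        return 1.0  # an all-black region is trivially uniform
    return max(0.0, 1.0 - statistics.pstdev(pixels) / mean)

def latch_closed(pixels, threshold=0.8):
    """Toy criterion: a uniform region suggests a closed, intact surface."""
    return uniformity(pixels) >= threshold

print(uniformity([100] * 10))   # 1.0 (uniform grey patch)
print(uniformity([0, 200] * 5)) # 0.0 (strongly contrasted region)
```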

Feature extraction using speeded up robust features (SURF) can also inspect certain elements that have two possible states, such as pitot probes or static ports being covered or uncovered. A pairing is performed between images of the element to be inspected in its different states and the element present in the scene. For these simple items, analysis during navigation is possible and preferable because it saves time. [A 1] [A 18]

Point cloud analysis

After the robot positions itself for a scanning inspection, a pantograph raises the 3D scanner to the fuselage, and a pan-tilt unit moves the scanning device to acquire the hull. By comparing the acquired data to the three-dimensional model of the aircraft, algorithms are able to diagnose faults in the fuselage structure and provide information on their shape, size and depth. [15] [A 19] [A 20]
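The comparison against the reference model can be sketched as a nearest-neighbour deviation test: scan points farther from the model surface than a tolerance are flagged as defects. The tolerance value is illustrative, and a real system would register against a meshed CAD model rather than the point set used here.

```python
import numpy as np

def fuselage_defects(scan, model, tol=0.002):
    """Flag scan points deviating from the reference model by more than tol (m).
    Returns the indices of defective points and their estimated depths."""
    # Distance from each scanned point to its nearest model point
    d = np.linalg.norm(scan[:, None, :] - model[None, :, :], axis=2).min(axis=1)
    defect_idx = np.where(d > tol)[0]
    return defect_idx, d[defect_idx]

# A flat reference patch and a scan with one 5 mm dent
model = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
scan = np.array([[0.0, 0.0, 0.0005], [1.0, 0.0, 0.005]])
idx, depths = fuselage_defects(scan, model)  # flags only the second point
```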

By moving the pan-tilt units of the laser range finders, it is also possible to obtain a three-dimensional point cloud. Registration between the aircraft model and the scene point cloud is already used in navigation to estimate the static placement of the robot. It is planned to make targeted acquisitions, simpler in terms of movement, to verify the absence of chocks in front of the landing gear wheels, or the proper closing of engine cowling latches. [A 4]

Human-robot collaboration

As the project name suggests, the mobile robot is a cobot, a collaborative robot. During the navigation and inspection phases, a human operator accompanies the robot and can take control if necessary, add inspection tasks, note a defect that is not on the robot's checklist, or validate the results. In the case of pre-flight inspections, the diagnosis of the walk-around is sent to the pilot, who decides whether or not to take off. [7] [14] [A 21]

Other robotic inspection solutions

Drones can inspect the upper parts of an aircraft, such as the tail, and simplify maintenance checks.

European project Robair

The inspection robot of the European project Robair, funded from 2001 to 2003, is designed to mount on the wings and fuselage of an aircraft to inspect rows of rivets. To move, the robot uses a flexible network of pneumatic suction cups that adjust to the surface. It can inspect the lines of rivets with ultrasonic, eddy-current and thermographic techniques, detecting loose rivets and cracks. [16] [17] [18]

EasyJet drone

The airline EasyJet is interested in inspecting aircraft with drones and performed a first inspection in 2015. Equipped with laser sensors and a high-resolution camera, the drone performs an autonomous flight around the aeroplane, generates a three-dimensional image of the aircraft and transmits it to a technician. The operator can then navigate this representation and zoom in to display a high-resolution picture of parts of the aircraft, visually diagnosing the presence or absence of defects. This approach avoids the use of platforms to observe the upper parts of the aeroplane. [19]

Donecle drone

Donecle's autonomous drone inspecting an aircraft.

Founded in 2015, Donecle, a Toulouse start-up company, has also launched a drone approach, initially specialized in the detection of lightning strikes on aeroplanes. [20] [21] This inspection, usually carried out by five people equipped with harnesses and platforms, takes about eight hours; the immobilization of the aircraft and the staffing are costly for airlines, estimated at $10,000 per hour. The solution proposed by the start-up takes twenty minutes. [21]

Donecle uses a swarm of drones equipped with laser sensors and micro-cameras. The algorithms for automatic defect detection, trained on an existing image database with machine-learning software, are able to identify various elements: texture irregularities, pitot probes, rivets, openings, text, defects, corrosion and oil stains. A damage report is sent to the operator's touch pad, showing each area of interest and a proposed classification with a probability percentage. After reviewing the images, the verdict is pronounced by a qualified inspector. [21]

Project continuation

In 2015, in an interview with the French weekly magazine Air & Cosmos, Jean-Charles Marcos, chief executive officer (CEO) of Akka Research, explained that once developed and marketed, Air-Cobot should cost between 100,000 and 200,000 euros. It could meet civilian needs in nondestructive testing as well as military ones. [3] A possible continuation of the project could be the use of the robot on aircraft larger than the Airbus A320. The CEO also revealed that Akka Technologies plans to work on a duo of robots for inspection: the same mobile platform for the lower parts, and a drone for the upper parts. If funding is allocated, this second phase would take place during the period 2017–2020. [3]

At the Singapore Airshow in February 2016, Airbus Group presented Air-Cobot and its use in its vision of the hangar of the future. [22] The same month, the Singapore government enlisted Airbus Group to help local maintenance, repair and overhaul providers stay competitive against cheaper neighbouring countries such as Indonesia, Thailand and the Philippines. To improve productivity, Airbus Group launched a testbed hangar in October 2016 where new technologies can be tested. Upon entering the hangar, cameras study the aircraft to detect damage. Mobile robots, such as the one from the Air-Cobot project, and drones, such as the one from the Aircam project, carry out more detailed inspections. [23]

During the 14th International Conference on Remote Engineering and Virtual Instrumentation in March 2017, Akka Research Toulouse, one of Akka Technologies' research and development centers, presented its vision of the airport of the future. [A 2] Besides Air-Cobot, a previous step in this research axis is Co-Friend, an intelligent video surveillance system to monitor and improve airport operations. [A 2] [24] Future research will focus on the management of these operations, autonomous vehicles, non-destructive testing and human-machine interactions to increase efficiency and security at airports. [A 2] Since August 2017, the robot has visited Aeroscopia, an aeronautics museum in Blagnac, once a month. The project's researchers take advantage of the collection to test the robot and acquire data on other aircraft models such as the Airbus A400M, Airbus A300 and Sud-Aviation SE 210 Caravelle. [25]

Communications

Air-Cobot under the belly of an Airbus A320 in a hangar.

On 23 October 2014, a patent was filed by Airbus. [26] From 2014 to 2016, the robot was presented at five exhibitions, including the Paris Air Show 2015 [1] [27] [28] and the Singapore Airshow 2016. [22] [29] The research developed in the project was presented at eighteen conferences, and twenty-one scientific articles were published: seventeen conference papers and four journal articles. [30] Some of the publications centre on the navigation and/or inspection performed by Air-Cobot, while the rest focus on specific numerical methods or hardware solutions related to the issues of the project. At the 2016 international conference on Machine Control and Guidance (MCG), the prize for the best final application was awarded to the authors of the publication Human-robot collaboration to perform aircraft inspection in working environment. [31]

On 17 April 2015, Airbus Group distributed a project presentation video, made by the communication agency Clipatize, on its YouTube channel. [14] [32] On 25 September 2015, Toulouse Métropole broadcast a promotional video on its YouTube channel, presenting the metropolis as an attractive ecosystem able to build the future and highlighting its international visibility; the Air-Cobot demonstrator was chosen to illustrate the robotics research of the metropolis. [33] Located at the Laboratoire d'analyse et d'architecture des systèmes during development, the robot was regularly demonstrated by the project's researchers and engineers to visitors (external researchers, industrial partners, or students); it was also shown to the general public during the 2015 Feast of Science. [34] On 17 February 2016, Airbus Group broadcast a YouTube video presenting its vision of the hangar of the future, in which it plans to use Air-Cobot. [22]

Notes and references

Research publications of the project

Proceedings

Journal articles

PhD thesis reports

Other references

  1. (in French) Xavier Martinage (17 June 2015). "Air-Cobot : le robot dont dépendra votre sécurité". lci.tf1.fr. La Chaîne Info. Archived from the original on 3 January 2016. Retrieved 12 July 2016.
  2. (in French) "Air-Cobot : un nouveau mode d'inspection visuelle des avions". competitivite.gouv.fr. Les pôles de compétitivité. Archived from the original on 11 October 2016. Retrieved 12 July 2016.
  3. (in French) Olivier Constant (11 September 2015). "Le projet Air-Cobot suit son cours". Air et Cosmos (2487). Retrieved 12 July 2016.
  4. (in French) "Rapport d'activité 2013–2014 de l'Aerospace Valley" (PDF). aerospace-valley.com. Aerospace Valley. Archived from the original (PDF) on 24 September 2016. Retrieved 12 July 2016.
  5. (in French) "News du projet Air-Cobot". aircobot.akka.eu. Akka Technologies. Archived from the original on 10 July 2016. Retrieved 12 July 2016.
  6. (in French) "AKKA Technologies coordonne le projet Air-COBOT, un robot autonome d'inspection visuelle des avions". Capital. 1 July 2014. Archived from the original on 25 June 2016. Retrieved 14 July 2016.
  7. (in French) "Air-Cobot, le robot qui s'assure que vous ferez un bon vol !". Planète Robots (38): 32–33. March–April 2016.
  8. (in French) "Contrats RAP". Laboratoire d'analyse et d'architecture des systèmes. Archived from the original on 14 September 2015. Retrieved 17 July 2016.
  9. (in French) "Akka Technologies : une marque employeur orientée sur l'innovation". Le Parisien. 15 February 2016. Retrieved 17 July 2016.
  10. "M3 Systems Flagship Solution". M3 Systems. Archived from the original on 6 August 2016. Retrieved 17 July 2016.
  11. (in French) "4MOB, plateforme intelligente autonome" (PDF). Sterela Solutions. Archived from the original (PDF) on 9 August 2016. Retrieved 17 July 2016.
  12. (in French) "Financeurs". aircobot.akka.eu. Akka Technologies. Archived from the original on 4 August 2016. Retrieved 15 July 2016.
  13. (in French) Véronique Guillermard (18 May 2015). "Aircobot contrôle les avions avant le décollage". Le Figaro. Retrieved 14 July 2016.
  14. Air-Cobot on YouTube
  15. (in French) Pascal NGuyen (December 2014). "Des robots vérifient l'avion au sol". Sciences et Avenir (814). Archived from the original on 8 August 2016. Retrieved 17 July 2016.
  16. (in French) "Robair, Inspection robotisée des aéronefs". European Commission. Retrieved 16 July 2016.
  17. "Robair". London South Bank University. Retrieved 16 July 2016.
  18. Shang, Jianzhong; Sattar, Tariq; Chen, Shuwo; Bridge, Bryan (2007). "Design of a climbing robot for inspecting aircraft wings and fuselage" (PDF). Industrial Robot. 34 (6): 495–502. doi:10.1108/01439910710832093.
  19. (in French) Newsroom (8 June 2015). "Easy Jet commence à utiliser des drones pour l'inspection de ses avions". Humanoides. Archived from the original on 12 October 2015. Retrieved 16 July 2016.
  20. (in French) Florine Galéron (28 May 2015). "Aéronautique : la startup Donecle invente le drone anti-foudre". Objectif News, la Tribune. Retrieved 16 July 2016.
  21. (in French) Arnaud Devillard (20 April 2016). "Des drones pour inspecter des avions". Sciences et Avenir. Archived from the original on 8 August 2016. Retrieved 16 July 2016.
  22. Innovations in Singapore: the Hangar of the Future on YouTube
  23. "Pimp my Hangar: Excelling in MRO". airbusgroup.com. Airbus. Archived from the original on 21 December 2016. Retrieved 21 December 2016.
  24. (in French) Éric Parisot (21 June 2013). "Co-Friend, le système d'analyse d'images qui réduit les temps d'immobilisation des avions". Usine Digitale. Retrieved 24 February 2018.
  25. (in French) Aeroscopia, ed. (August 2017). "Le Musée accueille le projet AIR-COBOT". musee-aeroscopia.fr. Archived from the original on 14 October 2017. Retrieved 24 February 2018.
  26. "Espacenet – Bibliographic data – Collaborative robot for visually inspecting an aircraft". worldwide.espacenet.com. Retrieved 1 June 2016.
  27. (in French) Juliette Raynal; Jean-François Prevéraud (15 June 2015). "Bourget 2015 : les dix rendez-vous technos à ne pas louper". Industrie et Technologies. Retrieved 16 July 2016.
  28. (in French) "Akka Technologies au Salon du Bourget". Maurice Ricci. 21 June 2015. Archived from the original on 4 April 2016. Retrieved 16 July 2015.
  29. "Singapore Airshow 2016 Trends: Emerging Technologies Take Off – APEX | Airline Passenger Experience". apex.aero. Retrieved 1 June 2016.
  30. (in French) "Communications du projet Air-Cobot". aircobot.akka.eu. Akka Technologies. Archived from the original on 11 August 2016. Retrieved 14 July 2016.
  31. "Best MCG2016 Final Application Award" (PDF). mcg2016.irstea.fr. Machine Control and Guidance. October 2016. Retrieved 22 February 2020.
  32. "AirCobot – Introducing Smart Robots for Aircraft Inspections". clipatize.com. Clipatize. Archived from the original on 6 August 2016. Retrieved 15 August 2016.
  33. (in French) Toulouse métropole, construire le futur on YouTube
  34. (in French) Air-Cobot, le robot d'assistance aux inspections des aéronefs (PDF). Programme de la fête de la science. 2015. Retrieved 17 July 2016.
