A Defence Technology Centre (DTC) carries out military science research on behalf of the UK government. DTCs are consortia whose members are drawn from industry, universities, and QinetiQ, and they are organised around scientific themes. There are currently four DTCs: one for Human Factors Integration, one for Electromagnetic Remote Sensing, one for Data and Information Fusion, and one for Autonomous and Semi-Autonomous Vehicles.
Lidar is a method for determining ranges by targeting an object or a surface with a laser and measuring the time taken for the reflected light to return to the receiver. Lidar may operate in a fixed direction or it may scan multiple directions, in which case it is known as lidar scanning or 3D laser scanning, a special combination of 3D scanning and laser scanning. Lidar has terrestrial, airborne, and mobile applications.
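As a rough illustration of the time-of-flight principle described above, the following sketch converts a measured round-trip time into a range; the timing value is purely illustrative and not taken from any real sensor.

```python
# Minimal sketch of lidar time-of-flight ranging: the range is half the
# round-trip distance travelled by the laser pulse at the speed of light.
# The pulse timing below is an illustrative value, not real sensor data.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_range(round_trip_time_s: float) -> float:
    """Return the range in metres for a measured round-trip time."""
    return C * round_trip_time_s / 2.0

if __name__ == "__main__":
    # A pulse that returns after about 667 nanoseconds corresponds to ~100 m.
    print(f"{tof_range(667e-9):.1f} m")
```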
An autonomous robot is a robot that acts without recourse to human control. The first autonomous robots were known as Elmer and Elsie, which were constructed in the late 1940s by W. Grey Walter. They were the first robots in history that were programmed to "think" the way biological brains do and were meant to have free will. Elmer and Elsie were often labeled as tortoises because of how they were shaped and the manner in which they moved. They were capable of phototaxis, which is movement in response to a light stimulus.
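Walter's original valve electronics are not reproduced here; the sketch below is a hypothetical, Braitenberg-style illustration of phototaxis itself, in which each wheel is driven by the light reading from the opposite side so the robot steers toward the brighter source.

```python
# Illustrative sketch (not Walter's original design): a cross-coupled
# phototaxis rule in which each wheel is driven by the light sensor on the
# opposite side, turning the robot toward the brighter source.

def phototaxis_step(left_light: float, right_light: float,
                    gain: float = 1.0) -> tuple[float, float]:
    """Return (left_wheel_speed, right_wheel_speed) for one control step."""
    left_wheel = gain * right_light   # brighter light on the right drives the left wheel faster...
    right_wheel = gain * left_light   # ...so the robot turns toward the light.
    return left_wheel, right_wheel

# (0.8, 0.2): the left wheel is faster, so the robot turns toward the brighter right side.
print(phototaxis_step(left_light=0.2, right_light=0.8))
```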
Defence Research and Development Canada is the science and technology organization of the Department of National Defence (DND), whose purpose is to provide the Canadian Armed Forces (CAF), other government departments, and public safety and national security communities with knowledge and technology.
Network-centric warfare, also called network-centric operations or net-centric warfare, is a military doctrine or theory of war that aims to translate an information advantage, enabled partly by information technology, into a competitive advantage through the computer networking of dispersed forces. It was pioneered by the United States Department of Defense in the 1990s.
Multimodal interaction provides the user with multiple modes of interacting with a system. A multimodal interface provides several distinct tools for input and output of data.
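As a minimal sketch of the idea, assuming hypothetical mode names and a single application command, the following code normalises events from several input modes into one command stream.

```python
# Hypothetical sketch of a multimodal input layer: events from several input
# modes (speech, touch, keyboard) are mapped onto the same application
# command. The mode names and commands are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class InputEvent:
    mode: str      # e.g. "speech", "touch", "keyboard"
    payload: str   # raw data from that mode

def to_command(event: InputEvent) -> str:
    """Map an input event from any mode onto one application command."""
    if event.mode == "speech" and "zoom in" in event.payload.lower():
        return "ZOOM_IN"
    if event.mode == "touch" and event.payload == "pinch_out":
        return "ZOOM_IN"
    if event.mode == "keyboard" and event.payload == "+":
        return "ZOOM_IN"
    return "IGNORE"

for e in (InputEvent("speech", "please zoom in"),
          InputEvent("touch", "pinch_out"),
          InputEvent("keyboard", "+")):
    print(e.mode, "->", to_command(e))
```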
An unmanned ground vehicle (UGV) is a vehicle that operates while in contact with the ground and without an onboard human presence. UGVs can be used for many applications where it may be inconvenient, dangerous, or impossible to have a human operator present. Generally, the vehicle will have a set of sensors to observe the environment, and will either autonomously make decisions about its behavior or pass the information to a human operator at a different location who will control the vehicle through teleoperation.
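A minimal sketch of the control split described above, with an assumed obstacle threshold and made-up command names: the vehicle either chooses its own action from sensor data or defers to a remote operator.

```python
# Hypothetical sketch of a UGV's decision step: act autonomously on sensor
# data, or pass control to a remote operator under teleoperation. The sensor
# values, threshold, and command names are illustrative assumptions.

from typing import Optional

def choose_action(obstacle_range_m: float, teleoperated: bool,
                  operator_command: Optional[str] = None) -> str:
    if teleoperated and operator_command is not None:
        return operator_command        # defer to the remote operator
    if obstacle_range_m < 2.0:         # simple autonomous avoidance rule
        return "STOP"
    return "DRIVE_FORWARD"

print(choose_action(obstacle_range_m=5.0, teleoperated=False))   # DRIVE_FORWARD
print(choose_action(obstacle_range_m=1.0, teleoperated=False))   # STOP
print(choose_action(obstacle_range_m=1.0, teleoperated=True,
                    operator_command="TURN_LEFT"))                # TURN_LEFT
```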
In the United States, geospatial intelligence (GEOINT) is intelligence about human activity on Earth derived from the exploitation and analysis of imagery, signals, or signatures with geospatial information. GEOINT describes, assesses, and visually depicts physical features and geographically referenced activities on the Earth. GEOINT, as defined in the US Code, consists of imagery, imagery intelligence (IMINT), and geospatial information.
Vehicular automation involves the use of mechatronics, artificial intelligence, and multi-agent systems to assist the operator of a vehicle. These features and the vehicles employing them may be labeled as intelligent or smart. A vehicle using automation for difficult tasks, especially navigation, to ease but not entirely replace human input, may be referred to as semi-autonomous, whereas a vehicle relying solely on automation is called robotic or autonomous.
The image fusion process is defined as gathering all the important information from multiple images and combining it into fewer images, usually a single one. This single image is more informative and accurate than any individual source image, and it contains all the necessary information. The purpose of image fusion is not only to reduce the amount of data but also to construct images that are more appropriate and understandable for human and machine perception. In computer vision, multisensor image fusion is the process of combining relevant information from two or more images into a single image; the resulting image is more informative than any of the input images.
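As a toy illustration of pixel-level fusion, assuming two co-registered single-channel images of the same size, the sketch below keeps the per-pixel maximum so that detail visible in either source survives; practical fusion methods (for example wavelet or pyramid fusion) are considerably more sophisticated.

```python
# Minimal sketch of pixel-level multisensor fusion under the assumption that
# the two inputs are aligned, same-sized, single-channel images: each output
# pixel takes the maximum of the two inputs.

import numpy as np

def fuse_max(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Fuse two aligned single-channel images by taking the per-pixel maximum."""
    if img_a.shape != img_b.shape:
        raise ValueError("images must be co-registered and the same size")
    return np.maximum(img_a, img_b)

# Toy 2x2 example: each output pixel keeps the brighter of the two inputs.
a = np.array([[10, 200], [30, 40]], dtype=np.uint8)
b = np.array([[50,  20], [60, 10]], dtype=np.uint8)
print(fuse_max(a, b))   # [[ 50 200]
                        #  [ 60  40]]
```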
Human Factors Integration (HFI) is the process adopted by a number of key industries in Europe to integrate human factors and ergonomics into the systems engineering process. Although each industry has a slightly different domain, the underlying approach is the same.
Guidance, navigation and control (GNC) is a branch of engineering dealing with the design of systems to control the movement of vehicles, especially automobiles, ships, aircraft, and spacecraft. In many cases these functions can be performed by trained humans. However, because of the speed of a rocket's dynamics, for example, human reaction time is too slow to control this movement; therefore, systems (now almost exclusively digital electronic) are used for such control. Even in cases where humans can perform these functions, GNC systems often provide benefits such as alleviating operator workload, smoothing turbulence, and saving fuel. In addition, sophisticated applications of GNC enable automatic or remote control.
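A minimal sketch of the control element of GNC, using an assumed proportional-derivative heading law with illustrative gains; real GNC systems are far more elaborate.

```python
# Illustrative sketch of the "control" part of GNC: a proportional-derivative
# heading controller of the kind that is automated when vehicle dynamics are
# too fast for a human. The gains and state values are assumptions.

def pd_heading_control(heading_error_rad: float, yaw_rate_rad_s: float,
                       kp: float = 2.0, kd: float = 0.5) -> float:
    """Return a steering/rudder command from heading error and yaw rate."""
    return kp * heading_error_rad - kd * yaw_rate_rad_s

# A small heading error with no current rotation yields a gentle correction.
print(pd_heading_control(heading_error_rad=0.1, yaw_rate_rad_s=0.0))  # 0.2
```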
The Australian Research Centre for Aerospace Automation (ARCAA) was a research centre of the Queensland University of Technology. ARCAA conducted research into all aspects of aviation automation, with a particular research focus on autonomous technologies which support the more efficient and safer utilisation of airspace, and the development of autonomous aircraft and on-board sensor systems for a wide range of commercial applications.
Adaptive collaborative control is the decision-making approach used in hybrid models consisting of finite-state machines with functional models as subcomponents to simulate the behavior of systems formed through the partnerships of multiple agents for the execution of tasks and the development of work products. The term “collaborative control” originated from work developed in the late 1990s and early 2000s by Fong, Thorpe, and Baur (1999). According to Fong et al., for robots to function in collaborative control they must be self-reliant, aware, and adaptive. In the literature, the adjective “adaptive” is not always shown, but it is retained in the formal term because it is an important element of collaborative control. The adaptation of traditional applications of control theory in teleoperation initially sought to reduce the sovereignty of “humans as controllers/robots as tools” and instead had humans and robots working as peers, collaborating to perform tasks and to achieve common goals. Early implementations of adaptive collaborative control centered on vehicle teleoperation. Recent uses of adaptive collaborative control cover training, analysis, and engineering applications in teleoperation between humans and multiple robots, multiple robots collaborating among themselves, unmanned vehicle control, and fault-tolerant controller design.
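The following is a minimal, hypothetical finite-state sketch in the spirit of the peer relationship described above: the robot executes autonomously while confident and switches to querying the human operator when it is not. The states, threshold, and messages are illustrative assumptions, not Fong et al.'s implementation.

```python
# Hypothetical finite-state sketch of collaborative control: the robot runs
# autonomously while its confidence is high and queries the human operator,
# as a peer, when confidence drops. All values here are assumptions.

from typing import Optional

EXECUTING, QUERYING = "EXECUTING", "QUERYING_HUMAN"

def next_state(state: str, confidence: float,
               human_reply: Optional[str]) -> str:
    if state == EXECUTING and confidence < 0.5:
        return QUERYING          # robot is unsure: ask the human for advice
    if state == QUERYING and human_reply is not None:
        return EXECUTING         # advice received: resume the task
    return state

state = EXECUTING
for conf, reply in [(0.9, None), (0.3, None), (0.3, "take the left route")]:
    state = next_state(state, conf, reply)
    print(state)
# EXECUTING -> QUERYING_HUMAN -> EXECUTING
```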
The Defence Intelligence Fusion Centre (DIFC) is based at RAF Wyton in Cambridgeshire. Largely created from the staff of the National Imagery Exploitation Centre and then known for several years as the Defence Geospatial Intelligence Fusion Centre, it can trace its history back to clandestine reconnaissance operations at the beginning of the Second World War by Sydney Cotton on behalf of MI6 and then MI4, and the formation of the Allied Central Interpretation Unit at RAF Medmenham.
The Institute of Geomatics (IG) was a public consortium made up of the Autonomous Government of Catalonia and the Polytechnic University of Catalonia, created by Decree Law 256/1997 of the Autonomous Government of Catalonia, on September 30, 1997. It was a founding member of the Associació Catalana d'Entitats de Recerca (ACER).
An autonomous aircraft is an aircraft which flies under the control of automatic systems and needs no intervention from a human pilot. Most autonomous aircraft are unmanned aerial vehicles (UAVs) or drones. However, autonomous control systems are reaching a point where several air taxis and associated regulatory regimes are being developed.
A self-driving truck, also known as an autonomous truck or robo-truck, is an application of self-driving technology aiming to create trucks that can operate without human input. Alongside light-, medium-, and heavy-duty trucks, many companies are developing self-driving technology for semi trucks to automate highway driving in the delivery process.
The Internet of Military Things (IoMT) is a class of Internet of things for combat operations and warfare. It is a complex network of interconnected entities, or "things", in the military domain that continually communicate with each other to coordinate, learn, and interact with the physical environment to accomplish a broad range of activities in a more efficient and informed manner. The concept of IoMT is largely driven by the idea that future military battles will be dominated by machine intelligence and cyber warfare and will likely take place in urban environments. By creating a miniature ecosystem of smart technology capable of distilling sensory information and autonomously governing multiple tasks at once, the IoMT is conceptually designed to offload much of the physical and mental burden that warfighters encounter in a combat setting.
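As a hypothetical sketch of the "interconnected things" idea, the code below has nodes publish sensor readings to a shared in-memory bus and react to one another's messages; the node names, topics, and readings are illustrative assumptions, not any fielded IoMT architecture.

```python
# Hypothetical sketch of interconnected "things": nodes publish readings to a
# shared in-memory bus and react to each other's messages. Names, topics, and
# readings are illustrative assumptions.

from collections import defaultdict

class Bus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

bus = Bus()
# A wearable node reacts to detections published by a perimeter sensor node.
bus.subscribe("perimeter/detections", lambda m: print("wearable alert:", m))
bus.publish("perimeter/detections", {"sensor": "cam-03", "object": "vehicle"})
```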
The NATO Science and Technology Organization (STO) is the primary NATO organization for Defence Science and Technology. Its intent is to maintain NATO's scientific and technological advantage by generating, sharing and utilizing advanced scientific knowledge, technological developments and innovation to support the alliance's core tasks.
Plus is an American autonomous trucking technology company based in Cupertino, California. The company develops Level 4 autonomous trucking technology for commercial freight trucks. In 2019, the company completed the first cross-country driverless freight delivery in the U.S. The company's self-driving system began to be used commercially in 2021.