In transportation, collision avoidance is the maintenance of systems and practices designed to prevent vehicles (such as aircraft, motor vehicles, ships, cranes and trains) from colliding with each other. These systems perceive the environment with sensors and use the collected data to prevent collisions. Collision avoidance is used in autonomous vehicles, aviation, trains and water transport; examples are discussed below.
Collision avoidance requires two things: knowledge of the positions of objects in the vehicle's surroundings, and the location of the vehicle relative to those positions. [2]
The first step in collision avoidance is perception, which can use sensors such as LiDAR, visual cameras, thermal or infrared cameras, or solid-state devices. Sensors are distinguished by the part of the spectrum in which they operate and fall into two types: passive and active. Examples of active sensors are LiDAR, radar and sonar; examples of passive sensors are cameras and thermal sensors. [3]
Passive sensors detect energy emitted by objects around them. They are primarily visual or infrared cameras. Visual cameras operate in visible light, while thermal cameras operate in infrared light (wavelengths of 700 nanometres to 14 micrometres).
Cameras capture pictures of their surroundings from which useful information is extracted. Their advantages are small size, low weight and flexibility; their disadvantages are lower image quality and sensitivity to lighting and weather conditions. They rely on image-processing systems to detect objects.
Infrared sensors use infrared light to detect objects. They are primarily used in low-light conditions and can be combined with visual cameras to overcome the cameras' poor performance in low lighting. They have a lower resolution than traditional cameras. [3]
Active sensors emit radiation and read the reflected radiation. An active sensor has a transmitter and a receiver: the transmitter emits a signal such as a light wave, an electrical signal or an acoustic signal, the signal bounces off an object, and the receiver reads the reflected signal. Active sensors are fast, require less processing power and are less affected by weather.
A radio detection and ranging (radar) sensor transmits a radio signal that bounces back to the radar when it encounters an object. The distance between the object and the radar is calculated from the time the signal takes to return. Radar systems have good resistance to weather conditions.
In a light detection and ranging (LiDAR) sensor, one part emits laser pulses onto a surface while the other reads the reflections, measuring the time each pulse takes to bounce back in order to calculate the distance. Because LiDAR uses a short wavelength, it can detect small objects. LiDAR cannot detect transparent objects such as clear glass, so another sensor, such as an ultrasonic sensor, can be used to overcome this limitation.
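As a rough illustration of how a scanning LiDAR's per-beam range readings become a map of the surroundings, the sketch below converts polar measurements (range and beam angle) into Cartesian points; the function name and sample values are hypothetical.

```python
import math

def scan_to_points(ranges_m, angles_rad):
    """Convert per-beam ranges and beam angles into (x, y) points
    in the sensor frame (x forward, y to the left)."""
    return [
        (r * math.cos(a), r * math.sin(a))
        for r, a in zip(ranges_m, angles_rad)
    ]

# Three beams sweeping from -45 to +45 degrees, each with a return.
points = scan_to_points([10.0, 8.5, 12.0],
                        [-math.pi / 4, 0.0, math.pi / 4])
print(points)  # point cloud of the detected surfaces
```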
Ultrasonic sensors measure the distance between an object and the sensor by sending out sound waves and then listening for the waves reflected back from the object. The sound waves are produced at frequencies higher than what is audible to humans. The distance can be derived using the formula

d = (v × t) / 2

where d is the distance, v is the speed of the wave, and t is the time of flight; the division by two accounts for the wave's round trip to the object and back. [3]
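A worked instance of the formula, as a minimal sketch: the code below assumes an ultrasonic sensor in dry air at roughly 20 °C. Radar and LiDAR sensors use the same relation with v set to the speed of light. The names and values are illustrative.

```python
SPEED_OF_SOUND = 343.0  # m/s; assumption: dry air at about 20 degrees Celsius

def time_of_flight_distance(t_s: float, v_mps: float = SPEED_OF_SOUND) -> float:
    # d = (v * t) / 2: halve the round trip, since the wave travels
    # to the object and back before the echo is heard.
    return v_mps * t_s / 2.0

# An echo returning 5.8 ms after the ping implies an object about 1 m away.
print(time_of_flight_distance(5.8e-3))  # ~0.99 metres
```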
Automatic braking systems give warnings before the autobrake engages. If the driver ignores the warnings, the autonomous brake, or autobrake, applies partial or full braking. It can be active at any speed.
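The staged warn-then-brake behaviour can be sketched with a time-to-collision (TTC) check, where TTC is the gap distance divided by the closing speed. The thresholds and function name below are illustrative assumptions, not values from any production system.

```python
def autobrake_decision(gap_m: float, closing_speed_mps: float) -> str:
    """Pick an action from time-to-collision; thresholds are hypothetical."""
    if closing_speed_mps <= 0:
        return "no action"        # the gap is not shrinking
    ttc_s = gap_m / closing_speed_mps
    if ttc_s < 0.8:
        return "full braking"     # collision imminent, driver has not reacted
    if ttc_s < 1.5:
        return "partial braking"  # warnings ignored, start slowing the car
    if ttc_s < 2.5:
        return "warn driver"      # warnings are given before the autobrake
    return "no action"

print(autobrake_decision(gap_m=30.0, closing_speed_mps=15.0))  # warn driver
```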
A blind-spot monitoring system scans the spaces near the car with radars or cameras to detect any vehicles approaching or hiding in the blind zones. The relevant side-view mirror displays an illuminated symbol when such a vehicle is detected.
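At its simplest, the monitoring amounts to checking whether a detection reported by the radar or camera falls inside a zone beside and slightly behind the car. The sketch below assumes detections arrive in vehicle coordinates (x forward, y to the left, in metres); the zone bounds are hypothetical.

```python
BLIND_ZONE_LEFT = {"x_min": -5.0, "x_max": 1.0, "y_min": 1.0, "y_max": 4.0}

def in_left_blind_zone(x: float, y: float) -> bool:
    """True when a detection lies inside the left blind-spot zone."""
    z = BLIND_ZONE_LEFT
    return z["x_min"] <= x <= z["x_max"] and z["y_min"] <= y <= z["y_max"]

# A vehicle 2 m behind and 2 m to the left would light the mirror symbol.
print(in_left_blind_zone(-2.0, 2.0))  # True
```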
Cross-traffic warning notifies the driver of traffic approaching from the sides while reversing. The alert generally consists of a sound (such as an audible chirp) and a visual signal, either in the outside mirror or on the dash display for the backup camera.
Pedestrian detection can identify pedestrians who cross into the path of the automobile. Certain vehicles will automatically apply the brakes, either fully or partially. Some systems can also detect cyclists.
Adaptive headlights swivel as the driver turns the steering wheel, illuminating the road around bends.
Lane departure warning uses cameras and several sensors to detect lane markings and monitor the distance between the vehicle and those markings. If the vehicle leaves its lane without signaling, the system may sound a beep or use physical feedback such as vibration of the steering wheel or seat. Advanced versions may also apply the brakes or turn the steering wheel to keep the vehicle within the lane. [4]
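As a minimal sketch of the warning logic, assume an upstream vision system already reports the car's lateral offset from the lane centre and the lane width; the names, threshold and margin below are illustrative.

```python
def lane_departure_warning(lateral_offset_m: float,
                           lane_width_m: float,
                           turn_signal_on: bool,
                           margin_m: float = 0.2) -> bool:
    """Warn when the vehicle's centre drifts to within margin_m of a
    lane marking and the driver has not signaled a lane change."""
    if turn_signal_on:
        return False  # signaled lane changes are intentional
    return abs(lateral_offset_m) > lane_width_m / 2.0 - margin_m

# Drifting 1.6 m off-centre in a 3.5 m lane without signaling triggers it.
print(lane_departure_warning(1.6, 3.5, turn_signal_on=False))  # True
```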
There are four methods for performing collision avoidance:
Geometric approaches analyse geometric attributes to make sure that a defined minimum distance between vehicles is not breached.
Force-field methods, also called potential field methods, use the concept of repulsive and attractive force fields to repel an agent from obstacles and attract it towards a target (a minimal sketch of this approach appears after these method descriptions).
Optimisation-based methods calculate a trajectory that avoids collisions using geographical information.
Sense-and-avoid methods detect and avoid individual objects without relying on the attributes of other objects. [3]
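The force-field approach mentioned above can be sketched as follows: the goal exerts an attractive force on the agent, and each obstacle exerts a repulsive force that acts only within an influence radius. The gains, radius and names below are illustrative assumptions.

```python
import math

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=100.0, influence=5.0):
    """Return a unit step direction (dx, dy) for one agent update."""
    # The attractive force pulls the agent straight toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < influence:
            # Repulsion grows sharply as the agent nears the obstacle.
            push = k_rep * (1.0 / d - 1.0 / influence) / d ** 2
            fx += push * dx / d
            fy += push * dy / d
    norm = math.hypot(fx, fy) or 1.0
    return fx / norm, fy / norm

# Heading for (10, 0), the agent is deflected around an obstacle at (2, 0.5).
print(potential_field_step(pos=(0.0, 0.0), goal=(10.0, 0.0),
                           obstacles=[(2.0, 0.5)]))
```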
Depending on when they are deployed, collision avoidance systems can be classified into passive and active systems. [5]
Methods of collision avoidance such as seatbelts and airbags are primarily designed to reduce injury to the driver; these are passive types of collision avoidance. This category also includes rescue systems that notify rescue centers of an accident. [5]
With the addition of camera and radar sensing technologies, active types of collision avoidance can assist or warn the driver, or take control in dangerous situations. [5]
Unmanned aerial vehicles use collision avoidance systems to operate safely. [6] The traffic collision avoidance system (TCAS) is widely used in aviation; [7] it is a universally accepted last resort meant to reduce the chance of collisions. [8]
Collision avoidance is also used in autonomous cars. [1] The aim of a collision avoidance system in vehicles is to prevent collisions, primarily those caused by negligence or blind spots, by providing safety measures. [9]
Automatic Train Protection, an important function of a train control system, helps prevent collisions by managing the speed of the train. [10] Kavach is a collision avoidance system used by Indian Railways. [11]
Automatic identification systems are used for collision avoidance in water transport. [12]
Collision avoidance has been routinely used in spacecraft and space stations (when possible) to ensure their safety. The International Space Station (ISS) performed 14 maneuvers between 2008 and 2014 due to the threat of a collision. [13]
Computer vision tasks include methods for acquiring, processing, analyzing and understanding digital images, and extraction of high-dimensional data from the real world in order to produce numerical or symbolic information, e.g. in the forms of decisions. Understanding in this context means the transformation of visual images into descriptions of the world that make sense to thought processes and can elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory.
Radar is a system that uses radio waves to determine the distance (ranging), direction, and radial velocity of objects relative to the site. It is a radiodetermination method used to detect and track aircraft, ships, spacecraft, guided missiles, and motor vehicles, and to map weather formations and terrain.
Lidar is a method for determining ranges by targeting an object or a surface with a laser and measuring the time for the reflected light to return to the receiver. Lidar may operate in a fixed direction or it may scan multiple directions, in which case it is known as lidar scanning or 3D laser scanning, a special combination of 3-D scanning and laser scanning. Lidar has terrestrial, airborne, and mobile applications.
Ultra-wideband is a radio technology that can use a very low energy level for short-range, high-bandwidth communications over a large portion of the radio spectrum. UWB has traditional applications in non-cooperative radar imaging. Most recent applications target sensor data collection, precise locating, and tracking. UWB support started to appear in high-end smartphones in 2019.
Infrared thermography (IRT), also known as thermal video or thermal imaging, is a process in which a thermal camera captures and creates an image of an object using the infrared radiation emitted from the object; it is an example of infrared imaging science. Thermographic cameras usually detect radiation in the long-infrared range of the electromagnetic spectrum and produce images of that radiation, called thermograms. Since infrared radiation is emitted by all objects with a temperature above absolute zero according to the black body radiation law, thermography makes it possible to see one's environment with or without visible illumination. The amount of radiation emitted by an object increases with temperature; therefore, thermography allows one to see variations in temperature. When viewed through a thermal imaging camera, warm objects stand out well against cooler backgrounds; humans and other warm-blooded animals become easily visible against the environment, day or night. As a result, thermography is particularly useful to the military and other users of surveillance cameras.
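The relationship between temperature and emitted radiation can be made concrete with the Stefan-Boltzmann law, j = σT⁴, which gives the power radiated per unit area of an ideal black body; the example below is a simple illustration, not part of any thermography system.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_emittance(temperature_k: float) -> float:
    """Radiated power per unit area of an ideal black body, j = sigma * T^4."""
    return SIGMA * temperature_k ** 4

# Skin at ~305 K radiates noticeably more per unit area than a 280 K
# background, which is why people stand out in thermograms.
print(blackbody_emittance(305.0))  # ~491 W/m^2
print(blackbody_emittance(280.0))  # ~349 W/m^2
```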
Radar cross-section (RCS), denoted σ, also called radar signature, is a measure of how detectable an object is by radar. A larger RCS indicates that an object is more easily detected.
Advanced driver-assistance systems (ADAS) are technologies that assist drivers with the safe operation of a vehicle. Through a human-machine interface, ADAS increase car and road safety. ADAS use automated technology, such as sensors and cameras, to detect nearby obstacles or driver errors, and respond accordingly. ADAS can enable various levels of autonomous driving.
Acoustic homing is the process in which a system uses the sound or acoustic signals of a target or destination to guide a moving object. There are two types of acoustic homing: passive acoustic homing and active acoustic homing. Objects using passive acoustic homing rely on detecting acoustic emissions produced by the target. Conversely, objects using active acoustic homing make use of sonar to emit a signal and detect its reflection off the target. The signal detected is then processed by the system to determine the proper response for the object. Acoustic homing is useful for applications where other forms of navigation and tracking can be ineffective. It is commonly used in environments where radio or GPS signals can not be detected, such as underwater.
Beamforming or spatial filtering is a signal processing technique used in sensor arrays for directional signal transmission or reception. This is achieved by combining elements in an antenna array in such a way that signals at particular angles experience constructive interference while others experience destructive interference. Beamforming can be used at both the transmitting and receiving ends in order to achieve spatial selectivity. The improvement compared with omnidirectional reception/transmission is known as the directivity of the array.
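A narrowband delay-and-sum receive beamformer for a uniform linear array illustrates the idea: phase-shift each element so a wavefront arriving from the chosen steering angle adds constructively, then sum. The simulation below is a self-contained sketch with assumed geometry and signals.

```python
import numpy as np

def delay_and_sum(signals, spacing_m, wavelength_m, steer_deg):
    """signals: complex array of shape (num_elements, num_samples)."""
    n = signals.shape[0]
    phase = 2 * np.pi * spacing_m / wavelength_m * np.sin(np.radians(steer_deg))
    weights = np.exp(-1j * phase * np.arange(n))  # per-element phase shift
    return weights @ signals / n                  # coherent sum across the array

# Simulate a plane wave arriving at +20 degrees on an 8-element,
# half-wavelength-spaced array carrying a narrowband tone.
n, wavelength, d = 8, 1.0, 0.5
steering_vec = np.exp(1j * 2 * np.pi * d / wavelength
                      * np.sin(np.radians(20.0)) * np.arange(n))[:, None]
tone = np.exp(1j * 2 * np.pi * 0.01 * np.arange(100))[None, :]
samples = steering_vec * tone

on_beam = delay_and_sum(samples, d, wavelength, steer_deg=20.0)
off_beam = delay_and_sum(samples, d, wavelength, steer_deg=-40.0)
print(abs(on_beam).mean(), abs(off_beam).mean())  # ~1.0 vs ~0.02
```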
A motion detector is an electrical device that utilizes a sensor to detect nearby motion. Such a device is often integrated as a component of a system that automatically performs a task or alerts a user of motion in an area. They form a vital component of security, automated lighting control, home control, energy efficiency, and other useful systems. Motion detection can be achieved by either mechanical or electronic methods. When it is done by natural organisms, it is called motion perception.
An image sensor or imager is a sensor that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.
Obstacle avoidance, in robotics, is a critical aspect of autonomous navigation and control systems. It is the capability of a robot or an autonomous system/machine to detect and circumvent obstacles in its path to reach a predefined destination. This technology plays a pivotal role in various fields, including industrial automation, self-driving cars, drones, and even space exploration. Obstacle avoidance enables robots to operate safely and efficiently in dynamic and complex environments, reducing the risk of collisions and damage.
A proximity sensor is a sensor able to detect the presence of nearby objects without any physical contact.
An indoor positioning system (IPS) is a network of devices used to locate people or objects where GPS and other satellite technologies lack precision or fail entirely, such as inside multistory buildings, airports, alleys, parking garages, and underground locations.
Electro-optical MASINT is a subdiscipline of Measurement and Signature Intelligence (MASINT) and refers to intelligence gathering activities which bring together disparate elements that do not fit within the definitions of Signals Intelligence (SIGINT), Imagery Intelligence (IMINT), or Human Intelligence (HUMINT).
Automatic target recognition (ATR) is the ability of an algorithm or device to recognize targets or other objects based on data obtained from sensors.
A collision avoidance system (CAS), also known as a pre-crash system, forward collision warning system (FCW), or collision mitigation system, is an advanced driver-assistance system designed to prevent or reduce the severity of a collision. In its basic form, a forward collision warning system monitors a vehicle's speed, the speed of the vehicle in front of it, and the distance between the vehicles, so that it can provide a warning to the driver if the vehicles get too close, potentially helping to avoid a crash. Technologies and sensors used to detect an imminent crash include radar (all-weather), and sometimes laser (LIDAR) and cameras. GPS sensors can detect fixed dangers such as approaching stop signs through a location database. Pedestrian detection can also be a feature of these types of systems.
An automotive night vision system uses a thermographic camera to increase a driver's perception and seeing distance in darkness or poor weather beyond the reach of the vehicle's headlights. Such systems are offered as optional equipment on certain premium vehicles. The technology was first introduced in 2000 on the Cadillac DeVille. It is based on night vision devices (NVDs), which generally denote any electronically enhanced optical device operating in one of three modes: image enhancement, thermal imaging, or active illumination. An automotive night vision system combines NVDs such as infrared cameras with GPS, lidar, and radar, among others, to sense and detect objects.
Lidar has a wide range of applications; one use is in traffic enforcement, in particular speed limit enforcement, where it has been gradually replacing radar since 2000. Current devices are designed to automate the entire process of speed detection, vehicle identification, driver identification and evidentiary documentation.
LEDDAR is a proprietary technology owned by LeddarTech. It uses the time of flight of light signals and signal processing algorithms to detect, locate, and measure objects in its field of view.