Collision avoidance in transportation

A simple collision avoidance system (finite-state machine).

In transportation, collision avoidance is the maintenance of systems and practices designed to prevent vehicles (such as aircraft, motor vehicles, ships, cranes and trains) from colliding with each other. These systems perceive the environment with sensors and use the collected data to prevent collisions. Collision avoidance is used in autonomous vehicles, aviation, rail and water transport.
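The simple collision avoidance system shown in the figure above is a finite-state machine. A minimal sketch of such a warn-then-brake controller follows; the state names and distance thresholds are illustrative assumptions, not taken from any particular system.

```python
# Minimal collision avoidance FSM sketch.
# States and thresholds are illustrative assumptions.
def avoidance_state(distance_m: float) -> str:
    """Return CRUISE, WARN or BRAKE based on the distance to the
    nearest detected obstacle."""
    if distance_m < 10.0:   # assumed braking threshold (metres)
        return "BRAKE"
    if distance_m < 30.0:   # assumed warning threshold (metres)
        return "WARN"
    return "CRUISE"

# Feed the controller a sequence of sensor readings.
for d in (50.0, 25.0, 8.0, 40.0):
    print(d, avoidance_state(d))
```

Real systems typically add hysteresis and track closing speed as well as distance, but the warn-before-brake escalation follows this pattern.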


Collision avoidance requires knowing the positions of objects in the vehicle's surroundings and the vehicle's location relative to them.

Technology

Sensor types and their use in collision avoidance.

Many collision avoidance systems need two things: a way to perceive the environment, and a way to act on that information. [2]

The first step in collision avoidance is perception, which can use sensors such as LiDAR, visual cameras, thermal or infrared cameras, or solid-state devices. Sensors can be categorised by the part of the electromagnetic spectrum they use, and they fall into two types: passive and active. Examples of active sensors are LiDAR, radar and sonar; examples of passive sensors are visual cameras and thermal sensors. [3]

Passive sensors

Passive sensors detect energy emitted by objects around them. They are primarily visual or infrared cameras. Visual cameras operate in visible light, and thermal cameras operate in infrared light (wavelengths of 700 nanometres to 14 micrometres).

Visual Cameras

Cameras capture pictures of their surroundings from which useful information is extracted. Their advantages are small size, low weight and flexibility. Their disadvantages are lower image quality and sensitivity to lighting and weather conditions. They rely on image-processing systems to detect objects.

Infrared cameras

Infrared sensors use infrared light to detect objects. They are primarily used in low-light conditions and can be paired with visual cameras to overcome the poor performance of visual cameras in low lighting. They have a lower resolution than traditional cameras. [3]

Active sensors

Active sensors emit radiation and read the reflected radiation. An active sensor has a transmitter and a receiver. The transmitter emits a signal such as a light wave, a radio wave, or an acoustic signal; this signal bounces off an object, and the receiver reads the reflection. Active sensors are fast, require less processing power and are less affected by weather.

Radar

A radio detection and ranging (radar) sensor transmits a radio signal which bounces back to the radar when it encounters an object. The distance between the object and the radar is calculated from the time the signal takes to return. Radar systems have good resistance to weather conditions.

Lidar

In a light detection and ranging (LiDAR) sensor, one part emits laser pulses onto the surface and the other reads the reflection to measure the time it took for each pulse to bounce back in order to calculate the distance. Since LiDAR uses a short wavelength, it can detect small objects. A LiDAR cannot detect transparent objects such as clear glass. Another sensor, such as an ultrasonic sensor, can be used to overcome this issue.

Sonar

Ultrasonic sensors measure the distance between an object and the sensor by emitting sound waves and listening for their reflection from the object. The sound waves are produced at frequencies above the range audible to humans. Because the pulse travels to the object and back, the distance can be derived using the formula:

d = vt / 2

where d is the distance, v is the speed of the wave, and t is the time of flight. [3]
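The time-of-flight distance calculation can be checked numerically. The speed of sound used below (343 m/s in air at 20 °C) is a standard reference value; the same half-round-trip relation applies to radar and LiDAR with the speed of light in place of the speed of sound.

```python
def tof_distance(v: float, t: float) -> float:
    """Distance from time of flight: the pulse travels out and back,
    so the one-way distance is v * t / 2."""
    return v * t / 2

SPEED_OF_SOUND = 343.0   # m/s in air at 20 degrees C
echo_time = 0.01         # seconds; example echo reading
print(tof_distance(SPEED_OF_SOUND, echo_time))  # 1.715 (metres)
```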

Autobrake

An autobrake system first warns the driver; if the warnings are ignored, the autonomous brake, or autobrake, applies partial or full braking. It can be active at any speed.
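A common way to stage warnings before automatic braking is time-to-collision (TTC): the gap to the obstacle divided by the closing speed. The escalation below is a sketch; the threshold values are illustrative assumptions, not calibrated figures.

```python
def autobrake_action(gap_m: float, closing_speed_mps: float) -> str:
    """Escalate from warning to partial to full braking as the
    time-to-collision (TTC) shrinks. Thresholds are illustrative."""
    if closing_speed_mps <= 0:
        return "NONE"            # not closing on the obstacle
    ttc = gap_m / closing_speed_mps
    if ttc < 1.0:                # assumed full-brake threshold (seconds)
        return "FULL_BRAKE"
    if ttc < 2.0:                # assumed partial-brake threshold
        return "PARTIAL_BRAKE"
    if ttc < 3.0:                # assumed warning threshold
        return "WARN"
    return "NONE"

print(autobrake_action(30.0, 20.0))  # TTC = 1.5 s -> PARTIAL_BRAKE
```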

Blind spot monitoring

Steps used in collision avoidance.

A blind-spot monitoring system scans the spaces near the car with radar or cameras to detect vehicles that are approaching or hiding in blind zones. When a vehicle is detected, an illuminated symbol appears in the relevant side-view mirror.

Rear cross-traffic warning

Rear cross-traffic warning notifies the driver of traffic approaching from the sides while reversing. The alert generally consists of a sound (such as an audible chirp) and a visual signal, shown either in the outside mirror or on the dash display for the backup camera.

Pedestrian detection and braking

Pedestrian detection can identify a person crossing into the path of the vehicle. Certain vehicles will automatically apply the brakes, either fully or partially. Some systems can also detect cyclists.

Adaptive headlights

Adaptive headlights pivot as the driver turns the steering wheel, illuminating the road around bends.

Lane departure warning (LDW)

Lane departure warning uses cameras and several sensors to detect lane markings and monitor the distance between the vehicle and them. If the vehicle leaves the lane without signalling, the system may sound a beep or give physical feedback such as vibrating the steering wheel or seat. Advanced versions may also apply the brakes or steer to keep the vehicle within the lane. [4]

Classification

By methods used

Four classes of methods are used to perform collision avoidance:

Geometric methods

Geometric approaches analyse geometric attributes to make sure that a defined minimum distance between vehicles is not breached.
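A minimal geometric check, assuming straight-line motion at constant velocity: project both vehicles forward over a time horizon and verify that their minimum separation stays above a safety threshold. The horizon, time step and threshold below are illustrative.

```python
import math

def min_separation(p1, v1, p2, v2, horizon=5.0, dt=0.1):
    """Smallest predicted distance between two vehicles over a time
    horizon, assuming constant velocities. Positions and velocities
    are (x, y) tuples in metres and metres per second."""
    best = float("inf")
    steps = int(horizon / dt) + 1
    for i in range(steps):
        t = i * dt
        dx = (p1[0] + v1[0] * t) - (p2[0] + v2[0] * t)
        dy = (p1[1] + v1[1] * t) - (p2[1] + v2[1] * t)
        best = min(best, math.hypot(dx, dy))
    return best

# Two vehicles converging on the same point: the 5 m minimum
# separation would be breached, so avoidance is required.
d = min_separation((0, 0), (10, 0), (50, 5), (0, -1))
print(d < 5.0)  # True
```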

Force field methods

Force-field methods, also called potential field methods, use the concept of a repulsive or attractive force field to repel an agent from an obstacle or attract it to a target.
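One step of a potential-field method can be sketched as the sum of an attractive pull toward the goal and a repulsive push away from each obstacle inside an influence radius. The gains and radius below are illustrative assumptions; the repulsive term follows the common inverse-distance form.

```python
import math

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=100.0, influence=10.0):
    """Return a 2-D force vector: attraction toward the goal plus
    repulsion from each obstacle closer than the influence radius."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < influence:
            # Repulsion grows sharply as the obstacle gets closer.
            mag = k_rep * (1.0 / d - 1.0 / influence) / (d * d)
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy

fx, fy = potential_field_step((0, 0), (20, 0), [(5, 0.5)])
print(fx, fy)  # pulled toward the goal, deflected away from the obstacle
```

Moving the agent a small step along this force vector at each iteration steers it around obstacles toward the target; a known limitation of the approach is that the agent can get stuck in local minima of the combined field.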

Optimisation based methods

Optimisation based methods calculate a trajectory that avoids collisions using geographical information.
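Optimisation-based planning can be sketched as choosing, from a set of candidate trajectories, the lowest-cost one that stays clear of known obstacles. The candidate headings, sampling and cost function below are a toy discretisation, not a production planner.

```python
import math

def pick_trajectory(start, goal, obstacles, clearance=2.0):
    """Choose the candidate heading (degrees) whose straight path
    avoids all obstacles while minimising distance-to-goal cost."""
    best, best_cost = None, float("inf")
    for heading_deg in range(-60, 61, 10):       # candidate headings
        h = math.radians(heading_deg)
        # Sample points along this candidate straight-line trajectory.
        points = [(start[0] + t * math.cos(h), start[1] + t * math.sin(h))
                  for t in range(1, 21)]
        if any(math.hypot(px - ox, py - oy) < clearance
               for px, py in points for ox, oy in obstacles):
            continue                              # collides: infeasible
        end = points[-1]
        cost = math.hypot(goal[0] - end[0], goal[1] - end[1])
        if cost < best_cost:
            best, best_cost = heading_deg, cost
    return best

# Obstacle directly on the straight path: the planner picks a
# heading that skirts it.
print(pick_trajectory((0, 0), (20, 0), [(10, 0)]))
```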

Sense and avoid methods

Sense and avoid methods detect and avoid individual obstacles as they are encountered, without requiring information about other objects in the environment. [3]

By activity

Depending on when they are deployed, collision avoidance systems can be classified into passive and active systems. [5]

Passive types

Passive collision avoidance measures, such as seatbelts and airbags, are primarily designed to reduce injury to occupants. This category also includes rescue systems that notify rescue centres of an accident. [5]

Active types

With the addition of camera and radar sensing technologies, active types of collision avoidance can assist or warn the driver, or take control in dangerous situations. [5]

Uses

In aviation

An example of a collision avoidance system.

Unmanned aerial vehicles use collision avoidance systems to operate safely. [6] The traffic collision avoidance system (TCAS) is widely used. [7] It is a universally accepted last resort meant to reduce the chance of mid-air collisions. [8]

In autonomous driving

Collision avoidance is also used in autonomous cars. [1] The aim of a collision avoidance system in road vehicles is to prevent collisions, primarily those caused by driver negligence or blind spots. [9]

In trains

Automatic Train Protection, an important function of a train control system, helps prevent collisions by managing the speed of the train. [10] Kavach is a collision avoidance system used by Indian Railways. [11]

In ships and other water transport

Automatic identification systems are used for collision avoidance in water transport. [12]

In spacecraft and space stations

Collision avoidance has been routinely used by spacecraft and space stations (when possible) to ensure their safety. The International Space Station (ISS) performed 14 collision avoidance maneuvers between 2008 and 2014. [13]


References

  1. Hu, Xinyuan; Ye, Naijia (2024-01-22). "Design of Active Collision Avoidance Algorithm for Driverless Cars Based on Machine Vision". 2023 IEEE 6th International Conference on Information Systems and Computer Aided Education (ICISCAE). pp. 1042–1047. doi:10.1109/ICISCAE59047.2023.10392527. ISBN 979-8-3503-1344-4.
  2. Mukunth, Vasudevan (2024-08-12). "The technologies that keep vehicles from bumping into each other | Explained". The Hindu. ISSN   0971-751X . Retrieved 2024-08-23.
  3. Yasin, Jawad N; Mohamed, Sherif A S; Haghbayan, Mohammad-Hashem; Heikkonen, Jukka; Tenhunen, Hannu; Plosila, Juha (2020-06-04). "Unmanned Aerial Vehicles (UAVs): Collision Avoidance Systems and Approaches". IEEE Access. 8: 105139–105155. Bibcode:2020IEEEA...8j5139Y. doi:10.1109/ACCESS.2020.3000064. ISSN 2169-3536.
  4. Linkov, Jon (17 December 2015). "Collision-Avoidance Systems Are Changing the Look of Car Safety". Consumer Reports. Retrieved 2024-09-04.
  5. Zhao, Zhiguo; Zhou, Liangjie; Zhu, Qiang; Luo, Yugong; Li, Keqiang (2017-10-05). "A review of essential technologies for collision avoidance assistance systems". Advances in Mechanical Engineering. 9 (10): 168781401772524. doi:10.1177/1687814017725246. ISSN 1687-8140.
  6. Tang, Jun; Lao, Songyang; Wan, Yu (2021-09-01). "Systematic Review of Collision-Avoidance Approaches for Unmanned Aerial Vehicles". IEEE Systems Journal. 16 (3): 4356–4367. doi:10.1109/JSYST.2021.3101283. ISSN   1937-9234.
  7. He, Donglin; Yang, Youzhi; Deng, Shengji; Zheng, Lei; Su, Zhuolin; Lin, Zi (2023-10-15). "Comparison of Collision Avoidance Logic between ACAS X and TCAS II in General Aviation Flight". 2023 IEEE 5th International Conference on Civil Aviation Safety and Information Technology (ICCASIT). pp. 568–573. doi:10.1109/ICCASIT58768.2023.10351533. ISBN   979-8-3503-1060-3.
  8. Sun, Jiayi; Tang, Jun; Lao, Songyang (2017). "Collision Avoidance for Cooperative UAVs With Optimized Artificial Potential Field Algorithm". IEEE Access. 5: 18382–18390. Bibcode:2017IEEEA...518382S. doi: 10.1109/ACCESS.2017.2746752 . ISSN   2169-3536.
  9. Rammohan, A.; Chavhan, Suresh; Chidambaram, Ramesh Kumar; Manisaran, N.; Kumar, K. V. Pavan (2022), Hassanien, Aboul Ella; Gupta, Deepak; Khanna, Ashish; Slowik, Adam (eds.), "Automotive Collision Avoidance System: A Review", Virtual and Augmented Reality for Automobile Industry: Innovation Vision and Applications, Cham: Springer International Publishing, pp. 1–19, doi:10.1007/978-3-030-94102-4_1, ISBN   978-3-030-94102-4 , retrieved 2024-08-18
  10. Oh, Sehchan; Yoon, Yongki; Kim, Yongkyu (2012-06-21). "Automatic Train Protection Simulation for Radio-Based Train Control System". 2012 International Conference on Information Science and Applications. IEEE. pp. 1–4. doi:10.1109/ICISA.2012.6220965. ISBN   978-1-4673-1401-5. ISSN   2162-9048.
  11. "Explained: Kavach, the Indian technology that can prevent two trains from colliding". The Indian Express. 2022-03-04. Retrieved 2024-08-21.
  12. Chen, Dejun; Dai, Chu; Wan, Xuechao; Mou, Junmin (2015-09-03). "A research on AIS-based embedded system for ship collision avoidance". 2015 International Conference on Transportation Information and Safety (ICTIS). IEEE. pp. 512–517. doi:10.1109/ICTIS.2015.7232141. ISBN   978-1-4799-8694-1 . Retrieved 2024-08-21.
  13. Rongzhi, Zhang; Kaizhong, Yang (2020-03-03). Spacecraft Collision Avoidance Technology. Academic Press. p. 6. ISBN   978-0-12-818241-3.