Robotic sensing

Robotic sensing is a subarea of robotics science that gives robots the ability to sense their environments; the sensed input is typically used as feedback that lets a robot adjust its behavior. Robot sensing includes the ability to see, [1] [2] [3] touch, [4] [5] [6] hear [7] and move, [8] [9] [10] together with the algorithms that process and make use of environmental feedback and sensory data. Robot sensing is important in applications such as vehicular automation, robotic prosthetics, and industrial, medical, entertainment and educational robots.

Vision

Method

Visual sensing systems can be based on a variety of technologies and methods, including cameras, sonar, lasers and radio frequency identification (RFID) [1] technology. All four methods involve three procedures: sensation, estimation, and matching.

Image processing

Image quality is important in applications that require excellent robotic vision. Algorithms based on the wavelet transform, used for fusing images of different spectra and different foci, improve image quality. [2] Robots can then gather more accurate information from the improved image.
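
As a minimal sketch of wavelet-based fusion (a single-level Haar decomposition in plain NumPy; real systems use deeper, smoother wavelets and registered multi-spectral inputs), the coarse bands of the two images are averaged while, for each detail coefficient, the stronger response from either source is kept:

```python
import numpy as np

def haar2d(img):
    """Single-level 2D Haar decomposition into (LL, LH, HL, HH) subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row details
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d: rebuild the full-resolution image."""
    h, w = ll.shape
    a = np.empty((h, 2 * w)); d = np.empty((h, 2 * w))
    a[:, 0::2] = ll + lh; a[:, 1::2] = ll - lh
    d[:, 0::2] = hl + hh; d[:, 1::2] = hl - hh
    out = np.empty((2 * h, 2 * w))
    out[0::2, :] = a + d; out[1::2, :] = a - d
    return out

def fuse(img_a, img_b):
    """Fuse two registered images: average the coarse band, keep the
    larger-magnitude detail coefficient from either source."""
    A, B = haar2d(img_a), haar2d(img_b)
    ll = (A[0] + B[0]) / 2.0
    details = [np.where(np.abs(x) >= np.abs(y), x, y)
               for x, y in zip(A[1:], B[1:])]
    return ihaar2d(ll, *details)
```

Fusing a sharp and a blurred view of the same scene this way tends to retain the locally sharper detail from each input.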

Usage

Visual sensors help robots to identify the surrounding environment and take appropriate action. [3] Robots analyze the image of the immediate environment based on data input from the visual sensor. The result is compared to the ideal, intermediate or end image, so that appropriate movement or action can be determined to reach the intermediate or final goal.

Touch

Robot skin

Electronic skin refers to flexible, stretchable and self-healing electronics that are able to mimic the functionalities of human or animal skin. [12] [13] This broad class of materials often has sensing abilities intended to reproduce the capability of human skin to respond to environmental factors such as changes in heat and pressure. [12] [13] [14] [15]

Advances in electronic skin research focus on designing materials that are stretchy, robust, and flexible. Research in the individual fields of flexible electronics and tactile sensing has progressed greatly; electronic skin design, however, attempts to bring together advances from many areas of materials research without sacrificing the individual benefits of each field. [16] Successfully combining flexible and stretchable mechanical properties with sensors and the ability to self-heal would open the door to many possible applications, including soft robotics, prosthetics, artificial intelligence and health monitoring. [12] [16] [17] [18]

Recent advances in the field of electronic skin have focused on incorporating green materials ideals and environmental awareness into the design process. Because one of the main challenges facing electronic skin development is the material's ability to withstand mechanical strain while maintaining its sensing ability or electronic properties, recyclability and self-healing properties are especially critical in the future design of new electronic skins. [19]

Types and examples

Examples of the state of progress in the field of robot skins as of mid-2022 include a robotic finger covered in a type of manufactured living human skin, [20] [21] an electronic skin giving biological skin-like haptic sensations and touch/pain-sensitivity to a robotic hand, [22] [23] a system combining an electronic skin and a human-machine interface that enables remote sensed tactile perception as well as wearable or robotic sensing of many hazardous substances and pathogens, [24] [25] and a multilayer tactile-sensor, hydrogel-based robot skin. [26] [27]

Tactile discrimination

Early robotic prosthetic hand (the "Belgrade hand"), made in 1963 and on open public display at the main shopping mall in Belgrade.

As robots and prosthetic limbs become more complex, the need for sensors capable of detecting touch with high tactile acuity grows. There are many types of tactile sensors used for different tasks, falling into three broad types. [28] The first, single-point sensors, can be compared to a single cell or a whisker and detect very local stimuli. The second type is the high-spatial-resolution sensor, which can be compared to a human fingertip and is essential for tactile acuity in robotic hands. The third is the low-spatial-resolution sensor, with tactile acuity similar to the skin on one's back or arm. [28] Placed thoughtfully across the surface of a prosthetic or robot, these sensors can give it the ability to sense touch in ways similar to, if not better than, its human counterpart. [28]

Signal processing

Touch sensory signals can be generated by the robot's own movements, so accurate operation requires identifying only the external tactile signals. Earlier solutions employed the Wiener filter, which relies on prior knowledge of signal statistics that are assumed to be stationary. A more recent solution applies an adaptive filter to the robot's logic. [4] It enables the robot to predict the sensor signals that its internal motions will produce and to screen these self-generated signals out. The new method improves contact detection and reduces false interpretation.
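
A minimal sketch of this idea, assuming a least-mean-squares (LMS) adaptive filter (the specific filter and the signal names here are illustrative, not the cited paper's implementation): the filter learns to predict the sensor signal from the robot's own motor signal and subtracts that prediction, so the residual is dominated by external contact.

```python
import numpy as np

def lms_cancel(motor, sensor, n_taps=8, mu=0.05):
    """Adaptive (LMS) cancellation of self-generated sensor signals.

    `motor` is a reference signal correlated with the robot's own motion;
    `sensor` is the raw tactile signal. The filter learns the transfer
    from motion to sensor and subtracts its prediction, leaving the
    external (contact-driven) component in the residual.
    """
    w = np.zeros(n_taps)
    residual = np.zeros_like(sensor)
    for t in range(len(sensor)):
        x = motor[max(0, t - n_taps + 1):t + 1][::-1]  # recent motor history
        x = np.pad(x, (0, n_taps - len(x)))            # zero-pad early steps
        e = sensor[t] - w @ x                          # prediction error
        w += mu * e * x                                # LMS weight update
        residual[t] = e                                # external component
    return residual
```

After the filter converges, the residual is near zero whenever the sensor signal is fully explained by the robot's own motion.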

Usage

Touch patterns enable robots to interpret human emotions in interactive applications. [29] Four measurable features (force, contact time, repetition, and contact area change) can effectively categorize touch patterns through a temporal decision tree classifier, which accounts for time delays, and associate them with human emotions with up to 83% accuracy. [5] A Consistency Index [5] is applied at the end to evaluate the system's level of confidence and prevent inconsistent reactions.
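
A toy illustration of classifying touch from the four features named above (the thresholds and gesture labels are hypothetical; the actual temporal decision tree is learned from data and models time delays explicitly):

```python
# Hypothetical thresholds and gesture labels, for illustration only.
def classify_touch(force, contact_time, repetition, area_change):
    """Toy decision tree over force (N), contact time (s),
    repetition count, and contact area change (normalized)."""
    if repetition >= 3 and contact_time < 0.3:
        return "tap"      # short, repeated contacts
    if force > 5.0 and contact_time < 0.5:
        return "hit"      # strong, brief contact
    if area_change > 0.5 and contact_time >= 0.5:
        return "stroke"   # moving contact over a longer time
    return "hold"         # sustained, static contact
```

A real classifier would map such gesture labels onward to emotion categories and report a confidence score alongside each decision.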

Robots use touch signals to map the profile of a surface in a hostile environment, such as the inside of a water pipe. Traditionally, a predetermined path was programmed into the robot. Now, with the integration of touch sensors, the robot first acquires a random data point; its algorithm [6] then determines the ideal position of the next measurement according to a set of predefined geometric primitives. This improves efficiency by 42%. [5]

In recent years, using touch as a stimulus for interaction has been the subject of much study. In 2010, the robot seal PARO was built, which reacts to many stimuli from human interaction, including touch. The therapeutic benefits of such human-robot interaction are still being studied but have shown very positive results. [30]

Hearing

Signal processing

Accurate audio sensors require a low internal noise contribution. Traditionally, audio sensors combine acoustical arrays and microphones to reduce the internal noise level. Recent solutions also incorporate piezoelectric devices. [7] These passive devices use the piezoelectric effect to transform force into voltage, so that the vibration causing the internal noise can be eliminated. On average, internal noise can be reduced by up to about 7 dB. [7]

Robots may interpret stray noise as speech instructions. Current voice activity detection (VAD) systems use the complex spectrum circle centroid (CSCC) method and a maximum signal-to-noise ratio (SNR) beamformer. [31] Because humans usually look at their partners when conversing, a VAD system with two microphones enables the robot to locate instructional speech by comparing the signal strengths of the two microphones. Current systems can cope with background noise generated by televisions and other sound sources located to the sides.
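
A crude sketch of the two-microphone idea (the 3 dB imbalance threshold is an assumption for illustration, and real VAD combines this cue with spectral analysis): a frame is treated as frontal, instructional speech only when both channels carry comparable energy, while a strong left/right imbalance suggests a lateral source such as a television.

```python
import numpy as np

def frontal_speech(mic_left, mic_right, ratio_db=3.0):
    """Return True when the two channels carry comparable energy,
    i.e. the source is roughly in front of the robot rather than
    off to one side."""
    def rms_db(x):
        # mean power in dB; small epsilon guards against log(0)
        return 10.0 * np.log10(np.mean(np.square(x)) + 1e-12)
    return abs(rms_db(mic_left) - rms_db(mic_right)) < ratio_db
```

A lateral source reaches the nearer microphone noticeably louder, so its interchannel level difference exceeds the threshold and the frame is rejected.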

Usage

Robots can perceive emotions through the way we talk and through associated characteristics and features. Acoustic and linguistic features are generally used to characterize emotions. Combining seven acoustic features with four linguistic features improves recognition performance compared to using only one set of features. [32]

Olfaction

Machine olfaction is the automated simulation of the sense of smell. An emerging application in modern engineering, it involves the use of robots or other automated systems to analyze air-borne chemicals. Such an apparatus is often called an electronic nose or e-nose. The development of machine olfaction is complicated by the fact that e-nose devices to date have responded to a limited number of chemicals, whereas odors are produced by unique sets of (potentially numerous) odorant compounds. The technology, though still in the early stages of development, promises many applications, such as: [33] quality control in food processing, detection and diagnosis in medicine, [34] detection of drugs, explosives and other dangerous or illegal substances, [35] disaster response, and environmental monitoring.

One type of proposed machine olfaction technology is via gas sensor array instruments capable of detecting, identifying, and measuring volatile compounds. However, a critical element in the development of these instruments is pattern analysis, and the successful design of a pattern analysis system for machine olfaction requires a careful consideration of the various issues involved in processing multivariate data: signal-preprocessing, feature extraction, feature selection, classification, regression, clustering, and validation. [36] Another challenge in current research on machine olfaction is the need to predict or estimate the sensor response to aroma mixtures. [37] Some pattern recognition problems in machine olfaction such as odor classification and odor localization can be solved by using time series kernel methods. [38]
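
As a minimal sketch of the classification stage of such a pattern analysis pipeline (nearest-centroid matching on normalized gas-sensor array responses; a real system adds the preprocessing, feature selection, regression and validation steps listed above):

```python
import numpy as np

def fit_centroids(X, y):
    """Per-class mean response of a gas-sensor array. Feature
    extraction here is just unit-norm scaling of each sample;
    real e-nose pipelines do far more preprocessing."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return {c: Xn[y == c].mean(axis=0) for c in np.unique(y)}

def classify(centroids, x):
    """Assign an odor sample to the class with the nearest centroid."""
    xn = x / np.linalg.norm(x)
    return min(centroids, key=lambda c: np.linalg.norm(xn - centroids[c]))
```

Each odor class leaves a characteristic pattern across the array, so a new sample is labeled by whichever learned pattern it most resembles.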

Taste

The electronic tongue is an instrument that measures and compares tastes. The IUPAC technical report defines an "electronic tongue" as an analytical instrument comprising an array of non-selective chemical sensors with partial specificity to different solution components, together with an appropriate pattern recognition instrument capable of recognizing the quantitative and qualitative composition of simple and complex solutions. [39] [40]

Chemical compounds responsible for taste are detected by human taste receptors. Similarly, the multi-electrode sensors of electronic instruments detect the same dissolved organic and inorganic compounds. Like human receptors, each sensor has a spectrum of reactions different from the other. The information given by each sensor is complementary, and the combination of all sensors' results generates a unique fingerprint. Most of the detection thresholds of sensors are similar to or better than human receptors.

In the biological mechanism, taste signals are transduced by nerves into electric signals in the brain. The e-tongue's sensors work similarly: they generate electric signals as voltammetric and potentiometric variations.

Taste quality perception and recognition are based on the building or recognition of activated sensory nerve patterns by the brain and the taste fingerprint of the product. This step is achieved by the e-tongue's statistical software, which interprets the sensor data into taste patterns.
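
A minimal sketch of this statistical step, assuming a principal component projection computed via SVD (one common multivariate choice; actual e-tongue software may use other methods): samples with similar sensor fingerprints land near each other in the projected taste map.

```python
import numpy as np

def taste_map(readings, n_components=2):
    """Project multi-sensor e-tongue readings (samples x sensors)
    onto their leading principal components, so that samples with
    similar taste fingerprints cluster together."""
    centered = readings - readings.mean(axis=0)          # remove mean response
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T                # PC scores
```

Given readings from two taste groups, the first principal component typically separates the groups, which is then easy to visualize or cluster.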

For example, robot cooks may be able to taste food for dynamic cooking. [41]

Motion perception

Robots at RoboCup 2019.

Usage

Automated robots require a guidance system to determine the ideal path for performing their task. At the molecular scale, however, nano-robots lack such a guidance system, because individual molecules cannot store complex motions and programs. The only way to achieve motion in such an environment is therefore to replace sensors with chemical reactions. Currently, a molecular spider with one streptavidin molecule as an inert body and three catalytic legs is able to start, follow, turn and stop when it comes across different DNA origami. [8] DNA-based nano-robots can move over 100 nm at a speed of 3 nm/min. [8]

In a TSI operation, an effective way to identify tumors and potentially cancer by measuring the pressure distribution over the sensor's contact surface, excessive force may inflict damage and risks destroying the tissue. Applying robotic control to determine the ideal operating path can reduce maximum forces by 35% and increase accuracy by 50% [9] compared to human doctors.

Performance

Efficient robotic exploration saves time and resources, with efficiency measured by optimality and competitiveness. Optimal boundary exploration is possible only when a robot has a square sensing area, starts at the boundary, and uses the Manhattan metric. [10] In complicated geometries and settings, a square sensing area is more efficient and achieves better competitiveness regardless of the metric and of the starting point. [10]

Non-human senses

Robots may not only be equipped with senses of higher sensitivity and capability than all or most [42] non-cyborg humans, such as "seeing" more of the electromagnetic spectrum (for example, ultraviolet) at higher fidelity and granularity, but may also have additional senses, such as sensing of magnetic fields (magnetoreception) [43] or of various hazardous air components. [25]

Collective sensing and sensemaking

Robots may share, [44] store, and transmit sensory data, as well as data derived from it. They may learn from or interpret the same or related data in different ways, and some robots may have remote senses, without local interpretation, processing or computation, as with common types of telerobotics or with embedded [45] or mobile "sensor nodes". Processing of sensory data may include processes such as facial recognition, [46] facial expression recognition, [47] gesture recognition and the integration of interpretative abstract knowledge.


References

  1. 1 2 Roh SG, Choi HR (Jan 2009). "3-D Tag-Based RFID System for Recognition of Object." IEEE Transactions on Automation Science and Engineering 6 (1): 55–65.
  2. 1 2 Arivazhagan S, Ganesan L, Kumar TGS (Jun 2009). "A modified statistical approach for image fusion using wavelet transform." Signal Image and Video Processing 3 (2): 137-144.
  3. 1 2 Jafar FA, et al (Mar 2011). "An Environmental Visual Features Based Navigation Method for Autonomous Mobile Robots." International Journal of Innovative Computing, Information and Control 7 (3): 1341-1355.
  4. 1 2 Anderson S, et al (Dec 2010). "Adaptive Cancelation of Self-Generated Sensory Signals in a Whisking Robot." IEEE Transactions on Robotics 26 (6): 1065-1076.
  5. 1 2 3 4 Kim YM, et al (Aug 2010)."A Robust Online Touch Pattern Recognition for Dynamic Human-robot Interaction." IEEE Transactions on Consumer Electronics 56 (3): 1979-1987.
  6. 1 2 Mazzini F, et al (Feb 2011). "Tactile Robotic Mapping of Unknown Surfaces, with Application to Oil Wells." IEEE Transactions on Instrumentation and Measurement 60 (2): 420-429.
  7. 1 2 3 Matsumoto M, Hashimoto S (2010). "Internal Noise Reduction Using Piezoelectric Device under Blind Condition."
  8. 1 2 3 Lund K, et al (May 2010). "Molecular robots guided by prescriptive landscapes." Nature 465 (7295): 206-210.
  9. 1 2 Trejos AL, et al (Sep 2009). "Robot-assisted Tactile Sensing for Minimally Invasive Tumor Localization." International Journal of Robotics Research 28 (9): 1118-1133.
  10. 1 2 3 Czyzowicz J, Labourel A, Pelc A (Jan 2011). "Optimality and Competitiveness of Exploring Polygons by Mobile Robots." Information and Computation 209 (1): 74-88.
  11. Dahiya, Ravinder S.; Valle, Maurizio (2013). Robotic Tactile Sensing: Technologies and System. Springer. doi:10.1007/978-94-007-0579-1. ISBN   9789400705784.
  12. 1 2 3 Benight, Stephanie J.; Wang, Chao; Tok, Jeffrey B.H.; Bao, Zhenan (2013). "Stretchable and self-healing polymers and devices for electronic skin". Progress in Polymer Science. 38 (12): 1961–1977. doi:10.1016/j.progpolymsci.2013.08.001.
  13. 1 2 dos Santos, Andreia; Fortunato, Elvira; Martins, Rodrigo; Águas, Hugo; Igreja, Rui (January 2020). "Transduction Mechanisms, Micro-Structuring Techniques, and Applications of Electronic Skin Pressure Sensors: A Review of Recent Advances". Sensors. 20 (16): 4407. Bibcode:2020Senso..20.4407D. doi: 10.3390/s20164407 . PMC   7472322 . PMID   32784603.
  14. Chou, Ho-Hsiu; Nguyen, Amanda; Chortos, Alex; To, John W. F.; Lu, Chien; Mei, Jianguo; Kurosawa, Tadanori; Bae, Won-Gyu; Tok, Jeffrey B.-H. (2015-08-24). "A chameleon-inspired stretchable electronic skin with interactive colour changing controlled by tactile sensing". Nature Communications. 6: 8011. Bibcode:2015NatCo...6.8011C. doi:10.1038/ncomms9011. PMC   4560774 . PMID   26300307.
  15. Hou, Chengyi; Huang, Tao; Wang, Hongzhi; Yu, Hao; Zhang, Qinghong; Li, Yaogang (2013-11-05). "A strong and stretchable self-healing film with self-activated pressure sensitivity for potential artificial skin applications". Scientific Reports. 3 (1): 3138. Bibcode:2013NatSR...3E3138H. doi:10.1038/srep03138. ISSN   2045-2322. PMC   3817431 . PMID   24190511.
  16. 1 2 Hammock, Mallory L.; Chortos, Alex; Tee, Benjamin C.-K.; Tok, Jeffrey B.-H.; Bao, Zhenan (2013-11-01). "25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress". Advanced Materials. 25 (42): 5997–6038. Bibcode:2013AdM....25.5997H. doi:10.1002/adma.201302240. ISSN   1521-4095. PMID   24151185. S2CID   205250986.
  17. Bauer, Siegfried; Bauer-Gogonea, Simona; Graz, Ingrid; Kaltenbrunner, Martin; Keplinger, Christoph; Schwödiauer, Reinhard (2014-01-01). "25th Anniversary Article: A Soft Future: From Robots and Sensor Skin to Energy Harvesters". Advanced Materials. 26 (1): 149–162. Bibcode:2014AdM....26..149B. doi:10.1002/adma.201303349. ISSN   1521-4095. PMC   4240516 . PMID   24307641.
  18. Tee, Benjamin C-K.; Wang, Chao; Allen, Ranulfo; Bao, Zhenan (December 2012). "An electrically and mechanically self-healing composite with pressure- and flexion-sensitive properties for electronic skin applications". Nature Nanotechnology. 7 (12): 825–832. Bibcode:2012NatNa...7..825T. doi:10.1038/nnano.2012.192. ISSN   1748-3395. PMID   23142944.
  19. Zou, Zhanan; Zhu, Chengpu; Li, Yan; Lei, Xingfeng; Zhang, Wei; Xiao, Jianliang (2018-02-01). "Rehealable, fully recyclable, and malleable electronic skin enabled by dynamic covalent thermoset nanocomposite". Science Advances. 4 (2): eaaq0508. Bibcode:2018SciA....4..508Z. doi:10.1126/sciadv.aaq0508. ISSN   2375-2548. PMC   5817920 . PMID   29487912.
  20. Temming, Maria (9 June 2022). "Scientists grew living human skin around a robotic finger". Science News. Retrieved 20 July 2022.
  21. Kawai, Michio; Nie, Minghao; Oda, Haruka; Morimoto, Yuya; Takeuchi, Shoji (6 July 2022). "Living skin on a robot". Matter. 5 (7): 2190–2208. doi: 10.1016/j.matt.2022.05.019 . ISSN   2590-2393.
  22. Barker, Ross (June 1, 2022). "Artificial skin capable of feeling pain could lead to new generation of touch-sensitive robots". University of Glasgow . Retrieved 20 July 2022.
  23. Liu, Fengyuan; Deswal, Sweety; Christou, Adamos; Shojaei Baghini, Mahdieh; Chirila, Radu; Shakthivel, Dhayalan; Chakraborty, Moupali; Dahiya, Ravinder (June 2022). "Printed synaptic transistor–based electronic skin for robots to feel and learn" (PDF). Science Robotics. 7 (67): eabl7286. doi:10.1126/scirobotics.abl7286. ISSN   2470-9476. PMID   35648845. S2CID   249275626.
  24. Velasco, Emily (June 2, 2022). "Artificial skin gives robots sense of touch and beyond". California Institute of Technology . Retrieved 20 July 2022.
  25. 1 2 Yu, You; Li, Jiahong; Solomon, Samuel A.; Min, Jihong; Tu, Jiaobing; Guo, Wei; Xu, Changhao; Song, Yu; Gao, Wei (June 1, 2022). "All-printed soft human-machine interface for robotic physicochemical sensing". Science Robotics. 7 (67): eabn0495. doi:10.1126/scirobotics.abn0495. ISSN   2470-9476. PMC   9302713 . PMID   35648844.
  26. Yirka, Bob (June 9, 2022). "Biomimetic elastomeric robot skin has tactile sensing abilities". Tech Xplore. Retrieved 23 July 2022.
  27. Park, K.; Yuk, H.; Yang, M.; Cho, J.; Lee, H.; Kim, J. (8 June 2022). "A biomimetic elastomeric robot skin using electrical impedance and acoustic tomography for tactile sensing". Science Robotics. 7 (67): eabm7187. doi:10.1126/scirobotics.abm7187. ISSN   2470-9476. PMID   35675452. S2CID   249520303.
  28. 1 2 3 S. Luo; J. Bimbo; R. Dahiya; H. Liu (December 2017). "Robotic tactile perception of object properties: A review". Mechatronics. 48: 54–67. arXiv: 1711.03810 . Bibcode:2017arXiv171103810L. doi:10.1016/j.mechatronics.2017.11.002. S2CID   24222234.
  29. http://www.robotcub.org/misc/papers/10_Dahiya_etal.pdf
  30. Archived at Ghostarchive and the Wayback Machine : "Cute Baby Seal Robot - PARO Theraputic Robot #DigInfo". YouTube .
  31. Kim HD, et al (2009). "Target Speech Detection and Separation for Communication with Humanoid Robots in Noisy Home Environments." Advanced Robotics 23 (15): 2093-2111.
  32. Batliner A, et al (Jan 2011). "Searching for the most important feature types signalling emotion-related user states in speech." Computer Speech and Language 25 (1): 4-28.
  33. "Special issue on machine olfaction". IEEE Sensors Journal. 11 (12): 3486. 2011. Bibcode:2011ISenJ..11.3486.. doi:10.1109/JSEN.2011.2167171.
  34. Geffen, Wouter H. van; Bruins, Marcel; Kerstjens, Huib A. M. (2016-01-01). "Diagnosing viral and bacterial respiratory infections in acute COPD exacerbations by an electronic nose: a pilot study". Journal of Breath Research. 10 (3): 036001. Bibcode:2016JBR....10c6001V. doi: 10.1088/1752-7155/10/3/036001 . ISSN   1752-7163. PMID   27310311.
  35. Stassen, I.; Bueken, B.; Reinsch, H.; Oudenhoven, J. F. M.; Wouters, D.; Hajek, J.; Van Speybroeck, V.; Stock, N.; Vereecken, P. M.; Van Schaijk, R.; De Vos, D.; Ameloot, R. (2016). "Towards metal–organic framework based field effect chemical sensors: UiO-66-NH2 for nerve agent detection". Chem. Sci. 7 (9): 5827–5832. doi:10.1039/C6SC00987E. hdl:1854/LU-8157872. PMC   6024240 . PMID   30034722.
  36. Gutierrez-Osuna, R. (2002). "Pattern analysis for machine olfaction: A review". IEEE Sensors Journal. 2 (3): 189–202. Bibcode:2002ISenJ...2..189G. doi:10.1109/jsen.2002.800688.
  37. Phaisangittisagul, Ekachai; Nagle, H. Troy (2011). "Predicting odor mixture's responses on machine olfaction sensors". Sensors and Actuators B: Chemical. 155 (2): 473–482. doi:10.1016/j.snb.2010.12.049.
  38. Vembu, Shankar; Vergara, Alexander; Muezzinoglu, Mehmet K.; Huerta, Ramón (2012). "On time series features and kernels for machine olfaction". Sensors and Actuators B: Chemical. 174: 535–546. doi:10.1016/j.snb.2012.06.070.
  39. Vlasov, Yu; Legin, A.; Rudnitskaya, A.; Natale, C. Di; D'Amico, A. (2005-01-01). "Nonspecific sensor arrays ("electronic tongue") for chemical analysis of liquids (IUPAC Technical Report)". Pure and Applied Chemistry. 77 (11): 1965–1983. doi: 10.1351/pac200577111965 . ISSN   0033-4545. S2CID   109659409.
  40. Khalilian, Alireza; Khan, Md. Rajibur Rahaman; Kang, Shin-Won (2017). "Highly sensitive and wide-dynamic-range side-polished fiber-optic taste sensor". Sensors and Actuators B: Chemical. 249: 700–707. doi:10.1016/j.snb.2017.04.088.
  41. Sochacki, Grzegorz; Abdulali, Arsen; Iida, Fumiya (2022). "Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking". Frontiers in Robotics and AI. 9: 886074. doi: 10.3389/frobt.2022.886074 . ISSN   2296-9144. PMC   9114309 . PMID   35603082.
  42. "Super seers: why some people can see ultraviolet light". New Scientist. 4 December 2019. Retrieved 4 August 2022.
  43. Cañón Bermúdez, Gilbert Santiago; Fuchs, Hagen; Bischoff, Lothar; Fassbender, Jürgen; Makarov, Denys (November 2018). "Electronic-skin compasses for geomagnetic field-driven artificial magnetoreception and interactive electronics". Nature Electronics. 1 (11): 589–595. doi:10.1038/s41928-018-0161-6. ISSN   2520-1131. S2CID   125371382.
  44. Varadharajan, Vivek Shankar; St-Onge, David; Adams, Bram; Beltrame, Giovanni (1 March 2020). "SOUL: data sharing for robot swarms" (PDF). Autonomous Robots. 44 (3): 377–394. doi:10.1007/s10514-019-09855-2. ISSN   1573-7527. S2CID   182651100.
  45. Scholl, Philipp M.; Brachmann, Martina; Santini, Silvia; Van Laerhoven, Kristof (2014). "Integrating Wireless Sensor Nodes in the Robot Operating System". Cooperative Robots and Sensor Networks 2014. Studies in Computational Intelligence. Vol. 554. Springer. pp. 141–157. doi:10.1007/978-3-642-55029-4_7. ISBN   978-3-642-55028-7.
  46. Vincent, James (14 November 2019). "Security robots are mobile surveillance devices, not human replacements". The Verge. Retrieved 4 August 2022.
  47. Melinte, Daniel Octavian; Vladareanu, Luige (23 April 2020). "Facial Expressions Recognition for Human–Robot Interaction Using Deep Convolutional Neural Networks with Rectified Adam Optimizer". Sensors. 20 (8): 2393. Bibcode:2020Senso..20.2393M. doi: 10.3390/s20082393 . PMC   7219340 . PMID   32340140.