Sensing floor

A sensing floor is a floor with embedded sensors. Depending on their construction, these floors are either monobloc (e.g. structures made of a single frame, such as carpets) [1] [2] [3] or modular (e.g. tiled floors, or floors made of strips of sensors). [4] [5] [6] The first sensing floor prototypes were developed in the 1990s, mainly for human gait analysis. Such floors are usually used as a source of sensing information for ambient intelligence. Depending on the type of sensors employed, sensing floors can measure load (pressure), proximity (to detect, track, and recognize humans), and magnetic field (to detect metallic objects, such as robots, using magnetometers).
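As an illustration of the load-sensing case, the sketch below computes the centre of pressure from a grid of tile load readings in order to localize a standing person. This is a hypothetical example, not taken from any of the cited prototypes; the function name, tile size, and detection threshold are all assumptions made for the sake of the sketch.

```python
def center_of_pressure(load_grid, tile_size=0.5, threshold=5.0):
    """Estimate the (x, y) position, in metres, of a load (e.g. a
    standing person) on a modular load-sensing floor, modelled as a
    grid of square tiles that each report a weight reading in kg.

    Returns None when the total load is below the detection
    threshold, i.e. the floor is considered empty.
    """
    total = sum(sum(row) for row in load_grid)
    if total < threshold:
        return None
    x = y = 0.0
    for r, row in enumerate(load_grid):
        for c, load in enumerate(row):
            # Weight each tile centre by the load it measures.
            x += (c + 0.5) * tile_size * load
            y += (r + 0.5) * tile_size * load
    return (x / total, y / total)
```

With a single 70 kg reading on the tile in row 1, column 2 of a 4x4 grid of 0.5 m tiles, the estimate is simply that tile's centre, (1.25, 0.75); with several loaded tiles the estimate falls between them, which is how a coarse tile grid can track a person more finely than one tile width.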

Sensing floors have a variety of uses: they can provide roadmaps for autonomous robotic navigation, [7] support interactive gaming, [8] and drive interactive media applications. [9]

More than 30 distinct sensing floor prototypes were developed between 1990 and 2015. [10] Notable examples have been developed by ORL, [5] MIT, [6] and Inria. [4] As of 2015, few sensing floors were available as commercial products, mainly targeting healthcare facilities (e.g. the GAITRite surface pressure sensing floor [11] and the SensFloor [12]).

Citations

  1. Joseph Paradiso, Craig Abler, Kai-yuh Hsiao, Matthew Reynolds, The magic carpet: physical sensing for immersive environments, CHI '97 Extended Abstracts on Human Factors in Computing Systems, 1997.
  2. Yu-Lin Shen, Chow-Shing Shin, Distributed Sensing Floor for an Intelligent Environment, IEEE Sensors Journal, December 2009, pp. 1673–1678.
  3. Albrecht Schmidt, Martin Strohbach, Kristof van Laerhoven, Adrian Friday, Hans-Werner Gellersen, Context Acquisition Based on Load Sensing, Proceedings of the 4th International Conference on Ubiquitous Computing, UbiComp 2002.
  4. Mihai Andries, Olivier Simonin, François Charpillet, Localization of Humans, Objects, and Robots Interacting on Load-Sensing Floors, IEEE Sensors Journal, Volume 16, Issue 4, February 2016.
  5. M.D. Addlesee, A. Jones, F. Livesey, F. Samaria, The ORL active floor [sensor system], IEEE Personal Communications, 1997.
  6. Bruce Richardson, Krispin Leydon, Mikael Fernström, Joseph A. Paradiso, Z-Tiles: building blocks for modular, pressure-sensing floorspaces, CHI, 2004.
  7. Mihai Andries, Object and human tracking, and robot control through a load sensing floor, PhD dissertation, Université de Lorraine, 2015. Chapter 7, "Providing roadmaps for autonomous robotic navigation".
  8. The Active Gaming Company, Lightspace floor.
  9. Prashant Srinivasan, David Birchfield, Gang Qian, Assegid Kidané, A pressure sensing floor for interactive media applications, Proceedings of the 2005 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, pp. 278–281, ACM, 2005.
  10. Mihai Andries, Object and human tracking, and robot control through a load sensing floor, PhD dissertation, Université de Lorraine, 2015. Chapter 2, "Sensing floors: existing prototypes and related work", contains a comprehensive list of sensing floor prototypes.
  11. GAITRite sensing carpet.
  12. Christl Lauterbach, Axel Steinhage, Axel Techmer, Large-area wireless sensor system based on smart textiles, International Multi-Conference on Systems, Signals and Devices, 2012.

Related Research Articles

Haptic technology Any form of interaction involving touch

Haptic technology, also known as kinaesthetic communication or 3D touch, refers to any technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek: ἁπτικός (haptikos), means "tactile, pertaining to the sense of touch". Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.

Simultaneous localization and mapping Computational problem of constructing a map while tracking an agent's location within it

Simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it. While this initially appears to be a chicken-and-egg problem, there are several algorithms known for solving it, at least approximately, in tractable time for certain environments. Popular approximate solution methods include the particle filter, extended Kalman filter, covariance intersection, and GraphSLAM. SLAM algorithms are based on concepts in computational geometry and computer vision, and are used in robot navigation, robotic mapping, and odometry for virtual reality or augmented reality.
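As a rough illustration of the particle-filter approach mentioned above, the sketch below runs one predict/update/resample cycle for a toy 1-D localization problem: a robot on a line measures its distance to a wall at a known position. All names and noise parameters here are invented for the example, and full SLAM additionally estimates the map itself rather than assuming it is known.

```python
import math
import random

def particle_filter_step(particles, control, measurement,
                         wall_pos=10.0, motion_noise=0.1,
                         sensor_noise=0.5):
    """One predict/update/resample cycle for 1-D Monte Carlo
    localization: the robot moves by `control` and then measures
    its distance to a wall at the known position `wall_pos`."""
    # Predict: propagate each particle through the motion model.
    moved = [p + control + random.gauss(0.0, motion_noise)
             for p in particles]
    # Update: weight each particle by how well it explains
    # the measurement (Gaussian sensor model).
    weights = []
    for p in moved:
        err = measurement - (wall_pos - p)
        weights.append(math.exp(-err * err / (2 * sensor_noise ** 2)))
    total = sum(weights) or 1e-300  # guard against all-zero weights
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportionally to the weights.
    return random.choices(moved, weights=weights, k=len(moved))
```

Starting from particles spread uniformly over the line, a few cycles are enough for the particle cloud to collapse around the robot's true position; the mean of the particle set then serves as the position estimate.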

Gesture recognition Topic in computer science and language technology

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. It is a subdiscipline of computer vision. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs, which still limit the majority of input to keyboard and mouse; gesture recognition instead lets users interact naturally without any mechanical devices.

Human-centered computing (HCC) studies the design, development, and deployment of mixed-initiative human-computer systems. It emerged from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts. Human-centered computing is closely related to human-computer interaction and information science. Human-centered computing is usually concerned with systems and practices of technology use, while human-computer interaction focuses more on ergonomics and the usability of computing artifacts, and information science focuses on practices surrounding the collection, manipulation, and use of information.

Mobile robot Type of robot

A mobile robot is a robot that is capable of moving in its surroundings (locomotion). Mobile robotics is usually considered to be a subfield of robotics and information engineering.

iCub Open source robotics humanoid robot testbed

iCub is a 1-metre-tall open-source humanoid robot testbed for research into human cognition and artificial intelligence.

Dava Newman

Dava J. Newman is the director of the MIT Media Lab and a former deputy administrator of NASA. Newman earned her PhD in aerospace biomedical engineering and Master of Science degrees in aerospace engineering and in technology and policy, all from MIT, and her Bachelor of Science degree in aerospace engineering from the University of Notre Dame. Newman is the Apollo Program Professor of Aeronautics and Astronautics and Engineering Systems at the Massachusetts Institute of Technology and a member of the faculty at the Harvard–MIT Program in Health Sciences and Technology. She is also a MacVicar Faculty Fellow, former director of the Technology and Policy Program at MIT (2003–2015), and has been the director of the MIT Portugal Program since 2011. As the director of MIT's Technology and Policy Program (TPP), she led the institute's largest multidisciplinary graduate research program, with over 1,200 alumni. She has been a faculty member in her home department of Aeronautics and Astronautics and MIT's School of Engineering since 1993.

Modular self-reconfiguring robotic systems or self-reconfigurable modular robots are autonomous kinematic machines with variable morphology. Beyond conventional actuation, sensing and control typically found in fixed-morphology robots, self-reconfiguring robots are also able to deliberately change their own shape by rearranging the connectivity of their parts, in order to adapt to new circumstances, perform new tasks, or recover from damage.

Coco is the latest platform at the Massachusetts Institute of Technology's Humanoid Robotics Group, and a successor to Cog. Unlike previous platforms, Coco is built along more ape-like lines, rather than human. Coco is also notable for being mobile. Although research on the robot is ongoing, the group has many robots dealing with human interaction. The Humanoid Robotics Group has planned to add more useful functions in the future, but has not set an exact date for such a project.

Electric Field Proximity Sensing (EFPS) is a sensory system that relies on the fact that an electric field can be perturbed by the existence of a nearby object, provided it is at least slightly conductive. One type of EFPS is the People Detector, a micro-electronic device that can detect the presence of both moving and stationary objects through solid materials. Its ability to operate through any non-conductive material permits complete invisibility. The sensor functions by detecting small changes in an ultra-low-power electromagnetic field generated between two remotely located antenna electrodes. Its range is adjustable from a few centimetres to 4 m. Electric field proximity detectors can detect partially conducting or conducting objects and do not depend on impedance to ground.

Robot navigation

Robot localization denotes the robot's ability to establish its own position and orientation within a frame of reference. Path planning is effectively an extension of localization, in that it requires determining both the robot's current position and the position of a goal location, expressed in the same frame of reference or coordinate system. Map building can take the shape of a metric map or any notation describing locations in the robot's frame of reference.
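To make the path-planning step concrete, here is a minimal sketch that finds a shortest path on a 2-D occupancy-grid map using breadth-first search. This is an illustrative example, not a production planner; the grid encoding (0 = free, 1 = obstacle) and the function name are assumptions made for the sketch.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid via
    breadth-first search. `grid` is a list of rows, where 0 marks a
    free cell and 1 an obstacle. Returns the path as a list of
    (row, col) cells, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk back along parent links to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    return None
```

Because BFS expands cells in order of distance from the start, the first time the goal is reached the recovered path is guaranteed to be among the shortest; weighted costs or smoother paths would call for Dijkstra's algorithm or A* instead.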

Robotics Design, construction, use, and application of robots

Robotics is an interdisciplinary branch of computer science and engineering. Robotics involves the design, construction, operation, and use of robots. The goal of robotics is to design machines that can help and assist humans. Robotics integrates fields of mechanical engineering, electrical engineering, information engineering, mechatronics, electronics, bioengineering, computer engineering, control engineering, software engineering, mathematics, etc.

Tactile sensor

A tactile sensor is a device that measures information arising from physical interaction with its environment. Tactile sensors are generally modeled after the biological sense of cutaneous touch, which is capable of detecting stimuli resulting from mechanical stimulation, temperature, and pain. Tactile sensors are used in robotics, computer hardware, and security systems. A common application of tactile sensors is in touchscreens on mobile phones and computing devices.

The Miburi is a wearable musical instrument which was released commercially by the Yamaha Corporation’s Tokyo-based experimental division in 1994.

Robotic sensing is a subarea of robotics science intended to give robots sensing capabilities. Robotic sensing mainly gives robots the ability to see, touch, hear and move and uses algorithms that require environmental feedback.

Neuromechanics of orthoses refers to how the human body interacts with orthoses. Millions of people in the U.S. suffer from stroke, multiple sclerosis, postpolio, spinal cord injuries, or various other ailments that benefit from the use of orthoses. Insofar as active orthoses and powered exoskeletons are concerned, the technology to build these devices is improving rapidly, but little research has been done on the human side of these human-machine interfaces.

Maria Gini is an Italian and American computer scientist working in artificial intelligence and robotics. She has a considerable record of service to the artificial intelligence community and of broadening participation in computing. She was Chair of the ACM Special Interest Group on Artificial Intelligence (SIGAI) from 2003 to 2010. She is currently a member of the CRA-W board.

Robotic prosthesis control is a method for controlling a prosthesis in such a way that the controlled robotic prosthesis restores a biologically accurate gait to a person with a loss of limb. This is a special branch of control that has an emphasis on the interaction between humans and robotics.

Soft robotics Subfield of robotics

Soft robotics is a subfield of robotics that concerns the design, control, and fabrication of robots composed of compliant materials, instead of rigid links. In contrast to rigid-bodied robots built from metals, ceramics and hard plastics, the compliance of soft robots can improve their safety when working in close contact with humans.

Vivian Chu is an American roboticist and entrepreneur, specializing in the field of human-robot interaction. She is Chief Technology Officer at Diligent Robotics, a company she co-founded in 2017 for creating autonomous, mobile, socially intelligent robots.