Michele Rucci


Michele Rucci is an Italian-born neuroscientist and biomedical engineer who studies visual perception. He is a Professor of Brain and Cognitive Sciences and a member of the Center for Visual Science at the University of Rochester.


Biography

Rucci received his Laurea (MA) and Ph.D. degrees in biomedical engineering from the University of Florence and the Sant'Anna School of Advanced Studies in Pisa, respectively. He trained as a postdoctoral fellow at The Neurosciences Institute in San Diego and was subsequently a Professor of Psychological and Brain Sciences at Boston University.

He is primarily known for his work on active perception in humans and machines, particularly for his research on eye movements [1][2][3][4][5] and for developing robotic systems controlled by computational models of neural pathways in the brain. [6][7][8][9]

Related Research Articles

Saccade (eye movement)

A saccade is a quick, simultaneous movement of both eyes between two or more phases of fixation in the same direction. In contrast, in smooth-pursuit movements the eyes move smoothly rather than in jumps. The term is also applied to analogous rapid shifts, such as a change in the frequency of an emitted signal or a movement of a body part or device. Controlled cortically by the frontal eye fields (FEF), or subcortically by the superior colliculus, saccades serve as a mechanism for fixation, rapid eye movement, and the fast phase of optokinetic nystagmus. The word appears to have been coined in the 1880s by French ophthalmologist Émile Javal, who used a mirror on one side of a page to observe eye movement in silent reading and found that it involves a succession of discontinuous individual movements.

Saccadic masking, also known as (visual) saccadic suppression, is the phenomenon in visual perception where the brain selectively blocks visual processing during eye movements in such a way that neither the motion of the eye nor the gap in visual perception is noticeable to the viewer.

Afterimage

An afterimage is an image that continues to appear in the eyes after a period of exposure to the original image. An afterimage may be a normal phenomenon or may be pathological (palinopsia). Illusory palinopsia may be a pathological exaggeration of physiological afterimages. Afterimages occur because photochemical activity in the retina continues even when the eyes are no longer experiencing the original stimulus.

In the study of vision, visual short-term memory (VSTM) is one of three broad memory systems, the others being iconic memory and long-term memory. VSTM is a type of short-term memory, but one limited to information within the visual domain.

Magnocellular cells, also called M-cells, are neurons located within the magnocellular layer of the lateral geniculate nucleus of the thalamus. The cells are part of the visual system. They are termed "magnocellular" since they are characterized by their relatively large size compared to parvocellular cells.

Multistable perception is a perceptual phenomenon in which an observer experiences an unpredictable sequence of spontaneous subjective changes. While usually associated with visual perception, multistable perception can also be experienced with auditory and olfactory percepts.

Sensory processing is the process that organizes and distinguishes sensation from one's own body and the environment, thus making it possible to use the body effectively within the environment. Specifically, it deals with how the brain processes inputs from multiple sensory modalities, such as proprioception, vision, audition, touch, smell, vestibular sense, interoception, and taste, into usable functional outputs.

Microsaccades are a kind of fixational eye movement. They are small, jerk-like, involuntary eye movements, similar to miniature versions of voluntary saccades. They typically occur during prolonged visual fixation, not only in humans, but also in animals with foveal vision. Microsaccade amplitudes vary from 2 to 120 arcminutes. The first empirical evidence for their existence was provided by Robert Darwin, the father of Charles Darwin.

Intraparietal sulcus

The intraparietal sulcus (IPS) is located on the lateral surface of the parietal lobe, and consists of an oblique and a horizontal portion. The IPS contains a series of functionally distinct subregions that have been intensively investigated using both single cell neurophysiology in primates and human functional neuroimaging. Its principal functions are related to perceptual-motor coordination and visual attention, which allows for visually-guided pointing, grasping, and object manipulation that can produce a desired effect.

Flash lag illusion

The flash lag illusion or flash-lag effect is a visual illusion wherein a flash and a moving object that appear in the same location are perceived to be displaced from one another. Several explanations for this simple illusion have been explored in the neuroscience literature.

Optokinetic response

The optokinetic reflex (OKR), also referred to as the optokinetic response, or optokinetic nystagmus (OKN), is a compensatory reflex that supports visual image stabilization. The purpose of OKR is to prevent image blur on the retina that would otherwise occur when an animal moves its head or navigates through its environment. This is achieved by the reflexive movement of the eyes in the same direction as image motion, so as to minimize the relative motion of the visual scene on the eye. OKR is best evoked by slow, rotational motion, and operates in coordination with several complementary reflexes that also support image stabilization, including the vestibulo-ocular reflex (VOR).

Fixation (visual)

Fixation or visual fixation is the maintaining of the gaze on a single location. An animal can exhibit visual fixation if it possesses a fovea in its eye. The fovea is typically located at the center of the retina and is the point of clearest vision. The species in which fixational eye movement has been verified thus far include humans, primates, cats, rabbits, turtles, salamanders, and owls. Regular eye movement alternates between saccades and visual fixations, the notable exception being smooth pursuit, which is controlled by a different neural substrate that appears to have developed for hunting prey. The term "fixation" can refer either to the point in time and space of focus or to the act of fixating. Fixation, as the act of fixating, is the period between any two saccades, during which the eyes are relatively stationary and virtually all visual input occurs. In the absence of retinal jitter, a laboratory condition known as retinal stabilization, percepts tend to fade away rapidly. To maintain visibility, the nervous system carries out fixational eye movements, which continuously stimulate neurons in the early visual areas of the brain that respond to transient stimuli. There are three categories of fixational eye movement: microsaccades, ocular drifts, and ocular microtremor. At small amplitudes the boundaries between categories become unclear, particularly between drift and tremor.

Posterior parietal cortex

The posterior parietal cortex plays an important role in planned movements, spatial reasoning, and attention.

Neurorobotics is the combined study of neuroscience, robotics, and artificial intelligence. It is the science and technology of embodied autonomous neural systems. Neural systems include brain-inspired algorithms, computational models of biological neural networks, and actual biological systems. Such neural systems can be embodied in machines with mechanical or any other form of physical actuation. This includes robots and prosthetic or wearable systems, but also, at a smaller scale, micro-machines and, at larger scales, furniture and infrastructure.

Visual perception is the ability to interpret the surrounding environment through photopic vision, color vision, scotopic vision, and mesopic vision, using light in the visible spectrum reflected by objects in the environment. This is different from visual acuity, which refers to how clearly a person sees. A person can have problems with visual perceptual processing even if they have 20/20 vision.

In vision science, stabilized images are images that remain immobile on the retina. Under natural viewing conditions, the eyes are always in motion; small eye movements continually occur even when attempting fixation. Experiments in the early 1950s established that stabilized images result in the fading and disappearance of the visual percept, possibly due to retinal adaptation to a stationary field. In 2007, studies indicated that stabilizing vision between saccades selectively impairs vision of fine spatial detail.

Richard Alan Andersen is an American neuroscientist. He is the James G. Boswell Professor of Neuroscience at the California Institute of Technology in Pasadena, California. His research focuses on visual physiology with an emphasis on translational research to humans in the field of neuroprosthetics, brain-computer interfaces, and cortical repair.

Visual crowding is the inability to view a target stimulus distinctly when it is presented in clutter. Crowding impairs the ability to discriminate object features and contours among flankers, which in turn impairs people's ability to respond appropriately to the target stimulus.

Martin A. Giese is a German theoretical neuroscientist and biomedical engineer. Since 2008, he has been a full professor at the University of Tübingen and head of the Section for Computational Sensomotorics at the Hertie Institute for Clinical Brain Research (HIH) and the Centre for Integrative Neuroscience (CIN).

Ehud Zohary is an Israeli scientist and professor of neurobiology at the Edmond and Lilly Safra Center for Brain Science and the Alexander Silberman Institute of Life Sciences at the Hebrew University of Jerusalem.

References

  1. "Eye flickers key for fine detail". BBC News. June 2007.
  2. Kowler E, Collewijn H (2010). "The eye on the needle". Nature Neuroscience. 13 (12): 1443–1444. doi:10.1038/nn1210-1443. PMID 21102565.
  3. Kagan I (2012). "Active vision: Fixational eye movements help seeing space in time". Current Biology. 22 (6): R186–R188. doi:10.1016/j.cub.2012.02.009. PMID 22440800.
  4. Kagan I, Hafed Z (2013). "Active vision: Microsaccades direct the eye to where it matters most". Current Biology. 23 (17): R712–R714. doi:10.1016/j.cub.2013.07.038. PMID 24028947.
  5. "Shifty eyes see finer details". Science News. 2007.
  6. "Neurotic robots act more human". Discovery News. June 2014.
  7. "Imagine machines that can see". Wired. June 2003.
  8. Wilan, Ken Howard (August 2005). "Technology to mimic mother nature". The Boston Globe.
  9. Service RF (October 2014). "Minds of their own". Science. 346 (6206): 182–183. doi:10.1126/science.346.6206.182. PMID 25301614.