Scott Fisher (technologist)

Scott S. Fisher
[Image: Scott Fisher holding a plastic apple]
Spouse: Mizuko Ito
Children: 2
Scott Fisher is a professor and the founding chair of the Interactive Media Division in the School of Cinematic Arts at the University of Southern California (USC), and director of the school's Mobile and Environmental Media Lab. He is an artist and technologist who has worked extensively on virtual reality, including pioneering work at NASA, Atari Research Labs, MIT's Architecture Machine Group (now the MIT Media Lab), and Keio University.

Early life

Scott S. Fisher was born in 1951 in Bryn Mawr, Pennsylvania, near Philadelphia. He was educated at MIT, receiving a Master of Science degree in Media Technology in 1981 under thesis advisor Nicholas Negroponte. While there, he participated in the creation of the Aspen Movie Map.[citation needed]

Career

Much of Fisher's career has focused on expanding the technologies and creative potential of virtual reality. Between 1985 and 1990, he was founding director of the Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center, which sought to develop a simulator for rehearsing space station maintenance. Much of the hardware now associated with virtual reality, including the DataGlove, head-coupled displays, and 3D audio, was developed there.[citation needed]

In 1989, with Brenda Laurel, Fisher founded Telepresence Research, a company specializing in first-person media, virtual reality, and remote-presence research and development.[1][2][3] With Joseph M. Rosen and Phil S. Green, Fisher developed the Green Telepresence Surgery System, a robotic surgical system created at SRI.[4][5][6][7][8][9][10][11] Fisher and Rosen also developed a virtual environment system for viewing and manipulating a model of the human leg.[12][13][14]

Fisher was a project professor in the Graduate School of Media and Governance at Keio University, where he led a project that let users author and view location-based data superimposed on the physical world, a progenitor of what is now termed augmented reality.[citation needed]

In 2001, Fisher moved to the University of Southern California to spearhead its new Interactive Media Division within the School of Cinematic Arts, where he established the division's research initiatives in immersive media, mobile media, and video games. He chaired the division from its founding through 2011.

Personal life

Fisher lives in Southern California with his wife, Mizuko Ito, a cultural anthropologist studying media technology, and their two children.[citation needed]

He also has a longstanding interest in stereoscopy and 3D imaging, as well as Jamaican music.[citation needed]

References