ViEWER

Stable release: 2.25
Written in: C++
Operating system: Windows NT family
Platform: x86
Available in: English
Website: webpages.uidaho.edu/~bdyre/viewer.htm

ViEWER, the Virtual Environment Workbench for Education and Research, is a proprietary, freeware computer program for Microsoft Windows written by researchers at the University of Idaho for the study of visual perception within complex, immersive three-dimensional environments.

It was created using C++ and OpenGL, and has been used by Dr. Brian Dyre, Dr. Steffen Werner, Dr. Ernesto Bustamante, Dr. Ben Barton, and their undergraduate and graduate researchers in experiments on visual perception, signal detection, and child safety.[1]

The software was described by Dyre, Grimes, and Lew in 2007,[2] and has also supported simulated-flight research on instrument landing displays.[3]
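
ViEWER's source code is not publicly documented in detail, so nothing below should be read as its actual implementation. Purely as an illustrative sketch of the kind of C++/OpenGL rendering loop a perception workbench of this sort relies on, the following small GLUT program animates a streaming random-dot field, a standard optic-flow stimulus for studying visually induced self-motion; every name and parameter in it is hypothetical, not drawn from ViEWER.

    // Illustrative sketch only: a minimal optic-flow stimulus in C++/OpenGL (GLUT).
    // Not ViEWER code; all constants and names are hypothetical.
    #include <GL/glut.h>
    #include <cstdlib>

    const int kNumDots = 500;
    float dots[kNumDots][3];  // x, y, z position of each dot

    // Place dot i at a random x, y location far in front of the camera.
    void resetDot(int i) {
        dots[i][0] = (rand() / (float)RAND_MAX) * 20.0f - 10.0f;  // x in [-10, 10]
        dots[i][1] = (rand() / (float)RAND_MAX) * 20.0f - 10.0f;  // y in [-10, 10]
        dots[i][2] = -50.0f;                                      // far plane
    }

    void display() {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glLoadIdentity();
        glBegin(GL_POINTS);
        for (int i = 0; i < kNumDots; ++i)
            glVertex3fv(dots[i]);
        glEnd();
        glutSwapBuffers();
    }

    // Advance every dot toward the observer; recycle dots that pass the camera.
    // The resulting radial expansion is perceived as forward self-motion.
    void step(int) {
        for (int i = 0; i < kNumDots; ++i) {
            dots[i][2] += 0.5f;                     // simulated forward speed
            if (dots[i][2] > -1.0f) resetDot(i);    // dot passed the eye: respawn
        }
        glutPostRedisplay();
        glutTimerFunc(16, step, 0);                 // ~60 Hz update
    }

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutInitWindowSize(800, 600);
        glutCreateWindow("Optic-flow sketch");
        glMatrixMode(GL_PROJECTION);
        gluPerspective(60.0, 800.0 / 600.0, 0.1, 100.0);
        glMatrixMode(GL_MODELVIEW);
        for (int i = 0; i < kNumDots; ++i) {
            resetDot(i);
            // Spread initial depths so the field starts filled.
            dots[i][2] = -((rand() / (float)RAND_MAX) * 49.0f + 1.0f);
        }
        glutTimerFunc(16, step, 0);
        glutMainLoop();
        return 0;
    }

Compiled against a GLUT implementation such as freeglut (e.g., g++ flow.cpp -lglut -lGLU -lGL), the program shows white dots rushing past a stationary camera, which observers perceive as forward self-motion. A real experiment workbench layers configurable scene geometry, response logging, and precise stimulus timing on top of a loop like this.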

References

  1. Viita, Derek (2006). Map Display Properties in On-Board Navigation Systems (unpublished master's thesis). University of Idaho.
  2. Dyre, B. P.; Grimes, J. P.; Lew, R. (2007). "ViEWER: A Virtual Environment Workbench for Education and Research".
  3. Bulkley, Nathan K.; Dyre, Brian P.; Lew, Roger; Caufield, Kristin (2009). "A Peripherally-Located Virtual Instrument Landing Display Affords More Precise Control of Approach Path during Simulated Landings than Traditional Instrument Landing Displays". Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 53: 31–35. doi:10.1177/154193120905300108.