Stereoscopic Displays and Applications

Abbreviation: SD&A
Discipline: Stereoscopic imaging
Publication details
Publisher: IS&T
History: 1990–present
Frequency: Annual
Website: www.stereoscopic.org

Stereoscopic Displays and Applications (SD&A) is an academic technical conference in the field of stereoscopic 3D imaging. [1] The conference was founded in 1990 and is held annually as part of the Electronic Imaging: Science and Technology Symposium [2] organized by the Society for Imaging Science and Technology (IS&T).

Scope

SD&A is dedicated to all forms of stereoscopic imaging, including stereoscopic 3D display hardware, stereoscopic 3D image capture, stereoscopic 3D image storage and processing, and applications of these technologies. In addition to two-view 3D displays, the conference gives significant coverage to multi-view autostereoscopic displays and volumetric 3D displays: in short, any system that stimulates stereoscopic vision in an observer.

Overview

The backbone of the annual SD&A conference is the technical presentations, which are all accompanied by a technical paper published in the conference proceedings. Alongside the technical sessions, the conference has its own keynote presentation, a popular demonstration session (where a range of stereoscopic 3D technologies can be seen in one place at one time), the 3D theater (where the latest stereoscopic content is shown), and a discussion forum.

History

SD&A was founded in 1990 by John O. Merritt and Scott Fisher, and has been held annually since. Holography pioneer Stephen Benton was also an SD&A conference chair from 2000 until his death in 2003.

From 1990 to 1993, papers from the SD&A conference were published in a dedicated proceedings volume. From 1994, SD&A papers were co-published with papers from The Engineering Reality of Virtual Reality conference (which was co-located with SD&A) in a volume series titled Stereoscopic Displays and Virtual Reality Systems. From 2008, SD&A returned to publishing its papers in a dedicated proceedings volume.

Over the period 1990 to 2015, SD&A and the Electronic Imaging Symposium were jointly organized by IS&T and SPIE. From 2016 onwards, SD&A and EI are organized by IS&T.

A detailed listing of the conference program for every year since 1996 is available on the official SD&A website, as is the Introduction/Preface of each year's conference proceedings, which contains a descriptive summary of that year's conference.

Proceedings

A technical proceedings volume is published annually containing the manuscripts presented at the conference. More than 1,100 technical papers have been published over the history of SD&A. The full list of conference proceedings, including a compilation DVD-ROM (1990–2009), is available on the official SD&A website.

The DVD-ROM compilation "Stereoscopic Displays and Applications 1990-2009: A Complete 20-Year Retrospective - and The Engineering Reality of Virtual Reality 1994-2009" (CDP51), [3] released in 2010, is a technical knowledge base for stereoscopic 3D and VR topics. It contains the complete technical record of the Stereoscopic Displays and Applications conference (1990–2009) and The Engineering Reality of Virtual Reality conference (1994–2009), together with papers from ten other 3D-related SPIE conferences (1977–1989) that predate SD&A. In total, the disc contains 1,260 individual technical papers: 816 from Stereoscopic Displays and Applications, 223 from The Engineering Reality of Virtual Reality, and 221 from the earlier SPIE 3D conferences.

Some of the presentations at the SD&A conference have subsequently been published in the Journal of Electronic Imaging. From 1990 to 2015, the SD&A proceedings were published as part of the Proceedings of SPIE series. Since 2016, the SD&A proceedings have been published by IS&T and are available open access.

Citation statistics

Papers presented at SD&A are cited extensively. The conference tracks its citations via a custom Google Scholar page. As of September 2019, the total citation count was approximately 22,000, almost double the January 2014 count of 12,371. [4] As of January 2014, the top ten most-cited papers across the first 25 years of SD&A were:

  1. "Depth-image-based rendering (DIBR), compression, and transmission for a new approach on 3D-TV", Christoph Fehn, 2004 (704 citations)
  2. "Image distortions in stereoscopic video systems", Andrew J. Woods, Tom Docherty, Rolf Koch, 1993 (417 citations)
  3. "Perceptual issues in augmented reality", David Drascic, Paul Milgram, 1996 (237 citations)
  4. "Controlling perceived depth in stereoscopic images", Graham R. Jones, Delman Lee, Nicolas S. Holliman, David Ezra, 2001 (156 citations)
  5. "Variation and extrema of human interpupillary distance", Neil A. Dodgson, 2004 (147 citations)
  6. "Image preparation for 3D LCD", Cees van Berkel, 1999 (122 citations)
  7. "Effect of disparity and motion on visual comfort of stereoscopic images", Filippo Speranza, Wa J. Tam, Ron Renaud, Namho Hur, 2006 (100 citations)
  8. "Geometry of binocular imaging", Victor S. Grinberg, Gregg W. Podnar, Mel Siegel, 1994 (99 citations)
  9. "Viewpoint-dependent stereoscopic display using interpolation of multiviewpoint images", Akihiro Katayama, Koichiro Tanaka, Takahiro Oshino, Hideyuki Tamura, 1995 (92 citations)
  10. "Rapid 2D-to-3D conversion", Philip V. Harman, Julien Flack, Simon Fox, Mark Dowley, 2002 (90 citations)

Reporting

Various media outlets have reported on research presented at SD&A and on the conference itself, including Scientific American, [5] Photonics Spectra, [6] Veritas et Visus, [7] CNET, [8] and Display Daily. [9]

See also the Citation statistics section above.

Related Research Articles

Virtual reality

Virtual reality (VR) is a simulated experience that employs 3D near-eye displays and pose tracking to give the user an immersive feel of a virtual world. Applications of virtual reality include entertainment, education and business. VR is one of the key technologies in the reality-virtuality continuum. As such, it is different from other digital visualization solutions, such as augmented virtuality and augmented reality.

Augmented reality

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (adding to the natural environment) or destructive (masking parts of it). As such, AR is one of the key technologies in the reality-virtuality continuum.

Stereoscopy

Stereoscopy is a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision. The word stereoscopy derives from Greek στερεός (stereos) 'firm, solid', and σκοπέω (skopeō) 'to look, to see'. Any stereoscopic image is called a stereogram. Originally, stereogram referred to a pair of stereo images which could be viewed using a stereoscope.

Photogrammetry

Photogrammetry is the science and technology of obtaining reliable information about physical objects and the environment through the process of recording, measuring and interpreting photographic images and patterns of electromagnetic radiant imagery and other phenomena.
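
As a minimal illustration of the kind of measurement photogrammetry performs, the sketch below recovers depth from the disparity between a rectified stereo image pair; the idealized pinhole model, the function name depth_from_disparity, and the numeric values are illustrative assumptions, and real photogrammetric workflows add calibration, lens-distortion correction and bundle adjustment.

```python
# Minimal sketch: depth from disparity for an idealized, rectified stereo pair.
# Assumes pinhole cameras with identical focal length f (in pixels), a known
# baseline B (metres), and horizontal disparity d (pixels).

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the distance Z (metres) to a point seen with the given disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Example: f = 1200 px, B = 0.1 m, d = 24 px  ->  Z = 5.0 m
print(depth_from_disparity(1200.0, 0.1, 24.0))
```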

3D display

A 3D display is a display device capable of conveying depth to the viewer. Many 3D displays are stereoscopic displays, which produce a basic 3D effect by means of stereopsis but can cause eye strain and visual fatigue. Newer 3D displays, such as holographic and light field displays, produce a more realistic 3D effect by combining stereopsis with accurate focal cues for the displayed content, and displays of this kind cause less visual fatigue than classical stereoscopic displays.
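
To make the stereopsis mechanism concrete, the sketch below uses the usual similar-triangle relation to show where an idealized two-view display places a point in depth as a function of on-screen parallax; the function name and the eye-separation and viewing-distance values are illustrative assumptions rather than part of any particular display specification.

```python
# Minimal sketch: perceived distance of a point shown on a stereoscopic display.
# For eye separation e, viewing distance D (both metres) and on-screen parallax P
# (metres, positive = uncrossed), the two eye rays intersect at Z = e * D / (e - P)
# from the viewer. Assumes an ideal two-view display and a centred viewer.

def perceived_distance(eye_sep: float, view_dist: float, parallax: float) -> float:
    if parallax >= eye_sep:
        return float("inf")  # parallax at or beyond eye separation: rays no longer converge
    return eye_sep * view_dist / (eye_sep - parallax)

print(perceived_distance(0.065, 2.0, 0.00))   # 2.00 m: zero parallax lies on the screen
print(perceived_distance(0.065, 2.0, 0.02))   # ~2.89 m: uncrossed parallax, behind the screen
print(perceived_distance(0.065, 2.0, -0.02))  # ~1.53 m: crossed parallax, in front of the screen
```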

A volumetric display device is a display device that forms a visual representation of an object in three physical dimensions, as opposed to the planar image of traditional screens that simulate depth through a number of different visual effects. One definition offered by pioneers in the field is that volumetric displays create 3D imagery via the emission, scattering, or relaying of illumination from well-defined regions in (x,y,z) space.

Scott Fisher (technologist)

Scott Fisher is the Professor and Founding Chair of the Interactive Media Division in the USC School of Cinematic Arts at the University of Southern California, and Director of the Mobile and Environmental Media Lab there. He is an artist and technologist who has worked extensively on virtual reality, including pioneering work at NASA, Atari Research Labs, MIT's Architecture Machine Group and Keio University.

SPIE is an international not-for-profit professional society for optics and photonics technology, founded in 1955. It organizes technical conferences, trade exhibitions, and continuing education programs for researchers and developers in the light-based fields of physics, including optics, photonics, and imaging engineering. The society publishes peer-reviewed scientific journals, conference proceedings, monographs, tutorial texts, field guides, and reference volumes in print and online. SPIE is especially well known for Photonics West, one of the laser and photonics industry's largest combined conferences and tradeshows, held annually in San Francisco. SPIE also partners in leading educational initiatives; in 2020, for example, it provided more than $5.8 million in support of optics education and outreach programs around the world.

Autostereoscopy

Autostereoscopy is any method of displaying stereoscopic images without requiring the viewer to wear special headgear or glasses. Because headgear is not required, it is also called "glasses-free 3D" or "glassesless 3D". There are two broad approaches currently used to accommodate motion parallax and wider viewing angles: eye tracking, and multiple views so that the display does not need to sense where the viewer's eyes are located. Examples of autostereoscopic display technology include lenticular lenses, parallax barriers, and integral imaging, but notably do not include volumetric or holographic displays.
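
The multiple-view approach mentioned above relies on spatially interleaving several views across the panel. The sketch below shows the simplest possible column-wise interleaving; the vertical (unslanted) optic, one view per full pixel column, and the helper name interleave_views are assumptions for illustration, since commercial panels typically interleave at sub-pixel level along a slanted lens axis.

```python
# Minimal sketch: interleaving N views column-by-column for a multi-view
# autostereoscopic panel (unslanted barrier/lenticular assumed).

def interleave_views(views, width, height):
    """views: list of N images, each given as height x width nested lists."""
    n = len(views)
    # Column x of the output takes its pixels from view (x mod N).
    return [[views[x % n][y][x] for x in range(width)] for y in range(height)]

# Example with 4 tiny constant-valued "views" of size 2 x 8:
views = [[[v] * 8 for _ in range(2)] for v in range(4)]
print(interleave_views(views, 8, 2)[0])   # [0, 1, 2, 3, 0, 1, 2, 3]
```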

Hogel

A hogel is a part of a light-field hologram, in particular a computer-generated one. It can be considered a small holographic optical element (HOE) whose overall effect is similar to that of a standard hologram, except that the resolution is lower and the structure is pixelated. An array of these elements forms the complete image of a holographic recording, which is typically displayed on a 3D free-viewing device.
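
As a rough illustration of the data structure implied above, a computer-generated light-field hologram can be modelled as a grid of hogels, each storing one radiance sample per emission direction; the grid and angular resolutions and the nested-list layout below are illustrative assumptions, not a description of any real hologram printer or display.

```python
# Minimal sketch: a light-field hologram as a 2D grid of hogels, each hogel
# holding a small 2D block of directional samples (one value per direction).

HOGELS_X, HOGELS_Y = 64, 64   # spatial resolution: number of hogels
DIRS_U, DIRS_V = 16, 16       # angular resolution within each hogel

def empty_hologram():
    return [[[[0.0 for _ in range(DIRS_V)]
                   for _ in range(DIRS_U)]
                   for _ in range(HOGELS_X)]
                   for _ in range(HOGELS_Y)]

hologram = empty_hologram()
hologram[10][20][3][5] = 1.0  # hogel at (row 10, column 20), direction (u=3, v=5)
print(len(hologram), len(hologram[0]), len(hologram[0][0]), len(hologram[0][0][0]))  # 64 64 16 16
```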

Infitec GmbH is a family-owned company based in Gerstetten that develops, produces and markets products for the projection of 3D content. The registered name INFITEC is an acronym of Interference Filter Technology, which was invented and patented by the founder of the company, Helmut Jorke.

Parallax barrier

A parallax barrier is a device placed in front of an image source, such as a liquid crystal display, to allow it to show a stereoscopic or multiscopic image without the need for the viewer to wear 3D glasses. Placed in front of the normal LCD, it consists of an opaque layer with a series of precisely spaced slits, allowing each eye to see a different set of pixels, so creating a sense of depth through parallax in an effect similar to what lenticular printing produces for printed products and lenticular lenses for other displays. A disadvantage of the method in its simplest form is that the viewer must be positioned in a well-defined spot to experience the 3D effect. However, recent versions of this technology have addressed this issue by using face-tracking to adjust the relative positions of the pixels and barrier slits according to the location of the user's eyes, allowing the user to experience the 3D from a wide range of positions. Another disadvantage is that the horizontal pixel count viewable by each eye is halved, reducing the overall horizontal resolution of the image.
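
The slit spacing the paragraph refers to follows from simple similar-triangle geometry. The sketch below computes the barrier-to-pixel gap and slit pitch for an idealized two-view design; the function name barrier_design, the default eye separation and viewing distance, and the neglect of refraction in the cover glass are all simplifying assumptions.

```python
# Minimal sketch: idealized two-view parallax-barrier geometry.
# For pixel pitch p, eye separation e, and design viewing distance Z (measured
# from the pixel plane), similar triangles give:
#   barrier-to-pixel gap   g = p * Z / (e + p)
#   barrier (slit) pitch   b = 2 * p * (Z - g) / Z   (slightly less than 2p)

def barrier_design(pixel_pitch_mm, eye_sep_mm=65.0, view_dist_mm=600.0):
    g = pixel_pitch_mm * view_dist_mm / (eye_sep_mm + pixel_pitch_mm)
    b = 2.0 * pixel_pitch_mm * (view_dist_mm - g) / view_dist_mm
    return g, b

gap, pitch = barrier_design(pixel_pitch_mm=0.1)   # 0.1 mm pixel columns
print(f"gap = {gap:.3f} mm, slit pitch = {pitch:.4f} mm")   # ~0.922 mm and ~0.1997 mm
```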

Virtual heritage, or cultural heritage and technology, is the body of work dealing with information and communication technologies and their application to cultural heritage, such as virtual archaeology. It aims to restore ancient cultures as realistic virtual environments in which users can immerse themselves.

Eric Mayorga Howlett was the inventor of LEEP (Large Expanse, Extra Perspective), the extreme wide-angle stereoscopic optics used in photographic and virtual reality systems.

Ian McDowall is the CEO of Fakespace Labs, a research and products company in Mountain View, California. He is one of the founders of Fakespace, started in 1991, and developed hardware and software for high-end scientific and government virtual reality applications. Working with Mark Bolas and Eric Lorimer, the company created tools including the Boom, Push, Fs2, Pinch Gloves, Immersive Workbenches, the Rave, and a software library called VLIB. In 1998, Fakespace split into two companies, Fakespace Systems and Fakespace Labs.

A projection augmented model is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic looking object. Importantly, the physical model is the same geometric shape as the object that the PA model depicts.

Lawrence J. Rosenblum

Lawrence Jay Rosenblum is an American mathematician, and Program Director for Graphics and Visualization at the National Science Foundation.

John O. Merritt was an expert in the applications of stereoscopic 3D displays and remote-presence systems.

Stereoscopic video game

A stereoscopic video game is a video game which uses stereoscopic technologies to create depth perception for the player by any form of stereo display. Such games should not be confused with video games that use 3D game graphics on a mono screen, which give the illusion of depth only by monocular cues but lack binocular depth information.

Vergence-accommodation conflict

Vergence-accommodation conflict (VAC), also known as accommodation-vergence conflict, is a visual phenomenon that occurs when the brain receives mismatching cues between vergence and accommodation of the eye. This commonly occurs in virtual reality devices, augmented reality devices, 3D movies, and other types of stereoscopic displays and autostereoscopic displays. The effect can be unpleasant and cause eye strain.
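
The mismatch can be made quantitative: the eyes accommodate at the screen distance while converging at the distance implied by the on-screen parallax, and the conflict is commonly expressed as the difference between the two in dioptres. The sketch below follows that convention; the function names and the numeric example are illustrative assumptions, not a clinical comfort criterion.

```python
# Minimal sketch: vergence-accommodation conflict in dioptres (1/metres).
# Accommodation is driven by the screen at distance D; vergence is driven by the
# distance implied by on-screen parallax P for eye separation e.

def vergence_distance(eye_sep_m, screen_dist_m, parallax_m):
    if parallax_m >= eye_sep_m:
        return float("inf")                   # rays diverge: vergence demand at infinity
    return eye_sep_m * screen_dist_m / (eye_sep_m - parallax_m)

def va_conflict_dioptres(eye_sep_m, screen_dist_m, parallax_m):
    accommodation = 1.0 / screen_dist_m       # eyes focus on the screen plane
    v = vergence_distance(eye_sep_m, screen_dist_m, parallax_m)
    vergence = 0.0 if v == float("inf") else 1.0 / v
    return abs(accommodation - vergence)

print(f"{va_conflict_dioptres(0.065, 2.0, 0.03):.2f} D")   # ~0.23 D for this example
```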

References

  1. Stereoscopic Displays and Applications: 12–14 February 1990, SD&A conference proceedings, 1990.
  2. Electronic Imaging Symposium.
  3. Selected SPIE/IS&T papers on DVD-ROM: Stereoscopic Displays and Applications 1990-2009: A Complete 20-Year Retrospective - and The Engineering Reality of Virtual Reality 1994-2009, SPIE Press, CDP51, ISBN 978-0-8194-7659-3, 2010.
  4. Introduction, Stereoscopic Displays and Applications XXV, Proceedings of Electronic Imaging, Volume 9011, SPIE Press, 2015.
  5. "Seeing Triple", Scientific American, Vol. 296, No. 6, p. 86, June 2007. doi:10.1038/scientificamerican0607-86
  6. "Hitting Every Angle with Autostereoscopic 3-D Displays", Photonics Spectra, May 2012.
  7. "Editor's Introduction", Third Dimension Newsletter, Issue 23, Veritas et Visus, February 2008.
  8. "Don't sit too close to the 3DTV", CNET, January 31, 2011.
  9. "25th Stereoscopic Displays Applications conference highlights", Display Daily, Insight Media, March 1, 2014.
  10. 3D Movie Making: Stereoscopic Digital Cinema from Script to Screen, B. Mendiburu, Taylor and Francis, 2009, ISBN 978-0-240-81137-6.
  11. Book Reviews, The Photogrammetric Record, 14(83), pages 828–845, April 1994. doi:10.1111/j.1477-9730.1994.tb00797.x