Surface computer

A surface computer is a computer that interacts with the user through the surface of an ordinary object, rather than through a monitor, keyboard, mouse, or other physical hardware.

The term "surface computer" was first adopted by Microsoft for its PixelSense (codenamed Milan) interactive platform, publicly announced on 30 May 2007. The machine features a horizontally mounted 30-inch display in a coffee-table-like enclosure; users interact with its graphical user interface by touching or dragging their fingertips and other physical objects, such as paintbrushes, across the screen, or by setting real-world items tagged with special bar-code labels on top of it. Uploading digital files, for example, only requires placing an object such as a Bluetooth-enabled digital camera on the unit's display; the resulting pictures can then be moved across the screen, resized, or rotated.

PixelSense's internal hardware includes a 2.0 GHz Core 2 Duo processor, 2 GB of memory, an off-the-shelf graphics card, a scratch-proof and spill-proof surface, a DLP projector, and five infrared cameras that detect touch (unlike the iPhone, which uses a capacitive display). These expensive components resulted in a hardware price of between $12,500 and $15,000.

The first PixelSense units were used as information kiosks in the Harrah's family of casinos. Other customers included T-Mobile, which used the units to compare several cell phones side by side, and Sheraton Hotels and Resorts, which used them to serve lobby customers in a variety of ways.[1][2] The products were originally branded "Microsoft Surface" but were renamed "Microsoft PixelSense" on June 18, 2012, after the manufacturer adopted the "Surface" name for its new series of tablet PCs.

See also

Jeff Han

Related Research Articles

Computer mouse: Pointing device used to control a computer

A computer mouse is a hand-held pointing device that detects two-dimensional motion relative to a surface. This motion is typically translated into the motion of a pointer on a display, which allows smooth control of a computer's graphical user interface.
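
As a small illustration of that mapping, the sketch below accumulates relative mouse deltas into an on-screen pointer position, with a sensitivity factor and clamping to the display bounds; the function name, screen size, and tuple representation are assumptions made for this example only.

    # Sketch (assumed representation): turn relative mouse motion into an
    # absolute pointer position, scaled by a sensitivity factor and clamped
    # to the display bounds.
    def move_pointer(pointer, delta, screen=(1920, 1080), sensitivity=1):
        x = min(max(pointer[0] + delta[0] * sensitivity, 0), screen[0] - 1)
        y = min(max(pointer[1] + delta[1] * sensitivity, 0), screen[1] - 1)
        return (x, y)

    pos = (960, 540)
    for d in [(10, 0), (0, -5), (3000, 0)]:   # the last delta runs off the right edge
        pos = move_pointer(pos, d)
    print(pos)                                # (1919, 535): clamped to the display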

Pointing device: Human interface device for computers

A pointing device is a human interface device that allows a user to input spatial data to a computer. CAD systems and graphical user interfaces (GUIs) allow the user to control and provide data to the computer using physical gestures: moving a hand-held mouse or similar device across the surface of the physical desktop and activating switches on the mouse. Movements of the pointing device are echoed on the screen by movements of the pointer and other visual changes. Common gestures are point and click and drag and drop.

Hidden-surface determination: Visibility in 3D computer graphics

In 3D computer graphics, hidden-surface determination is the process of identifying which surfaces and parts of surfaces can be seen from a particular viewing angle. A hidden-surface determination algorithm is a solution to the visibility problem, which was one of the first major problems in the field of 3D computer graphics. The process of hidden-surface determination is sometimes called hiding, and such an algorithm is sometimes called a hider. When referring to line rendering it is known as hidden-line removal. Hidden-surface determination is necessary to render a scene correctly, so that features hidden behind the model itself are not drawn and only the naturally visible portions of the scene appear.
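
A minimal sketch of one common hidden-surface technique, the z-buffer (depth buffer), is shown below; the fragment representation (x, y, depth, color) and the function name are illustrative assumptions rather than part of any particular renderer.

    # Z-buffer sketch: for each pixel, keep only the fragment nearest the viewer.
    def render_with_z_buffer(fragments, width, height, far=float("inf")):
        depth = [[far] * width for _ in range(height)]    # nearest depth seen so far
        image = [[None] * width for _ in range(height)]   # color of that nearest fragment
        for x, y, z, color in fragments:
            if 0 <= x < width and 0 <= y < height and z < depth[y][x]:
                depth[y][x] = z     # this fragment is closer, so it hides what was there
                image[y][x] = color
        return image

    # Two fragments cover pixel (1, 1); only the closer one (z = 2.0) survives.
    frags = [(1, 1, 5.0, "red"), (1, 1, 2.0, "blue"), (0, 0, 1.0, "green")]
    print(render_with_z_buffer(frags, 3, 3)[1][1])        # -> blue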

Touchscreen: Input and output device

A touchscreen is a type of display that can detect touch input from a user. It consists of both an input device and an output device. The touch panel is typically layered on the top of the electronic visual display of a device. Touchscreens are commonly found in smartphones, tablets, laptops, and other electronic devices.

Tangible user interface

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

Multi-touch

In computing, multi-touch is technology that enables a surface to recognize the presence of more than one point of contact with the surface at the same time. Multi-touch technology originated at CERN, MIT, the University of Toronto, Carnegie Mellon University, and Bell Labs in the 1970s. CERN started using multi-touch screens as early as 1976 for the controls of the Super Proton Synchrotron. Capacitive multi-touch displays were popularized by Apple's iPhone in 2007. Multi-touch may be used to implement additional functionality, such as pinch-to-zoom, or to activate predefined subroutines through gesture recognition.
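
As a rough illustration of the pinch-to-zoom idea mentioned above, the sketch below derives a zoom factor from the changing distance between two touch points; the touch-point representation and function names are assumptions for this example, not part of any specific multi-touch API.

    import math

    # Pinch-to-zoom sketch: the zoom factor is the ratio of the current distance
    # between two touch points to their distance when the gesture began.
    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def pinch_scale(start_touches, current_touches):
        d0 = distance(*start_touches)
        d1 = distance(*current_touches)
        return d1 / d0 if d0 > 0 else 1.0

    # Fingers move apart from 100 px to 150 px, so content scales by 1.5x.
    print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (150, 0)]))   # -> 1.5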

Perceptive Pixel is a division of Microsoft specializing in research, development and production of multi-touch interfaces. Its technology is used in fields including broadcast, defense, geo-intelligence, energy exploration, industrial design and medical imaging.

Pen computing: Uses a stylus and tablet/touchscreen

Pen computing refers to any computer user interface that uses a pen or stylus and a tablet, rather than input devices such as a keyboard or mouse.

Microsoft PixelSense: Interactive surface computing platform by Microsoft

Microsoft PixelSense was an interactive surface computing platform that allowed one or more people to use and touch real-world objects and share digital content at the same time. The PixelSense platform consisted of software and hardware products that combined vision-based multi-touch PC hardware, 360-degree multiuser application design, and Windows software to create a natural user interface (NUI).

Surface computing is the use of a specialized computer GUI in which traditional GUI elements are replaced by intuitive, everyday objects. Instead of a keyboard and mouse, the user interacts with a surface. Typically the surface is a touch-sensitive screen, though other surface types like non-flat three-dimensional objects have been implemented as well. It has been said that this more closely replicates the familiar hands-on experience of everyday object manipulation.

In computing, 3D interaction is a form of human-machine interaction in which users are able to move and perform interaction in 3D space. Both the human and the machine process information in which the physical position of elements in 3D space is relevant.

In computing, a natural user interface (NUI) or natural interface is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants such as Alexa and Siri, touch and multi-touch interactions on today's mobile phones and tablets, and touch interfaces invisibly integrated into textiles and furniture.

Input device: Device that provides data and signals to a computer

In computing, an input device is a piece of equipment used to provide data and control signals to an information processing system, such as a computer or information appliance. Examples of input devices include keyboards, computer mice, scanners, cameras, joysticks, and microphones.

DiamondTouch: Multiple-person interface device

The DiamondTouch table is a multi-touch, interactive PC interface product from Circle Twelve Inc. It is a human interface device that has the capability of allowing multiple people to interact simultaneously while identifying which person is touching where. The technology was originally developed at Mitsubishi Electric Research Laboratories (MERL) in 2001 and later licensed to Circle Twelve Inc in 2008. The DiamondTouch table is used to facilitate face-to-face collaboration, brainstorming, and decision-making, and users include construction management company Parsons Brinckerhoff, the Methodist Hospital, and the US National Geospatial-Intelligence Agency (NGA).

Mobile technology: Technology used for cellular communication

Mobile technology is the technology used for cellular communication. Mobile technology has evolved rapidly over the past few years. Since the start of this millennium, a standard mobile device has gone from being no more than a simple two-way pager to being a mobile phone, GPS navigation device, an embedded web browser and instant messaging client, and a handheld gaming console. Many experts believe that the future of computer technology rests in mobile computing with wireless networking. Mobile computing by way of tablet computers is becoming more popular. Tablets are available on the 3G and 4G networks.

Microsoft Tablet PC: Microsoft's former line of tablets

Microsoft Tablet PC is a term coined by Microsoft for pen-enabled tablet computers that conform to a set of hardware specifications devised by Microsoft and announced in 2001, and that run a licensed copy of the Windows XP Tablet PC Edition operating system or a derivative thereof.

Screen–smart device interaction

Screen-Smart Device Interaction (SSI) is a fairly new technology developed as a sub-branch of digital signage.

Sprout (computer)

Sprout by HP was a personal computer from HP Inc. announced on October 29, 2014 and released for sale on November 9, 2014. The system was conceived by Brad Short, Distinguished Technologist at HP Inc., who along with Louis Kim, Head of the Immersive Computing Group at HP Inc., co-founded and led a team within HP Inc. to develop and productize the computing concept.

Surface Hub: Interactive whiteboard developed by Microsoft

The Surface Hub is a brand of interactive whiteboard developed and marketed by Microsoft as part of the Microsoft Surface family. The Surface Hub is a wall-mounted or roller-stand-mounted device with either a 55-inch (140 cm) 1080p or an 84-inch (210 cm) 4K 120 Hz touchscreen with multi-touch and multi-pen capabilities, running the Windows 10 operating system. The devices are targeted at businesses for collaboration and videoconferencing.

The Surface Book 3 is the third generation of Microsoft's Surface Book series, and a successor to the Surface Book 2. Like its previous generation, the Surface Book 3 is part of the Microsoft Surface lineup of personal computers. It is a 2-in-1 PC that can be used like a conventional laptop, or detached from its base for use as a separate tablet, with touch and stylus input support in both scenarios. It was announced by Microsoft online alongside the Surface Go 2 on May 6, 2020, and later released for purchase on May 12, 2020.

References

  1. Grossman, Lev (14 June 2007). "Feeling out the Newest Touch Screens". TIME. Archived from the original on June 18, 2007. Retrieved 2007-06-16.
  2. "Microsoft Launches New Product Category: Surface Computing Comes to Life in Restaurants, Hotels, Retail Locations and Casino Resorts". Microsoft. 29 May 2007. Retrieved 2007-06-16.