Shahram Izadi is a computer scientist known for his work on augmented reality, computer vision, artificial intelligence, and human-computer interaction. He is currently a senior director of engineering at Google, where he is reported to be working on an augmented reality headset[1] while also managing Google's ARCore software toolkit.
At Microsoft, Izadi directed an R&D team and worked on Microsoft Holoportation, a real-time holographic communication system.[2][3] He appears in a YouTube video in which he communicates with a hologram of his daughter using a Microsoft HoloLens.[4] In 2020, Microsoft launched Holoportation capabilities as part of Microsoft Mesh.[5]
Izadi also worked on Kinect, Microsoft HoloLens, Microsoft's Surface computers, Microsoft Touch Mouse,[6] and Kinect for Windows,[7] and worked within Microsoft Research.[8] In 2016, it was reported that Izadi had left Microsoft to form a stealth startup called PerceptiveIO with Jefferson Han.[9] PerceptiveIO was reported to have been acquired by Google in 2018.[10]
Izadi has made significant contributions to academic research, publishing highly cited papers on computer vision, specifically 3D reconstruction, depth estimation, real-time tracking, and new types of sensors.[11]