Digital Life

Digital Life is a research and educational program that radically rethinks the human-computer interactive experience. It integrates the digital world (information and services) with the physical world (physical objects and environments), and it makes interfaces more responsive and proactive: objects and environments monitor the user and proactively present information and services relevant to the user's current needs and interests.

The program uses information technology to augment the physical environments and objects around people so that they can draw the user's attention. When a user is walking around town, for example, the system points out buildings and places of particular interest to that user. The program also augments reality to provide a composite view for participants: a real scene mixed with a virtual scene that enriches the environment with interactive information.
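The proactive behavior described above can be illustrated with a short sketch. The following Python snippet is a hypothetical, minimal example (the PointOfInterest class, the 150 m radius, and the sample data are illustrative assumptions, not part of the original program): given the user's position and declared interests, it surfaces nearby points of interest without an explicit query.

```python
# Minimal, hypothetical sketch of proactive, location-aware suggestions.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class PointOfInterest:
    name: str
    lat: float
    lon: float
    tags: set

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def proactive_suggestions(user_lat, user_lon, interests, pois, radius_m=150):
    """Return POIs near the user that match at least one declared interest."""
    return [
        poi for poi in pois
        if poi.tags & interests
        and distance_m(user_lat, user_lon, poi.lat, poi.lon) <= radius_m
    ]

# Example: a user interested in architecture walking past a nearby landmark.
pois = [PointOfInterest("Old Town Hall", 52.5170, 13.4010, {"architecture", "history"})]
print(proactive_suggestions(52.5171, 13.4008, {"architecture"}, pois))
```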

The program was originally initiated by the MIT Media Lab, which describes Digital Life as a multi-sponsor, Lab-wide research consortium that conducts basic research on technologies and techniques that spur expression as well as social and economic activity. Its work has three strands: the first explores the design and scalability of agile, grassroots communications systems that incorporate a growing understanding of emergent social behaviors in a digital world; the second considers a cognitive architecture that can support many features of “human intelligent thinking” and its expressive and economic use; and the third extends the idea of inclusive design to immersive, affective, and biological interfaces and actions.

Related Research Articles

Ubiquitous computing is a concept in software engineering and computer science where computing is made to appear anytime and everywhere. In contrast to desktop computing, ubiquitous computing can occur using any device, in any location, and in any format. A user interacts with the computer, which can exist in many different forms, including laptop computers, tablets, and terminals in everyday objects such as a refrigerator or a pair of glasses. The underlying technologies to support ubiquitous computing include the Internet, advanced middleware, operating systems, mobile code, sensors, microprocessors, new I/O and user interfaces, computer networks, mobile protocols, location and positioning, and new materials.

User interface: means by which a user interacts with and controls a machine

The user interface (UI), in the industrial design field of human-computer interaction, is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, whilst the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to or involve such disciplines as ergonomics and psychology.

Augmented reality: view of the real world with computer-generated supplementary features

Augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. An augogram is a computer-generated image that is used to create AR. Augography is the science and practice of making augograms for AR. AR can be defined as a system that fulfills three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive or destructive. This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one. Augmented reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.
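The third requirement above, accurate 3D registration, can be made concrete with a small sketch. The following Python snippet is an illustrative pinhole-camera projection (the intrinsic matrix, camera pose, and anchor point are assumed example values, not drawn from any particular AR system): a virtual object stays registered to the real world when its 3D world coordinates are re-projected into pixel coordinates each frame using the current camera pose.

```python
# Illustrative sketch: pinhole projection of a virtual anchor into the camera image.
import numpy as np

def project_point(world_point, R, t, K):
    """Project a 3D point (world frame) into pixel coordinates.

    R, t -- rotation matrix and translation vector mapping world to camera frame
    K    -- 3x3 camera intrinsic matrix
    """
    p_cam = R @ world_point + t          # world -> camera coordinates
    u, v, w = K @ p_cam                  # camera -> homogeneous image coordinates
    return np.array([u / w, v / w])      # perspective divide -> pixel coordinates

# Assumed example values: camera at the origin looking down +Z, 800 px focal length.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
virtual_anchor = np.array([0.1, -0.05, 2.0])   # a point 2 m in front of the camera
print(project_point(virtual_anchor, R, t, K))  # pixel where the overlay would be drawn
```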

MIT Media Lab: interdisciplinary research laboratory at the Massachusetts Institute of Technology

The MIT Media Lab is a research laboratory at the Massachusetts Institute of Technology, growing out of MIT's Architecture Machine Group in the School of Architecture. Its research is not restricted to fixed academic disciplines, but draws from technology, media, science, art, and design. As of 2014, the Media Lab's research groups include neurobiology, biologically inspired fabrication, socially engaging robots, emotive computing, bionics, and hyperinstruments.

Interaction design, often abbreviated as IxD, is "the practice of designing interactive digital products, environments, systems, and services." Beyond the digital aspect, interaction design is also useful when creating physical (non-digital) products, exploring how a user might interact with them. Common topics of interaction design include design, human–computer interaction, and software development. While interaction design has an interest in form, its main area of focus rests on behavior. Rather than analyzing how things are, interaction design synthesizes and imagines things as they could be. This element of interaction design is what characterizes IxD as a design field as opposed to a science or engineering field.

Mixed reality: merging of real and virtual worlds to produce new environments

Mixed reality (MR) is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time. Mixed reality does not exclusively take place in either the physical or virtual world, but is a hybrid of reality and virtual reality, encompassing both augmented reality and augmented virtuality via immersive technology.

Scott Fisher (technologist): American artist and technologist

Scott Fisher is the Professor and Founding Chair of the Interactive Media Division in the USC School of Cinematic Arts at the University of Southern California, and Director of the Mobile and Environmental Media Lab there. He is an artist and technologist who has worked extensively on virtual reality, including pioneering work at NASA, Atari Research Labs, MIT's Architecture Machine Group and Keio University.

Tangible user interface: user interface in which a person interacts with digital information through the physical environment

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

Ambient intelligence: electronic environments that are sensitive and responsive to the presence of people

In computing, ambient intelligence (AmI) refers to electronic environments that are sensitive and responsive to the presence of people. Ambient intelligence was a projection on the future of consumer electronics, telecommunications and computing that was originally developed in the late 1990s by Eli Zelkha and his team at Palo Alto Ventures for the time frame 2010–2020. Ambient intelligence would allow devices to work in concert to support people in carrying out their everyday life activities, tasks and rituals in an intuitive way using information and intelligence that is hidden in the network connecting these devices. As these devices grew smaller, more connected and more integrated into our environment, the technological framework behind them would disappear into our surroundings until only the user interface remains perceivable by users.
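As a loose illustration of devices working in concert in response to presence, the following toy Python sketch reacts to a single presence-sensor event by adjusting several devices at once. The Room class, the time-of-day rule, and the setpoints are purely hypothetical assumptions for illustration, not part of the ambient-intelligence literature.

```python
# Toy sketch (illustrative only): a room whose devices react together to a
# shared presence signal instead of explicit commands.
from datetime import datetime

class Room:
    def __init__(self, name):
        self.name = name
        self.lights_on = False
        self.heating_setpoint_c = 16.0   # energy-saving default when empty

    def on_presence(self, detected, now=None):
        """React to a presence-sensor event by adjusting devices in concert."""
        now = now or datetime.now()
        after_dark = now.hour >= 18 or now.hour < 7   # crude stand-in for a light sensor
        self.lights_on = detected and after_dark
        self.heating_setpoint_c = 21.0 if detected else 16.0

room = Room("living room")
room.on_presence(True, now=datetime(2024, 1, 15, 19, 30))
print(room.lights_on, room.heating_setpoint_c)   # True 21.0
```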

A virtual artifact (VA) is an immaterial object that exists in the human mind or in a digital environment, for example the Internet, intranet, virtual reality, cyberspace, etc.

Kent Larson is Director of the City Science research group at the MIT Media Lab. Before joining MIT full-time in 2000, he practiced architecture for 15 years in New York City. His research focuses on developing urban interventions that enable more entrepreneurial, livable, high-performance urban districts. Projects include advanced simulation and augmented reality for urban design, transformable micro-housing for millennials, mobility-on-demand systems that create alternatives to private automobiles, and urban living lab deployments in Hamburg, Helsinki, Andorra, Taipei, Shanghai, Toronto, and Guadalajara. He and the researchers from his MIT lab have twice received the “10-Year Impact Award” from Ubicomp: a “test of time” award for work that, with the benefit of hindsight, has had the greatest impact. His book, Louis I. Kahn: Unbuilt Masterworks was selected as one of the Ten Best Books in Architecture by the New York Times Review of Books. Larson's TED talk, "Brilliant designs to fit more people in every city," summarized his vision for cities in the future.

A projection augmented model is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic looking object. Importantly, the physical model is the same geometric shape as the object that the PA model depicts.

Elizabeth D. "Beth" Mynatt is the executive director of the Institute for People and Technology, the former director of the GVU Center, and a professor in the School of Interactive Computing, all at the Georgia Institute of Technology.

SixthSense

SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab by Steve Mann in 1994, 1997, and 1998, and further developed by Pranav Mistry in 2009; both developed hardware and software for head-worn and neck-worn versions of it. It comprises a head-worn or neck-worn pendant that contains both a data projector and a camera. Head-worn versions built at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art and also included gesture recognition.

Human–computer interaction (HCI) studies the design and use of computer technology, focused on the interfaces between people (users) and computers. Researchers in the field of HCI observe the ways in which humans interact with computers and design technologies that let humans interact with computers in novel ways. As a field of research, human–computer interaction is situated at the intersection of computer science, behavioural sciences, design, media studies, and several other fields of study. The term was popularized by Stuart K. Card, Allen Newell, and Thomas P. Moran in their seminal 1983 book, The Psychology of Human–Computer Interaction, although the authors first used the term in 1980 and the first known use was in 1975. The term connotes that, unlike other tools with only limited uses, a computer has many uses and this takes place as an open-ended dialog between the user and the computer. The notion of dialog likens human–computer interaction to human-to-human interaction, an analogy which is crucial to theoretical considerations in the field.

Immersive technology

Immersive technology refers to technology that attempts to emulate the physical world through a digital or simulated world, creating a surrounding sensory feeling and thereby a sense of immersion. Immersive technology enables mixed reality, which combines virtual reality and augmented reality, or the physical and the digital. In some uses, the term "immersive computing" is effectively synonymous with mixed reality as a user interface.

A living lab, or living laboratory, is a research concept, which may be defined as a user-centered, open-innovation ecosystem, often operating in a territorial context, integrating concurrent research and innovation processes within a public-private-people partnership.

Human Media Lab

The Human Media Lab (HML) is a research laboratory in Human-Computer Interaction at Queen's University's School of Computing in Kingston, Ontario. Its goals are to advance user interface design by creating and empirically evaluating disruptive new user interface technologies, and educate graduate students in this process. The Human Media Lab was founded in 2000 by Prof. Roel Vertegaal and employs an average of 12 graduate students.

James Patten is an American interaction designer, inventor, and visual artist. Patten is a TED fellow and speaker whose studio-initiated research has led to the creation of new technology platforms, such as Thumbles, tiny computer-controlled robots; interactive, kinetic lighting features; and immersive environments that engage the body.

Chris Harrison (computer scientist)

Chris Harrison is a British-born, American computer scientist and entrepreneur, working in the fields of human-computer interaction, machine learning and sensor-driven interactive systems. He is a professor at Carnegie Mellon University and director of the Future Interfaces Group within the Human-Computer Interaction Institute. He has previously conducted research at AT&T Labs, Microsoft Research, IBM Research and Disney Research. He is also the CTO and co-founder of Qeexo, a machine learning and interaction technology startup.