ARCore

Developer(s): Google
Initial release: 1 March 2018
Stable release: 1.22.20322056 / 15 January 2021 [1]
Operating system: Android
Platform: Android 7.0 and later
Website: developers.google.com/ar/

ARCore, also known as Google Play Services for AR, is a software development kit developed by Google for building augmented reality (AR) applications. ARCore has been integrated into a wide range of devices. [2]

Key Technologies

ARCore uses a few key technologies to integrate virtual content with the real world as seen through the camera of a smartphone or tablet. [3] Each of these technologies can be used by developers to create high-quality, immersive AR experiences.

Six Degrees of Freedom

Environmental Understanding

Light Estimation

Depth Analysis

Geospatial Capabilities
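To make the light-estimation idea above concrete, the sketch below approximates ambient light intensity as the mean luminance of a camera frame. This is an illustrative toy, not ARCore's actual API (ARCore exposes light estimation through its Android SDK); the function name and values are hypothetical.

```python
# Toy sketch of the light-estimation concept (illustrative only; not
# ARCore's real API, which runs on-device in the Android SDK).
def estimate_ambient_intensity(pixels):
    """Approximate ambient light as the mean luminance of an RGB frame.

    pixels: list of (r, g, b) tuples with components in [0, 255].
    Returns a scalar in [0.0, 1.0] that could scale a virtual light source
    so rendered objects match the brightness of the real scene.
    """
    if not pixels:
        return 0.0
    # Rec. 601 luma weights approximate perceived brightness.
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / (255.0 * len(pixels))

# Hypothetical 3-pixel "frame": one bright, one dark, one mid-gray sample.
frame = [(200, 200, 200), (50, 50, 50), (128, 128, 128)]
intensity = estimate_ambient_intensity(frame)
```

A renderer could multiply a virtual object's diffuse lighting by `intensity`, which is the essence of what light estimation contributes to an AR scene.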

Related Research Articles

Augmented reality: view of the real world with computer-generated supplementary features

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive, or destructive. This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one.

Photogrammetry: taking measurements using photography

Photogrammetry is the science and technology of obtaining reliable information about physical objects and the environment through the process of recording, measuring and interpreting photographic images and patterns of electromagnetic radiant imagery and other phenomena.

In visual effects, match moving is a technique that allows the insertion of 2D elements, other live-action elements, or computer graphics (CG) into live-action footage with correct position, scale, orientation, and motion relative to the photographed objects in the shot. It also allows for the removal of live-action elements from the live-action shot. The term is used loosely to describe several different methods of extracting camera motion information from a motion picture. Sometimes referred to as motion tracking or camera solving, match moving is related to rotoscoping and photogrammetry. Match moving is sometimes confused with motion capture, which records the motion of objects, often human actors, rather than the camera. Typically, motion capture requires special cameras and sensors and a controlled environment. Match moving is also distinct from motion control photography, which uses mechanical hardware to execute multiple identical camera moves. Match moving, by contrast, is typically a software-based technology, applied after the fact to normal footage recorded in uncontrolled environments with an ordinary camera.

Google Maps: Google's web mapping service (launched 2005)

Google Maps is a web mapping platform and consumer application offered by Google. It offers satellite imagery, aerial photography, street maps, 360° interactive panoramic views of streets, real-time traffic conditions, and route planning for traveling by foot, car, bike, air and public transportation. As of 2020, Google Maps was being used by over one billion people every month around the world.

Locative media or location-based media (LBM) is a virtual medium of communication functionally bound to a location. The physical implementation of locative media, however, is not bound to the same location to which the content refers.

Geosocial networking: social network with geographic features

Geosocial networking is a type of social networking in which geographic services and capabilities such as geocoding and geotagging are used to enable additional social dynamics. User-submitted location data or geolocation techniques can allow social networks to connect and coordinate users with local people or events that match their interests. Geolocation on web-based social network services can be IP-based or use hotspot trilateration. For mobile social networks, texted location information or mobile phone tracking can enable location-based services to enrich social networking.

3D reconstruction: process of capturing the shape and appearance of real objects

In computer vision and computer graphics, 3D reconstruction is the process of capturing the shape and appearance of real objects. This process can be accomplished either by active or passive methods. If the model is allowed to change its shape in time, this is referred to as non-rigid or spatio-temporal reconstruction.

Real-time geotagging refers to the automatic technique of acquiring media, associating a specific location with the media, transferring the media to an online map and publishing the media in real time. It is thus an extension of an automatic geotagging process, requiring an in-built or attached location acquisition device, but also requires communication with a wireless data transfer device. Most modern smartphones and several digital cameras already integrate camera, aGPS, and wireless data transfer into one device, thus directly producing a geotagged photograph. Real-time geotagging is sometimes referred to as "mobile geotagging" or "autogeotagging", but this does not imply the real-time publishing step.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

iClone is a real-time 3D animation and rendering software program. Real-time playback is enabled by using a 3D videogame engine for instant on-screen rendering.

Far-Play: software platform

Far-Play is a software platform developed at the University of Alberta, for creating location-based, scavenger-hunt style games which use the GPS and web-connectivity features of a player's smartphone. According to the development team, "our long-term objective is to develop a general framework that supports the implementation of AARGs that are fun to play and also educational". It utilizes Layar, an augmented reality smartphone application, QR codes located at particular real-world sites, or a phone's web browser, to facilitate games which require players to be in close physical proximity to predefined "nodes". A node, referred to by the developers as a Virtual Point of Interest (vPOI), is a point in space defined by a set of map coordinates; fAR-Play uses the GPS function of a player's smartphone (or, for indoor games, which are not easily tracked by GPS satellites, specially created QR codes) to confirm that they are adequately near a given node. Once a player is within a node's proximity, Layar's various augmented reality features can be utilized to display a range of extra content overlaid upon the physical play-space or launch another application for extra functionality.

Tango (platform): mobile computer vision platform for Android developed by Google

Tango was an augmented reality computing platform, developed and authored by the Advanced Technology and Projects (ATAP), a skunkworks division of Google. It used computer vision to enable mobile devices, such as smartphones and tablets, to detect their position relative to the world around them without using GPS or other external signals. This allowed application developers to create user experiences that include indoor navigation, 3D mapping, physical space measurement, environmental recognition, augmented reality, and windows into a virtual world.

Pixel Camera: camera application developed by Google for Pixel devices

Pixel Camera, formerly Google Camera, is a camera phone application developed by Google for the Android operating system. Development for the application began in 2011 at the Google X research incubator led by Marc Levoy, which was developing image fusion technology for Google Glass. It was publicly released for Android 4.4+ on Google Play on April 16, 2014. It was initially supported on all devices running Android 4.4 KitKat and higher, but became only officially supported on Google Pixel devices in the following years. The app was renamed Pixel Camera in October 2023, with the launch of the Pixel 8 and Pixel 8 Pro.

Kubity: cloud-based 3D communication tool

Kubity is a cloud-based 3D communication tool that works on desktop computers, the web, smartphones, tablets, augmented reality gear, and virtual reality glasses. Kubity is powered by several proprietary 3D processing engines including "Paragone" and "Etna" that prepare the 3D file for transfer over mobile devices.

This is a glossary of terms relating to computer graphics.

Pose tracking

In virtual reality (VR) and augmented reality (AR), a pose tracking system detects the precise pose of head-mounted displays, controllers, other objects or body parts within Euclidean space. Pose tracking is often referred to as 6DOF tracking, for the six degrees of freedom in which the pose is often tracked.
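The six degrees of freedom referred to above are three of translation and three of rotation; a pose combines both. The minimal sketch below represents a pose as a rotation matrix plus a translation vector and composes two of them, which is how tracked motion accumulates frame to frame. The helper names are hypothetical; real tracking systems typically use quaternions and optimized linear algebra.

```python
import math

# Minimal sketch of a 6DOF pose (rotation + translation), the quantity a
# pose-tracking system estimates for a headset, controller, or phone.

def rot_z(theta):
    """3x3 rotation matrix about the z axis (yaw), angle in radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(R, v):
    """Rotate vector v by matrix R."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def compose(pose_a, pose_b):
    """Apply pose_b in pose_a's frame: result is (Ra * Rb, Ra * tb + ta)."""
    Ra, ta = pose_a
    Rb, tb = pose_b
    R = [[sum(Ra[i][k] * Rb[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    t = [x + y for x, y in zip(apply(Ra, tb), ta)]
    return R, t

# Yaw the device 90 degrees, then step 1 m forward along its new local x axis:
# the world-space position becomes approximately (0, 1, 0).
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
yaw = (rot_z(math.pi / 2), [0.0, 0.0, 0.0])
step = (identity, [1.0, 0.0, 0.0])
R, t = compose(yaw, step)
```

Because rotation and translation interact under composition, a tracker that drifts in any one of the six degrees of freedom accumulates error in the others, which is why 6DOF tracking is harder than tracking orientation alone.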

Lenovo Phab 2 Pro: Android smartphone

The Phab 2 Pro is an Android smartphone in a phablet form factor, developed and produced by Lenovo and first released in November 2016 at an MSRP of US$499. The device is notable for being the first consumer smartphone to support Google Tango augmented reality (AR) technology.

Volumetric capture or volumetric video is a technique that captures a three-dimensional space, such as a location or performance. This type of volumography acquires data that can be viewed on flat screens as well as using 3D displays and VR goggles. Consumer-facing formats are numerous and the required motion capture techniques lean on computer graphics, photogrammetry, and other computation-based methods. The viewer generally experiences the result in a real-time engine and has direct input in exploring the generated volume.

Android Pie: ninth major version of the Android mobile operating system

Android Pie, also known as Android 9, is the ninth major release and the 16th version of the Android mobile operating system. It was first released as a developer preview on March 7, 2018, and was released publicly on August 6, 2018.

References

  1. "Google Play Services for AR APKs". APKMirror.
  2. "ARCore supported devices". Google LLC. Retrieved 23 February 2020.
  3. Amadeo, Ron (29 August 2017). "Google's ARCore brings augmented reality to millions of Android devices". Ars Technica. Condé Nast. Retrieved 6 November 2017.
  4. "Fundamental Concepts". ARCore. Google LLC. Retrieved 22 February 2024.
  5. "Get the Lighting Right". ARCore. Google LLC. Retrieved 22 February 2024.
  6. "Fundamental Concepts". ARCore. Google LLC. Retrieved 22 February 2024.
  7. "Depth Adds Realism". ARCore. Google LLC. Retrieved 22 February 2024.
  8. "Build global-scale, immersive, location-based AR experiences with the ARCore Geospatial API". ARCore. Google LLC. Retrieved 22 February 2024.