| Company type | Private company |
| --- | --- |
| Industry | Software, Medical technology, Computer hardware |
| Founded | 2011 |
| Headquarters | San Francisco, United States |
| Area served | Worldwide |
| Key people | Jon Fisher [1] [2] [3] (CEO) |
| Products | Software, Mobile technology, Medical technology |
| Website | CrowdOptic.com |
CrowdOptic, Inc. is a privately held, San Francisco-based medical technology company founded in 2011. [4] [5] [6] [7] Led by CEO Jon Fisher, the company developed augmented reality technology and triangulation algorithms, used in medicine, sports, and government, that gather and analyze data from smart devices based on where the devices are pointed in order to identify areas of interest. [8] [9] As of 2016, the company described its technology as the only patented solution of its kind for wearables such as Google Glass and Sony SmartEyeGlass. [10]
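As a rough illustration of the pointing-based triangulation described above, the sketch below estimates a shared point of interest from two devices' GPS positions and compass bearings by intersecting their lines of sight. It is a minimal flat-earth approximation written for this article, not CrowdOptic's patented algorithm; the coordinates and function names are hypothetical.

```python
# Illustrative sketch only (not CrowdOptic's method): estimate a shared point
# of interest from two devices, given each device's GPS fix and compass
# bearing, using a flat-earth approximation valid over short distances.
import math

EARTH_RADIUS_M = 6_371_000.0

def to_local_xy(lat, lon, ref_lat, ref_lon):
    """Project lat/lon (degrees) to metres east/north of a reference point."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y

def intersect_bearings(p1, bearing1_deg, p2, bearing2_deg):
    """Return the (lat, lon) where the two sight lines cross, or None if the
    bearings are parallel or the crossing lies behind the first observer.
    Bearings are degrees clockwise from true north."""
    ref_lat, ref_lon = p1
    x1, y1 = 0.0, 0.0
    x2, y2 = to_local_xy(p2[0], p2[1], ref_lat, ref_lon)

    # Unit direction vectors: bearing 0 deg = north (+y), 90 deg = east (+x).
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))

    # Solve (x1, y1) + t*d1 == (x2, y2) + s*d2 for t via Cramer's rule.
    denom = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(denom) < 1e-9:
        return None
    t = ((x2 - x1) * (-d2[1]) - (y2 - y1) * (-d2[0])) / denom
    if t < 0:
        return None
    px, py = x1 + t * d1[0], y1 + t * d1[1]

    # Convert the local intersection point back to lat/lon.
    lat = ref_lat + math.degrees(py / EARTH_RADIUS_M)
    lon = ref_lon + math.degrees(px / (EARTH_RADIUS_M * math.cos(math.radians(ref_lat))))
    return lat, lon

if __name__ == "__main__":
    # Two hypothetical spectators looking toward the same spot on a field.
    print(intersect_bearings((37.4030, -121.9700), 45.0,    # device 1 looks NE
                             (37.4030, -121.9680), 315.0))  # device 2 looks NW
```

With more than two devices, clustering the pairwise intersections rather than computing a single crossing is one way such a system could identify a "hot spot" that many devices are aimed at; that extension is omitted here for brevity.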
CrowdOptic was founded in 2011 by Jon Fisher, Jeff Broderick, Doug Van Blaricom, and Alex Malinovsky. [6] [7] The company analyzes data from mobile devices to identify hot-spot activity and connects Google Glass footage to live video feeds. [11] [12] The technology is used in professional sports, medicine, and government, including by emergency-response, fire, and public-safety workers. [13]
CrowdOptic's investors include Silicon Valley Bank, John Elway, Eric Yuan, and Ronnie Lott. [14] The company has raised $5 million in funding. [13]
In 2015, the company was named one of the most established of the 10 Glass for Work partners. [15] In July 2015, 9to5Google reported that CrowdOptic was in acquisition talks with a Fortune 500 firm. [15]
In 2016, CrowdOptic released its first hardware product developed in-house, the CrowdOptic Eye, a device that streams video through the company's video streaming stack at the push of a button. [16]
In October 2016, CrowdOptic launched Field App on the Google Play Store to "triangulate on a point of interest and broadcast its GPS location to a command center with live-video verification." [17] The application combines GPS, compass, live video, and smart-sensor data in a cloud-based system to coordinate emergency responders, firefighters, and police. [17]
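A field report of the kind described above would plausibly bundle the observer's own position and heading, the triangulated point, and a link to the live video used for verification. The sketch below illustrates such a payload; the endpoint, field names, and broadcast_report function are assumptions for illustration, not CrowdOptic's actual API.

```python
# Hypothetical sketch of a field report sent to a command center; the endpoint
# URL and JSON schema are illustrative assumptions, not CrowdOptic's API.
import json
import time
import urllib.request

def broadcast_report(observer_gps, heading_deg, poi_gps, stream_url,
                     command_center_url="https://command-center.example/api/reports"):
    """POST a JSON report describing a triangulated point of interest."""
    report = {
        "timestamp": time.time(),
        "observer": {"lat": observer_gps[0], "lon": observer_gps[1]},
        "heading_deg": heading_deg,                         # compass bearing
        "point_of_interest": {"lat": poi_gps[0], "lon": poi_gps[1]},
        "live_video": stream_url,                           # visual verification
    }
    req = urllib.request.Request(
        command_center_url,
        data=json.dumps(report).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```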
In August 2017, the Houston Chronicle reported that Hewlett Packard Enterprise and CrowdOptic had reached a deal to combine CrowdOptic's augmented reality platform and triangulation algorithms with Hewlett Packard Enterprise servers at HPE's internet of things lab in Houston, Texas. [18]
The company is a founding certified Google Glass partner, and its technology is also in use by Sony, Vuzix, and Microsoft. [13] [19] CrowdOptic develops algorithms that let smartphones and wearables live-stream from locations such as hospital operating rooms and sports stadiums. [20] [21] [22]
In 2014, CrowdOptic partnered with the Sacramento Kings to develop an alternative view of basketball games using Google Glass. [23] [24] The company broadcast Google Glass video footage from the perspective of players and cheerleaders on the Jumbotron and mobile devices. [25] This technology was also implemented during warm-ups by the Stanford basketball team. [11]
The company also partnered with the Indiana Pacers to use the technology. [24] The footage was broadcast from the video feeds of team employees wearing Google Glass. [26] CrowdOptic also has agreements for use of the technology with the Philadelphia Eagles and with Sony for its SmartEyeGlass. [13]
In August 2014, CrowdOptic partnered with NASCAR's International Speedway Corporation to broadcast live racing and behind-the-scenes footage from Google Glass. [27] [28]
In 2016, CrowdOptic deployed with the Denver Broncos at the AFC Championship game in Denver, Colorado, and at Super Bowl 50 at Levi's Stadium in the San Francisco Bay Area. [29] [30]
In June 2014, CrowdOptic announced a partnership with the University of California, San Francisco to stream procedures by UCSF Department of Orthopaedic Surgery faculty. [31] The company announced in July 2014 that ProTransport-1, a California-based medical transport provider, would install Google Glass in its ambulances. [32] [33] Google Glass uses CrowdOptic's software to send a live video feed from an ambulance to a destination hospital.
CrowdOptic also partnered with Stanford University Medical School. The software is used to live stream surgeries to doctors and medical students wearing Google Glass. The data from the live stream is owned by Stanford University. [34] In 2017 the company debuted a live-streaming platform for medical practitioner training at National Bioskills Laboratories in San Francisco. [35]
CrowdOptic's technology has been deployed with NASA to enhance the launch and landing of a lunar lander and to provide live streaming for incident response. [36] [37] In 2016, the company partnered with Solford Industries to market a low-bandwidth live-streaming device integrated with a conventional firefighter helmet, used by fire departments, police, and first responders in both the United States and China. [38] [39] In September 2016, CrowdOptic also deployed augmented reality for United States Special Operations Command, allowing field personnel to use the company's triangulation algorithms to report specific targets to central command, including a target's GPS location and a live stream of it. [40]
In November 2016, CrowdOptic combined its augmented reality technology and algorithms with Portland-based SicDrone unmanned aerial vehicles and Suspect Technology's facial recognition technology to provide emergency responders and law enforcement with surveillance and identification capabilities. [41] Vice Magazine reported that the drones "fly fast, record faces in real time, recognize patterns in traffic and pinpoint people who are in the middle of an emergency." [41]
CrowdOptic joined with cosmetics company L'Oreal to market its products at the Luminato festival in Toronto, Ontario, Canada. L'Oreal's virtual art exhibit generated analytics showing where people were aiming their phones. [42]
A wearable computer, also known as a body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.
Augmented reality (AR) is an interactive experience that combines the real world and computer-generated 3D content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (adding to the natural environment) or destructive (masking it). As such, it is one of the key technologies in the reality-virtuality continuum.
William Stephen George Mann is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography, particularly wearable computing, and high-dynamic-range imaging. Mann is sometimes labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field. He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. Mann was born in Canada, and currently lives in Toronto, Canada, with his wife and two children. In 2023, Mann unsuccessfully ran for mayor of Toronto.
Jon Fisher is a Silicon Valley entrepreneur. As of 2021, Fisher is the CEO and a co-founder of software company ViciNFT. As a co-founding CEO, Fisher built multiple companies, including Bharosa (which produced the Oracle Adaptive Access Manager and was sold to Oracle Corporation for a reported $50 million in 2007), NetClerk (now part of Roper Technologies), AutoReach (now part of AutoNation), and CrowdOptic.
Wearable technology is any technology that is designed to be used while worn. Common types of wearable technology include smartwatches and smartglasses. Wearable electronic devices are often close to or on the surface of the skin, where they detect, analyze, and transmit information such as vital signs or ambient data, in some cases allowing immediate biofeedback to the wearer.
Word Lens was an augmented reality translation application from Quest Visual. Word Lens used the built-in cameras on smartphones and similar devices to quickly scan and identify foreign text, and then translated and displayed the words in another language on the device's display. The words were displayed in the original context on the original background, and the translation was performed in real-time without a connection to the internet. For example, using the viewfinder of a camera to show a shop sign on a smartphone's display would result in a real-time image of the shop sign being displayed, but the words shown on the sign would be the translated words instead of the original foreign words.
Google Glass, or simply Glass, is a brand of smart glasses developed and sold by Google. It was developed by X, with the mission of producing a ubiquitous computer. Google Glass displays information to the wearer using a head-up display. Wearers communicate with the Internet via natural language voice commands.
Erick Miller is a CEO, technology entrepreneur, and investor who began his career building startups during the dot-com bubble of the late 1990s in San Francisco, California. Miller is the founder and CEO of CoinCircle, a founding managing director of Hyperspeed Ventures, and the former CEO and founder of Vergence Labs, a company known for designing and developing wearable-computer-enabled video streaming glasses under the brand name Epiphany Eyewear, as well as augmented reality (AR) and virtual reality (VR) eyewear.
Epiphany Eyewear are smartglasses developed by Vergence Labs. The glasses record video, which is stored on the glasses' hardware or live-streamed to a computer or social media. Built on smartphone technology, the head-mounted display combines a mobile computer with a high-definition camera, and the glasses can take photographs and record or stream video to a smartphone or tablet computer.
An optical head-mounted display (OHMD) is a wearable device that has the capability of reflecting projected images as well as allowing the user to see through it. In some cases, this may qualify as augmented reality (AR) technology. OHMD technology has existed since 1997 in various forms, but despite a number of attempts from industry, has yet to have had major commercial success.
Smartglasses or smart glasses are eye or head-worn wearable computers that offer useful capabilities to the user. Many smartglasses include displays that add information alongside or to what the wearer sees. Alternatively, smartglasses are sometimes defined as glasses that are able to change their optical properties, such as smart sunglasses that are programmed to change tint by electronic means. Alternatively, smartglasses are sometimes defined as glasses that include headphone functionality.
A body camera, bodycam, body-worn video (BWV), body-worn camera, or wearable camera is a wearable audio, video, or photographic recording system.
Magic Leap, Inc. is an American technology company that released a head-mounted augmented reality display, called Magic Leap One, which superimposes 3D computer-generated imagery over real world objects. It is attempting to construct a light-field chip using silicon photonics.
Meta was a company that designed augmented reality products. The company was founded by Meron Gribetz in 2012, based on the "Extramissive spatial imaging digital eye glass" technology invented by Gribetz and Mann, for which a patent application was originally filed with the US Patent and Trademark Office on January 3, 2013.
Microsoft HoloLens is an augmented reality (AR)/mixed reality (MR) headset developed and manufactured by Microsoft. HoloLens runs the Windows Mixed Reality platform under the Windows 10 operating system. Some of the positional tracking technology used in HoloLens can trace its lineage to the Microsoft Kinect, an accessory for Microsoft's Xbox 360 and Xbox One game consoles that was introduced in 2010.
Pristine is a VC funded startup that develops software for hands-free smartglasses and smart mobile devices, enabling video collaboration and remote support in industrial and manufacturing environments, field service management and healthcare. Pristine is based in Austin, Texas.
DAQRI was an American augmented reality company headquartered in Los Angeles, CA.
A virtual reality headset is a head-mounted device that uses 3D near-eye displays and positional tracking to provide a virtual reality environment for the user. VR headsets are widely used with VR video games, but they are also used in other applications, including simulators and trainers. VR headsets typically include a stereoscopic display, stereo sound, and sensors like accelerometers and gyroscopes for tracking the pose of the user's head to match the orientation of the virtual camera with the user's eye positions in the real world.
Shafi Ahmed is a chief surgeon, teacher, futurist, innovator, professor and entrepreneur.
Louis Barry Rosenberg is an American engineer, researcher, inventor, and entrepreneur. He researches augmented reality, virtual reality, and artificial intelligence. He was the Cotchett Endowed Professor of Educational Technology at the California Polytechnic State University, San Luis Obispo. He founded the Immersion Corporation and Unanimous A.I., and he wrote the screenplay for the 2009 romantic comedy film, Lab Rats.