| Cathal Gurrin | |
|---|---|
| Born | Dublin |
| Nationality | Irish |
| Occupation(s) | Lifelogger and Associate Professor |
| Known for | Extensive personal database of lifelog images and their interpretation |
Cathal Gurrin is an Irish professor and lifelogger. [1] [2] He is Head of the Adapt Centre at Dublin City University, a Funded Investigator of the Insight Centre, [3] and director of the Human Media Archives research group. He was previously deputy head of the School of Computing.
His interests include personal analytics and lifelogging. He publishes in information retrieval (IR) with a particular focus on how people access information from pervasive computing devices. [4] He has captured a continuous personal digital memory since 2006 using a wearable camera and logged hundreds of millions of other sensor readings. [5]
Gurrin attended primary school at Scoil Lorcáin, Kilbarrack, Dublin, and secondary school at St. Fintan's High School, Sutton. He received a PhD from Dublin City University for research on web search engines, including the first Irish-language search engine.[ citation needed ]
Gurrin has worn a wearable camera since 2006, which takes several still photographs every minute; he is likely the longest-serving wearer of such a device in the world. He also records his location (using GPS) and various other sources of biometric data. Gurrin has generated a database of over 18 million images and produces about a terabyte of personal data a year. [6] Gurrin and his researchers use information retrieval algorithms to segment his personal image archive into "events" such as eating or driving. New events are recognised on a daily basis using machine learning. [7] In an interview Gurrin said that "If I need to remember where I left my keys, or where I parked my car, or what wine I drank at an event two years ago... the answers should all be there." [6] While searching by date and time is easy, more complex searches within images, such as looking for brand names or for objects with complex form factors, such as keys, are more difficult. One aim of Gurrin's research is to create search engines that allow complex searches of such image databases, and to develop assistive technology. He is the founder of the annual ACM Lifelog Search Challenge, which attracts participants from around the world each year.[ citation needed ]
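The event segmentation described above can be illustrated with a minimal sketch: if each image is reduced to a feature vector, a drop in similarity between consecutive images can be treated as an event boundary. The feature vectors, the cosine-similarity measure, and the threshold below are illustrative assumptions, not details of Gurrin's actual system.

```python
# Sketch of boundary-based event segmentation for a lifelog image
# stream: adjacent images whose feature vectors are dissimilar mark
# the start of a new "event". Vectors and threshold are illustrative.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def segment_events(features, threshold=0.7):
    """Split a time-ordered list of image feature vectors into events."""
    events, current = [], [0]
    for i in range(1, len(features)):
        if cosine(features[i - 1], features[i]) < threshold:
            events.append(current)  # similarity dropped: new event starts
            current = []
        current.append(i)
    events.append(current)
    return events

# Three similar "driving" frames, then two similar "eating" frames.
stream = [[1, 0, 0], [0.9, 0.1, 0], [1, 0.05, 0], [0, 1, 0], [0.1, 0.9, 0]]
print(segment_events(stream))  # → [[0, 1, 2], [3, 4]]
```

Real systems would additionally use sensor readings (GPS, accelerometer) and learned classifiers to label each event, as the paragraph above notes.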
Information retrieval (IR) in computing and information science is the task of identifying and retrieving information system resources that are relevant to an information need. The information need can be specified in the form of a search query. In the case of document retrieval, queries can be based on full-text or other content-based indexing. Information retrieval is the science of searching for information in a document, searching for documents themselves, and also searching for the metadata that describes data, and for databases of texts, images or sounds.
In computing, a search engine is an information retrieval software system designed to help find information stored on one or more computer systems. Search engines discover, crawl, transform, and store information for retrieval and presentation in response to user queries. The search results are usually presented in a list and are commonly called hits. The most widely used type of search engine is a web search engine, which searches for information on the World Wide Web.
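The retrieval step at the heart of both definitions above can be sketched with a toy inverted index: each term maps to the set of documents containing it, and a query is answered by intersecting those posting lists. Crawling, ranking, and stemming, which real search engines require, are omitted in this assumed minimal form.

```python
# Toy inverted index illustrating the core retrieval step of a search
# engine: map each term to the documents containing it, then answer an
# AND query by intersecting the posting lists.
from collections import defaultdict

def build_index(docs):
    """Build term -> set-of-doc-ids from a dict of id -> text."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term."""
    postings = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*postings) if postings else set()

docs = {1: "lifelog image retrieval", 2: "web search engine", 3: "image search"}
index = build_index(docs)
print(sorted(search(index, "image search")))  # → [3]
```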
Ubiquitous computing is a concept in software engineering, hardware engineering and computer science where computing is made to appear seamlessly anytime and everywhere. In contrast to desktop computing, ubiquitous computing implies use on any device, in any location, and in any format. A user interacts with the computer, which can exist in many different forms, including laptop computers, tablets, smart phones and terminals in everyday objects such as a refrigerator or a pair of glasses. The underlying technologies to support ubiquitous computing include the Internet, advanced middleware, kernels, operating systems, mobile codes, sensors, microprocessors, new I/Os and user interfaces, computer networks, mobile protocols, global navigational systems, and new materials.
A wearable computer, also known as a body-borne computer, is a computing device worn on the body. The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.
In mass communication, digital media is any communication media that operates in conjunction with various encoded machine-readable data formats. Digital content can be created, viewed, distributed, modified, listened to, and preserved on a digital electronic device, including digital data storage media and digital broadcasting. Digital is defined as any data represented by a series of digits, and media refers to methods of broadcasting or communicating this information. Together, digital media refers to media that convey digitized information through a screen and/or a speaker, including text, audio, video, and graphics transmitted over the internet.
William Stephen George Mann is a Canadian engineer, professor, and inventor who works in augmented reality, computational photography, particularly wearable computing, and high-dynamic-range imaging. Mann has sometimes been labeled the "Father of Wearable Computing" for early inventions and continuing contributions to the field. He cofounded InteraXon, makers of the Muse brain-sensing headband, and is also a founding member of the IEEE Council on Extended Intelligence (CXI). Mann is currently CTO and cofounder at Blueberry X Technologies and Chairman of MannLab. Mann was born in Canada, and currently lives in Toronto, Canada, with his wife and two children. In 2023, Mann unsuccessfully ran for mayor of Toronto.
An image retrieval system is a computer system used for browsing, searching and retrieving images from a large database of digital images. Most traditional and common methods of image retrieval utilize some method of adding metadata such as captioning, keywords, title or descriptions to the images so that retrieval can be performed over the annotation words. Manual image annotation is time-consuming, laborious and expensive; to address this, there has been a large amount of research done on automatic image annotation. Additionally, the increase in social web applications and the semantic web have inspired the development of several web-based image annotation tools.
Sousveillance is the recording of an activity by a member of the public, rather than a person or organisation in authority, typically by way of small wearable or portable personal technologies. The term, coined by Steve Mann, stems from the contrasting French words sur, meaning "above", and sous, meaning "below", i.e. "surveillance" denotes the "eye-in-the-sky" watching from above, whereas "sousveillance" denotes bringing the means of observation down to human level, either physically or hierarchically.
An EyeTap is a concept for a wearable computing device worn in front of the eye that acts both as a camera, recording the scene available to the eye, and as a display, superimposing computer-generated imagery on that scene. This structure allows the user's eye to operate as both a monitor and a camera: the EyeTap takes in the world around the user and overlays computer-generated data on the scene the user would normally perceive.
A mobile device or handheld computer is a computer small enough to hold and operate in hand. Mobile devices are typically battery-powered and possess a flat-panel display and one or more built-in input devices, such as a touchscreen or keypad. Modern mobile devices often emphasize wireless networking, to both the Internet and to other devices in their vicinity, such as headsets or in-car entertainment systems, via Wi-Fi, Bluetooth, cellular networks, or near-field communication.
Content-based image retrieval, also known as query by image content and content-based visual information retrieval (CBVIR), is the application of computer vision techniques to the image retrieval problem, that is, the problem of searching for digital images in large databases. Content-based image retrieval is opposed to traditional concept-based approaches.
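The contrast between annotation-based retrieval and the content-based approach described above can be sketched as follows: images are reduced to fixed-length feature vectors (hand-written colour histograms in this assumed example, rather than learned descriptors), and a query image is matched to its nearest neighbours by distance in feature space. The file names and vectors are illustrative only.

```python
# Hypothetical content-based image retrieval step: images are compared
# by distance between feature vectors instead of by metadata keywords.
from math import dist  # Euclidean distance, Python 3.8+

def rank_by_content(query_vec, database):
    """Return image names sorted by feature distance to the query."""
    return sorted(database, key=lambda name: dist(query_vec, database[name]))

database = {
    "sunset.jpg": [0.9, 0.4, 0.1],  # mostly red
    "forest.jpg": [0.1, 0.8, 0.2],  # mostly green
    "ocean.jpg":  [0.1, 0.3, 0.9],  # mostly blue
}
print(rank_by_content([0.8, 0.5, 0.1], database)[0])  # → sunset.jpg
```

Production CBIR systems replace the colour histograms with learned embeddings and the linear scan with approximate nearest-neighbour indexes, but the matching principle is the same.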
A handheld projector is an image projector in a handheld device. It was developed as a computer display device for compact portable devices such as mobile phones, personal digital assistants, and digital cameras, which have sufficient storage capacity to handle presentation materials but are too small to accommodate a display screen that an audience can see easily. Handheld projectors involve miniaturized hardware, and software that can project digital images onto a nearby viewing surface.
MyLifeBits is a life-logging experiment begun in 2001. It is a Microsoft Research project inspired by Vannevar Bush's hypothetical Memex computer system. The project includes full-text search, text and audio annotations, and hyperlinks. The "experimental subject" of the project is computer scientist Gordon Bell, and the project aimed to collect a lifetime of storage on and about Bell. Jim Gemmell of Microsoft Research and Roger Lueder were the architects and creators of the system and its software.
Alan F. Smeaton MRIA is a researcher and academic at Dublin City University. He was founder of TRECVid and the Centre for Digital Video Processing, and a winner of the University President's Research Award in Science and Engineering in 2002 and the DCU Educational Trust Leadership Award in 2009. Smeaton is a founding director of the Insight Centre for Data Analytics at Dublin City University (2013–2019). Prior to that he was a Principal Investigator and Deputy Director of CLARITY: Centre for Sensor Web Technologies (2008–2013). As of 2013, Smeaton was serving on the editorial boards of the ACM Journal on Computing and Cultural Heritage and of Information Processing and Management. Smeaton was elected a Member of the Royal Irish Academy in May 2013, becoming DCU's 10th member. In 2012 Smeaton was appointed by Minister Sean Sherlock to the board of the Irish Research Council.
Microsoft's SenseCam is a lifelogging camera with a fisheye lens and trigger sensors, such as accelerometers, heat sensing, and audio, invented by Lyndsay Williams, with a patent granted in 2009. Usually worn around the neck, the SenseCam is used in the MyLifeBits project, a lifetime storage database. Early developers were James Srinivasan and Trevor Taylor.
A lifelog is a personal record of one's daily life in a varying amount of detail, for a variety of purposes. The record contains a comprehensive dataset of a human's activities. The data could be used to increase knowledge about how people live their lives. In recent years, some lifelog data has been automatically captured by wearable technology or mobile devices. People who keep lifelogs about themselves are known as lifeloggers.
SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab by Steve Mann in 1994, 1997, and 1998, and further developed by Pranav Mistry in 2009; both developed hardware and software for headworn and neck-worn versions of it. It comprises a headworn or neck-worn pendant that contains both a data projector and a camera. Headworn versions built at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art, and also included gesture recognition.
Target is the name of a collaborative research project specialising in big data processing and management in northern Netherlands. It is a public-private cooperation, initiated in 2009 and supported by government subsidies. It is run by a consortium of ten academic and computer industry partners, coordinated by the University of Groningen, and researches data management of science projects in the area of astronomy, life sciences, artificial intelligence and medical diagnosis.
Liquid Image Corporation was a Winnipeg-based company that manufactured head-mounted displays. The company was formed in 1992 by Tony Havelka, David Collette and Shannon O'Brien. Liquid Image was started in Winnipeg, MB in response to the emergence of a market for virtual reality technology. Funding was provided by a group of local angel investors, and the first office was in Tony Havelka's attic.
Egocentric vision or first-person vision is a sub-field of computer vision that entails analyzing images and videos captured by a wearable camera, which is typically worn on the head or on the chest and naturally approximates the visual field of the camera wearer. Consequently, visual data capture the part of the scene on which the user focuses to carry out the task at hand and offer a valuable perspective to understand the user's activities and their context in a naturalistic setting.