Bruno Zamborlin

Born: July 1983 (age 40)
Alma mater: Goldsmiths, University of London; IRCAM - Centre Pompidou
Occupation(s): Entrepreneur, artist

Bruno Zamborlin (born 1983 in Vicenza) is an AI researcher, entrepreneur and artist based in London, working in the field of human-computer interaction.[1][2][3] His work focuses on converting physical objects into touch-sensitive, interactive surfaces using vibration sensors and artificial intelligence. In 2013 he founded Mogees Limited,[4] a start-up that transforms everyday objects into musical instruments and games using a vibration sensor and a mobile phone. With HyperSurfaces,[5] he converts physical surfaces of any material, shape and form into data-enabled interactive surfaces using a vibration sensor and a coin-sized chipset. As an artist, he has created installations around the world; his most recent work, a series of "sound furnitures", was showcased at the Italian Pavilion of the Venice Biennale 2023.[6] He has performed regularly with UK-based electronic music duo Plaid (Warp Records) and is an honorary visiting research fellow at Goldsmiths, University of London.[7]

Early life and education

From 2008 to 2011, Zamborlin worked at IRCAM (Institute for Research and Coordination in Acoustics/Music) - Centre Pompidou as a member of the Sound Music Movement Interaction team.[8] Under the supervision of Frédéric Bevilacqua, he began experimenting with the application of artificial intelligence to human movement,[9] and contributed to the creation of Gesture Follower,[10][11] a software tool that analyses the body movements of performers and dancers through motion sensors in order to control sound and visual media in real time, slowing down or speeding up playback according to the speed at which the gestures are performed.[12][13]
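
The published Gesture Follower is based on hidden Markov models that align a live gesture to a pre-recorded example in real time. As a rough, simplified illustration of that idea only (not IRCAM's implementation), the sketch below tracks a stream of motion-sensor frames against a recorded template with a plain nearest-frame search; the function name, the feature representation and the search_ahead window are all illustrative assumptions.

```python
import numpy as np

def follow_gesture(template, stream, search_ahead=5):
    """Illustrative follower: align live frames to a recorded gesture.

    template : (T, D) array of feature frames from the recorded gesture
    stream   : iterable of (D,) live motion-sensor frames
    Yields (template_index, progress in [0, 1], speed_estimate).
    """
    pos = 0      # current position within the template
    speed = 1.0  # smoothed template-frames advanced per live frame
    for frame in stream:
        # Search a few frames ahead of the current position and jump to
        # the template frame that best matches the incoming live frame.
        window = template[pos:pos + search_ahead]
        step = int(np.argmin(np.linalg.norm(window - frame, axis=1)))
        pos = min(pos + step, len(template) - 1)
        # Exponentially smoothed speed: >1 means the performer is moving
        # faster than the recording, <1 slower.
        speed = 0.9 * speed + 0.1 * step
        yield pos, pos / max(len(template) - 1, 1), speed
```

The speed estimate is the quantity the paragraph above describes: a media player can scale its playback rate by it, slowing down or speeding up with the performer.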

He has lived in London since 2011, where he completed a joint PhD in AI between Goldsmiths, University of London and IRCAM - Centre Pompidou/Pierre and Marie Curie University, Paris,[14] focusing on the concept of interactive machine learning[15] applied to digital musical instruments and the performing arts.[16]

Career

Zamborlin founded Mogees Limited in London in 2013, with IRCAM among its early partners.[17] Mogees transforms physical objects into musical instruments and games using a vibration sensor and a series of apps for smartphones and desktop.[18][19][20][21][22][23] After a Kickstarter campaign in 2014,[24] Mogees was used both by everyday users[25] and by artists such as Rodrigo y Gabriela,[26] Jean-Michel Jarre[27] and Plaid.[28][29] The algorithms in these apps employ a variant of physical-modelling sound synthesis in which the vibrations produced by users interacting with the physical object act as the exciter for a digital resonator running in the app. The result is a hybrid, half-acoustic and half-digital sound that depends on both the software and the acoustic properties of the physical object the user chooses to play.[30]
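
Mogees' synthesis engine is proprietary, but the exciter-resonator scheme described above can be sketched with a minimal example: the vibration signal captured by the contact sensor drives a small bank of two-pole digital resonators (a basic form of modal synthesis). The function and all mode parameters below are illustrative assumptions, not Mogees internals.

```python
import numpy as np

def resonator_bank(excitation, sample_rate, modes):
    """Drive a bank of two-pole digital resonators with a sensor signal.

    excitation : 1-D array, vibration captured from the physical object
    modes      : (frequency_hz, decay_seconds, gain) triples describing
                 the virtual resonating body
    Returns the hybrid output: acoustic excitation, digital resonance.
    """
    out = np.zeros(len(excitation))
    for freq, decay, gain in modes:
        # Pole radius sets how long the mode rings; pole angle sets pitch.
        r = np.exp(-1.0 / (decay * sample_rate))
        a1 = 2.0 * r * np.cos(2.0 * np.pi * freq / sample_rate)
        a2 = -r * r
        y1 = y2 = 0.0
        for n, x in enumerate(excitation):
            y = x + a1 * y1 + a2 * y2   # two-pole resonant filter
            out[n] += gain * y
            y1, y2 = y, y1
    return out

# Purely illustrative mode set: a tap on the surface rings three
# virtual partials at 220, 440 and 880 Hz.
# hybrid = resonator_bank(sensor_signal, 44100,
#                         [(220, 0.8, 1.0), (440, 0.5, 0.6), (880, 0.3, 0.4)])
```

Because the excitation comes from the physical object and the resonances from software, the output changes with both the surface being played and the chosen mode set, which is the hybrid behaviour described above.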

In 2017, Zamborlin founded HyperSurfaces together with computational artist Parag K Mital[31] to merge "the physical and the digital worlds". HyperSurfaces technology converts surfaces of any material, shape and size into data-enabled interactive objects, using a vibration sensor and proprietary AI algorithms running on a coin-sized chipset.[32] The vibrations generated by people's interactions with the surface are converted into an electrical signal by a piezoelectric sensor and analysed in real time by the AI algorithms on the chipset. Whenever the AI recognises in the vibration signal one of the events predefined by the user, a corresponding notification message is generated in real time and sent to the host application.[33] The technology can be applied to uses ranging from button-less human-computer interaction in automotive and smart-home settings to the Internet of things.[34][35][36][37] Because the algorithms run locally on the chipset, with no need to access cloud-based services, they are considered part of the field of edge computing; and because the AI is trained beforehand to recognise the events its users are interested in, they belong to the field of supervised machine learning.[38][39]
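
HyperSurfaces' algorithms are unpublished, so the sketch below only illustrates the general pattern the paragraph describes: a classifier is trained offline on labelled vibration recordings (supervised learning) and then evaluated entirely on-device (edge computing). The two-value feature vector, the nearest-centroid classifier and every name here are illustrative assumptions.

```python
import numpy as np

def features(window, sample_rate):
    """Compact descriptor of one vibration frame: overall energy plus
    the spectral centroid (where the energy sits in frequency)."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), 1.0 / sample_rate)
    energy = float(np.sqrt(np.mean(window ** 2)))
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return np.array([energy, centroid])

class EventRecogniser:
    """Nearest-centroid classifier: trained offline on labelled examples,
    then shipped to the device as a tiny table of centroids that is cheap
    enough to evaluate on a microcontroller."""

    def fit(self, windows, labels, sample_rate):
        # Supervised step, done beforehand: one centroid per event class.
        X = np.array([features(w, sample_rate) for w in windows])
        y = np.array(labels)
        self.classes = sorted(set(labels))
        self.centroids = np.array(
            [X[y == c].mean(axis=0) for c in self.classes])
        return self

    def recognise(self, window, sample_rate, reject_threshold=0.5):
        # On-device step: classify one incoming vibration window.
        d = np.linalg.norm(self.centroids - features(window, sample_rate),
                           axis=1)
        if d.min() > reject_threshold:
            return None  # vibration matches none of the predefined events
        return self.classes[int(np.argmin(d))]

# Offline:   rec = EventRecogniser().fit(windows, ["knock", "swipe"], 8000)
# On-device: event = rec.recognise(live_window, 8000)
```

On real hardware the same structure would be compiled into a fixed-point loop on the chipset; the point is that inference needs only a handful of arithmetic operations and no cloud connection.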

Selected awards

Patents and academic publications

References

  1. Vdovin, Marsha (23 June 2014). "An Interview with Bruno Zamborlin". Cycling74. San Francisco. Retrieved 17 January 2019.
  2. Tardif, Antoine (29 December 2020). "Bruno Zamborlin, CEO and Chief Scientist at Hypersurfaces – Interview Series". unite.ai. Retrieved 18 March 2019.
  3. "Bruno Zamborlin, PhD Feature". Coruzant Technologies. 1 July 2020. Retrieved 23 July 2019.
  4. "Home". mogees.co.uk.
  5. "Home". hypersurfaces.com.
  6. ""What we know about the Italian Pavilion at Venice Biennale 2023"". domusweb.
  7. Goldsmiths University, Computing department website
  8. Past and current members of the Sound Music Movement Interaction team at IRCAM
  9. Kimura, Mari; Rasamimanana, Nicolas; Bevilacqua, Frédéric; Zamborlin, Bruno; Schnell, Norbert; Fléty, Emmanuel (2012). "Extracting Human Expression For Interactive Composition with the Augmented Violin". International Conference on New Interfaces for Musical Expression (NIME). Retrieved 17 January 2021.
  10. Bevilacqua, Frédéric; Zamborlin, Bruno; Sypniewski, Anthony; Schnell, Norbert; Guédy, Fabrice; Rasamimanana, Nicolas (2010). "Continuous Realtime Gesture Following and Recognition". Gesture in Embodied Communication and Human-Computer Interaction. Lecture Notes in Computer Science. Vol. 5934. pp. 73–84. doi:10.1007/978-3-642-12553-9_7. ISBN 978-3-642-12552-2. S2CID 16251822. Retrieved 17 January 2021.
  11. "Gesture Follower". 4 March 2014.
  12. Bevilacqua, Frédéric; Schnell, Norbert; Rasamimanana, Nicolas; Zamborlin, Bruno; Guédy, Fabrice (2011). "Online Gesture Analysis and Control of Audio Processing". Musical Robots and Interactive Multimodal Systems. Springer Tracts in Advanced Robotics. Vol. 74. pp. 127–142. doi:10.1007/978-3-642-22291-7_8. ISBN 978-3-642-22290-0. Retrieved 17 January 2021.
  13. Seminar by Bruno Zamborlin on gesture interaction, Music Technology Group, Pompeu Fabra University, 5 October 2011
  14. "EDB - Bienvenue". Archived from the original on 2021-01-22. Retrieved 2021-01-20.
  15. Holzinger, A.; Plass, M.; Kickmeier-Rust, M.; et al. (2019). "Interactive machine learning: experimental evidence for the human in the algorithmic loop". Applied Intelligence. 49: 2401–2414.
  16. Zamborlin, Bruno; Bevilacqua, Frédéric; Gillies, Marco; D'Inverno, Mark (2014-01-15). "Fluid gesture interaction design: Applications of continuous recognition for the design of modern gestural interfaces". ACM Transactions on Interactive Intelligent Systems. 3 (4): 22:1–30. doi:10.1145/2543921. S2CID 7887245. Retrieved 17 January 2021.
  17. "Industrial Applications".
  18. McPherson, Andrew; Morreale, Fabio; Harrison, Jacob (7 February 2019). "Musical Instruments for Novices: Comparing NIME, HCI and Crowdfunding Approaches". New Directions in Music and Human-Computer Interaction. Springer Series on Cultural Computing. pp. 179–212. doi:10.1007/978-3-319-92069-6_12. ISBN 978-3-319-92068-9. S2CID 151068133. Retrieved 20 May 2021.
  19. Nagle, Paul (March 2016). "Mogees: Resynthesis App & Sensor For iOS & Mac". Sound on Sound. Retrieved 2 October 2020.
  20. Solon, Olivia (1 April 2012). "Mogees Project Turns Any Surface Into a Gestural Musical Interface". Wired.com. Retrieved 2 October 2020.
  21. O'Hear, Steve (25 May 2017). "Mogees picks up seed funding to put audio-based gesture recognition tech into new devices". TechCrunch. Retrieved 2 October 2020.
  22. Madelaine, Nicolas (22 August 2016). "Mogees, ou la réalité virtuelle sonore pour tous". Les Echos. Retrieved 2 October 2020.
  23. Rociola, Arcangelo (30 September 2014). "Mogees: an Italian's startup that is making the whole world play music (from trees to DJ's)". StartupItalia. Retrieved 10 October 2020.
  24. "Kickstarter success for gadget that turns everyday objects into instruments". Fact Magazine. 5 March 2014. Retrieved 15 October 2020.
  25. Michaut, Cécile (12 June 2014). "Les chercheurs de l'Ircam ouvrent les portes de leurs laboratoires". Le Monde. Retrieved 15 June 2021.
  26. Rodrigo y Gabriela's website
  27. Bruno Zamborlin and Mogees on Jean Michel Jarre website
  28. Plaid and Bruno Zamborlin, ELEX music video
  29. Turk, Victoria (19 February 2014). "This Gadget Turns Any Object into Electronic Music". Vice.com. Retrieved 15 January 2021.
  30. Hattwick, Ian; Beebe, Preston; Hale, Zachary; Wanderley, Marcelo (2014). "Unsounding Objects: Audio Feature Extraction for the Control of Sound Synthesis". Proceedings of the International Conference on New Interfaces for Musical Expression: 597–600. doi:10.5281/zenodo.1178790. Retrieved 20 May 2021.
  31. Parag K Mital's website
  32. O'Hear, Steve (20 November 2018). "HyperSurfaces turns any surface into a user interface using vibration sensors and AI". Techcrunch. Retrieved 17 January 2021.
  33. GB pending WO/2019/086862, Bruno Zamborlin; Conor Barry & Alessandro Saccoia et al., "A user interface for vehicles", published 2019-05-09, assigned to Mogees Limited.
  34. GB pending WO/2019/086863, Bruno Zamborlin; Conor Barry & Alessandro Saccoia et al., "Trigger for game events", published 2019-05-09, assigned to Mogees Limited.
  35. Ridden, Paul (20 November 2018). "HyperSurfaces uses AI to make object interfacing more natural". NewsAtlas. Retrieved 17 January 2021.
  36. "HyperSurfaces – Seamlessly Merging The Physical And Data Worlds". TechCompanyNews. 26 November 2018. Retrieved 17 January 2021.
  37. "Video Highlights: Data-enabled Hypersurfaces". Inside bigdata. 18 October 2018. Retrieved 22 January 2021.
  38. US pending US10817798B2, Bruno Zamborlin & Carmine Emanuele Cella, "Method to recognize a gesture and corresponding device", published 2016-04-27, assigned to Mogees Limited.
  39. Shi, Yuanming; Yang, Kai; Jiang, Tao; Zhang, Jun; Letaief, Khaled B. (2020). "Communication-efficient edge AI: Algorithms and systems". IEEE Communications Surveys & Tutorials. 22 (4): 2167–2191. arXiv:2002.09668. doi:10.1109/COMST.2020.3007787. S2CID 211258847. Retrieved 20 May 2021.
  40. "Journée Science et Musique 2012".
  41. "Mogees- NEMODE Dragon's Den 2012 Winner- Where are they now? « www.nemode.ac.uk".