Peter Corke

Peter I. Corke
Born: 24 August 1959
Nationality: Australian
Alma mater: University of Melbourne
Known for: Vision-based robot control, field robotics
Awards: IEEE Fellow, Fellow of the Australian Academy of Technological Sciences and Engineering, Senior Fellow of the Higher Education Academy
Scientific career
Fields: Robotics, computer vision
Institutions: Queensland University of Technology, CSIRO, University of Melbourne
Thesis: High-performance visual closed-loop robot control (1994)
Doctoral advisor: M.C. Good
Website: petercorke.com

Peter Corke FAA (born 24 August 1959) is an Australian roboticist known for his work on visual servoing, field robotics and online education, including the online Robot Academy and the Robotics Toolbox and Machine Vision Toolbox for MATLAB. He is currently director of the Australian Research Council Centre of Excellence for Robotic Vision and a Distinguished Professor of Robotic Vision at Queensland University of Technology. His research concerns robotic vision, flying robots and farming robots.

Corke is a Fellow of the Australian Academy of Technological Sciences and Engineering and of the Institute of Electrical and Electronics Engineers.[1] He is a founding editor of the Journal of Field Robotics[2] and a former member of the executive editorial board of The International Journal of Robotics Research.

Career

Corke received Bachelor of Engineering, Master of Engineering and Ph.D. degrees from the University of Melbourne in Australia.[3]

In 1984 he joined CSIRO (the Commonwealth Scientific and Industrial Research Organisation) to work on robotics. He developed an open-source robot control system[4] and computer vision applications for food processing and real-time traffic monitoring.[5][6]

In 1995 he moved to Brisbane and established a program of research into mining automation[7] focused on dragline excavators, rope shovels and load-haul-dump (LHD) units.[8] In 1996, Corke co-authored an early tutorial paper on visual servo control[9] and later proposed the partitioned approach to image-based visual servo control.[10] He served as Research Director of the Autonomous Systems Laboratory of CSIRO's Information and Communications Technology Centre from 2004 to 2007.[11][12]

From 2005 to 2009 he worked on wireless sensor network technology, co-developed the Fleck wireless sensor node, and investigated applications including environmental monitoring, agriculture[13] and virtual fencing.[14][15] He was a senior principal research scientist when he left to take up a chair at the Queensland University of Technology in 2010.[16][17][18]

From 2009 to 2013, he served as editor-in-chief of IEEE Robotics & Automation Magazine.[19]

Works

Corke is the author of the textbook Robotics, Vision and Control: Fundamental Algorithms in MATLAB (Springer), which builds on his Robotics Toolbox and Machine Vision Toolbox.

Related Research Articles

Computer vision tasks include methods for acquiring, processing, analyzing and understanding digital images, and extraction of high-dimensional data from the real world in order to produce numerical or symbolic information, e.g. in the forms of decisions. Understanding in this context means the transformation of visual images into descriptions of the world that make sense to thought processes and can elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory.

Home automation

Home automation or domotics is building automation for a home. A home automation system will monitor and/or control home attributes such as lighting, climate, entertainment systems, and appliances. It may also include home security such as access control and alarm systems.

Simultaneous localization and mapping

Simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it. While this initially appears to be a chicken or the egg problem, there are several algorithms known to solve it in, at least approximately, tractable time for certain environments. Popular approximate solution methods include the particle filter, extended Kalman filter, covariance intersection, and GraphSLAM. SLAM algorithms are based on concepts in computational geometry and computer vision, and are used in robot navigation, robotic mapping and odometry for virtual reality or augmented reality.
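
The structure of the problem can be seen in a deliberately small example. The sketch below runs an EKF-style SLAM filter in one dimension with three landmarks, known data association, and invented noise levels and trajectory; a real system would use 2D or 3D poses, nonlinear measurement models, and one of the solution methods listed above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a robot drives along a line past three landmarks it has never seen.
true_landmarks = np.array([5.0, 12.0, 20.0])
Q, R = 0.05, 0.1            # motion and range-measurement noise variances (invented)

# Joint EKF-SLAM state: [robot position, landmark 1, landmark 2, landmark 3]
n = 1 + len(true_landmarks)
x = np.zeros(n)             # initial estimate
P = np.eye(n) * 1e3         # landmarks start with very large uncertainty
P[0, 0] = 0.0               # the robot's start position defines the map frame
true_pos = 0.0

for step in range(25):
    u = 1.0                                      # commanded forward motion
    true_pos += u + rng.normal(0.0, np.sqrt(Q))

    # Prediction: only the robot moves, so only its uncertainty grows.
    x[0] += u
    P[0, 0] += Q

    # Update: a relative range measurement z = (landmark - robot) + noise per landmark.
    for i, m in enumerate(true_landmarks):
        z = (m - true_pos) + rng.normal(0.0, np.sqrt(R))
        H = np.zeros(n)
        H[0], H[1 + i] = -1.0, 1.0               # Jacobian of h(x) = x_landmark - x_robot
        S = H @ P @ H + R                        # innovation variance
        K = P @ H / S                            # Kalman gain
        x = x + K * (z - (x[1 + i] - x[0]))
        P = P - np.outer(K, H @ P)

print("robot estimate:", round(x[0], 2), "true:", round(true_pos, 2))
print("landmark estimates:", np.round(x[1:], 2), "true:", true_landmarks)
```

Because the robot and landmark estimates share one covariance matrix, every landmark observation also tightens the robot's own position estimate, which is the essential coupling that distinguishes SLAM from mapping with known poses.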

Swarm robotics

Swarm robotics is an approach to the coordination of multiple robots as a system consisting of large numbers of mostly simple physical robots. "In a robot swarm, the collective behavior of the robots results from local interactions between the robots and between the robots and the environment in which they act." The desired collective behaviour is supposed to emerge from these interactions. The approach emerged from the field of artificial swarm intelligence and from biological studies of insects, ants and other natural systems in which swarm behaviour occurs.
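
As a toy illustration of collective behaviour arising from purely local rules, the following sketch (with invented parameters) simulates agents that react only to neighbours within a fixed sensing radius, combining a cohesion term with a short-range separation term; clusters form without any central controller.

```python
import numpy as np

rng = np.random.default_rng(1)
positions = rng.uniform(0.0, 10.0, size=(30, 2))   # 30 simple agents in a 10 x 10 arena
RADIUS, STEP = 2.0, 0.05                           # sensing radius and step size (invented)

for _ in range(200):
    new_positions = positions.copy()
    for i, p in enumerate(positions):
        offsets = positions - p
        dists = np.linalg.norm(offsets, axis=1)
        neighbours = (dists > 0.0) & (dists < RADIUS)   # each agent senses only nearby agents
        if not neighbours.any():
            continue
        cohesion = offsets[neighbours].mean(axis=0)      # drift toward nearby agents
        too_close = neighbours & (dists < 0.5)
        separation = -offsets[too_close].sum(axis=0) if too_close.any() else 0.0
        new_positions[i] = p + STEP * (cohesion + 2.0 * separation)
    positions = new_positions

# Rough spread measure: clusters form even though no agent knows the global state.
print("spread after simulation:", round(float(positions.std()), 2))
```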

Gregory Dudek

Gregory L. Dudek is a Canadian computer scientist specializing in robotics, computer vision, and intelligent systems. He is a chaired professor at McGill University, where he has led the Mobile Robotics Lab since the 1990s. He was formerly the director of McGill's School of Computer Science and, before that, director of McGill's Centre for Intelligent Machines.

Mobile robot

A mobile robot is an automatic machine that is capable of locomotion. Mobile robotics is usually considered to be a subfield of robotics and information engineering.

An area of computer vision is active vision, sometimes also called active computer vision. An active vision system is one that can manipulate the viewpoint of the camera(s) in order to investigate the environment and get better information from it.

Visual servoing, also known as vision-based robot control and abbreviated VS, is a technique which uses feedback information extracted from a vision sensor to control the motion of a robot. One of the earliest papers that talks about visual servoing was from the SRI International Labs in 1979.
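
The core of the classical image-based formulation (as described in the tutorial literature) fits in a few lines: stack the interaction matrix for each tracked image point and command a camera velocity proportional to the pseudo-inverse of that matrix times the feature error. The sketch below uses normalized image coordinates; the point positions, depths, and gain are illustrative only.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one normalized image point at depth Z."""
    return np.array([
        [-1.0 / Z,  0.0,      x / Z,  x * y,        -(1.0 + x**2),  y],
        [ 0.0,     -1.0 / Z,  y / Z,  1.0 + y**2,   -x * y,        -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Classical IBVS law: camera velocity v = -gain * pinv(L) @ (s - s*)."""
    error = (features - desired).reshape(-1)                  # stacked feature error s - s*
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])   # stacked interaction matrix
    return -gain * np.linalg.pinv(L) @ error                  # (vx, vy, vz, wx, wy, wz)

# Illustrative example: drive four observed image points toward a centred square.
current = np.array([[0.10, 0.12], [0.42, 0.08], [0.45, 0.40], [0.08, 0.44]])
desired = np.array([[-0.2, -0.2], [0.2, -0.2], [0.2, 0.2], [-0.2, 0.2]])
v = ibvs_velocity(current, desired, depths=np.full(4, 1.5))
print(v)   # velocity screw that would be sent to the robot's controller each cycle
```

Run inside a control loop, this drives the feature error toward zero, which moves the camera to the pose where it originally saw the desired feature pattern.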

Visual odometry

In robotics and computer vision, visual odometry is the process of determining the position and orientation of a robot by analyzing the associated camera images. It has been used in a wide variety of robotic applications, such as on the Mars Exploration Rovers.
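
A minimal monocular pipeline, sketched below under the assumption that OpenCV is available: detect and match features between consecutive frames, estimate the essential matrix with RANSAC, and recover the relative rotation and translation, which for a single camera is known only up to scale. The camera intrinsics and frame source are placeholders.

```python
import cv2
import numpy as np

def relative_pose(img1, img2, K):
    """Estimate relative rotation R and unit-scale translation t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

K = np.array([[700.0, 0.0, 320.0],      # placeholder camera intrinsics; use calibrated values
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

# For a sequence, compose the per-pair poses frame by frame; with one camera the
# translation scale is unknown and must come from another sensor or assumption.
# for prev, curr in zip(frames, frames[1:]):
#     R, t = relative_pose(prev, curr, K)
#     ...  # accumulate R, t into a running pose estimate
```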

Flower robot

In home automation systems and robotics, a flower robot is a simple electromechanical device with the appearance of a common flower, with components such as a stem and leaves. First developed by Berufsbildende Schule 1 Kaiserslautern in 2006 and later by Carnegie Mellon University in 2007, flower robots are used as intelligent home appliances, with capabilities such as sensing, simple actuation for movement, and effectors such as lights or humidifiers.

Robotics

Robotics is the interdisciplinary study and practice of the design, construction, operation, and use of robots.

Robotics middleware is middleware to be used in complex robot control software systems.

The Australian Research Centre for Aerospace Automation (ARCAA) was a research centre of the Queensland University of Technology. ARCAA conducted research into all aspects of aviation automation, with a particular research focus on autonomous technologies which support the more efficient and safer utilisation of airspace, and the development of autonomous aircraft and on-board sensor systems for a wide range of commercial applications.

The Robotics Toolbox is a MATLAB software toolbox that supports research and teaching in arm-type and mobile robotics. While the Robotics Toolbox is free software, it requires the proprietary MATLAB environment in order to run. The Toolbox forms the basis of the exercises in several textbooks.
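
To give a flavour of what such a toolbox automates, the sketch below computes forward kinematics from standard Denavit-Hartenberg parameters in plain Python/NumPy for an illustrative two-link planar arm; it is not the Toolbox's own API, only the kind of kinematic calculation it packages alongside many other functions such as inverse kinematics, dynamics and trajectory generation.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint, standard Denavit-Hartenberg convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(links, q):
    """Compose per-joint transforms for joint angles q; returns the end-effector pose."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(links, q):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Illustrative planar two-link revolute arm with link lengths 1.0 and 0.8.
links = [(0.0, 1.0, 0.0), (0.0, 0.8, 0.0)]     # (d, a, alpha) for each link
T = forward_kinematics(links, q=[np.pi / 4, np.pi / 6])
print(np.round(T[:3, 3], 3))                   # end-effector position [x, y, z]
```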

Cloud robotics is a field of robotics that attempts to invoke cloud technologies such as cloud computing, cloud storage, and other Internet technologies centered on the benefits of converged infrastructure and shared services for robotics. When connected to the cloud, robots can benefit from the powerful computation, storage, and communication resources of modern data centers, which can process and share information from many robots or agents. Humans can also delegate tasks to robots remotely through networks. Cloud computing enables robot systems to be endowed with powerful capabilities while reducing costs, making it possible to build lightweight, low-cost, smarter robots with an intelligent "brain" in the cloud. That "brain" consists of data centers, knowledge bases, task planners, deep learning, information processing, environment models, communication support, and so on.
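
A minimal sketch of the offloading pattern, assuming a hypothetical HTTP planning service (the URL, payload fields, and response schema below are invented for illustration): the robot sends compact sensor data and a goal to the cloud, receives a plan, and falls back to safe onboard behaviour if the connection drops.

```python
import requests

CLOUD_PLANNER_URL = "https://example.com/api/plan"   # hypothetical cloud planning service

def request_plan(robot_id, lidar_scan, goal):
    """Offload path planning: send compact sensor data and a goal, receive waypoints."""
    payload = {"robot": robot_id, "scan": lidar_scan, "goal": goal}
    response = requests.post(CLOUD_PLANNER_URL, json=payload, timeout=2.0)
    response.raise_for_status()
    return response.json()["waypoints"]               # hypothetical response schema

# Onboard code keeps a trivial fallback so the robot degrades gracefully offline.
try:
    waypoints = request_plan("robot-7", lidar_scan=[1.2, 1.1, 0.9], goal=[5.0, 3.0])
except requests.RequestException:
    waypoints = []                                    # e.g. stop and wait for connectivity
```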

Gregory D. Hager

Gregory D. Hager is the Mandell Bellmore Professor of Computer Science and founding director of the Johns Hopkins Malone Center for Engineering in Healthcare at Johns Hopkins University.

Simone Schürle-Finke

Simone Schürle-Finke is a German biomedical engineer, assistant professor, and Principal Investigator for the Responsive Biomedical Systems Laboratory in Switzerland. Schürle is a pioneer in nanorobotic and magnetic servoing technologies.

Margarita Chli

Margarita Chli is an assistant professor and leader of the Vision for Robotics Lab at ETH Zürich in Switzerland. Chli is a leader in the field of computer vision and robotics and was on the team of researchers to develop the first fully autonomous helicopter with onboard localization and mapping. Chli is also the Vice Director of the Institute of Robotics and Intelligent Systems and an Honorary Fellow of the University of Edinburgh in the United Kingdom. Her research currently focuses on developing visual perception and intelligence in flying autonomous robotic systems.

Sonia Martínez Díaz is a Spanish mechanical engineer whose research applies control theory to the coordinated motion of robot swarms and mobile wireless sensor networks. She is a professor in the Department of Mechanical and Aerospace Engineering at the University of California, San Diego.

References

  1. "IEEE RAS Fellow Listing" (PDF).
  2. "Journal of Field Robotics".
  3. "QUT biography profile" . Retrieved 8 September 2013.
  4. Corke, P.; Kirkham, R. "The ARCL Robot Programming Systems". CiteSeerX   10.1.1.45.4558 .
  5. Kassler, Michael (1 December 1994). ""Robosorter": A system for simultaneous sorting of food products". Assembly Automation. 14 (4): 18–20. doi:10.1108/EUM0000000004214. ISSN   0144-5154.
  6. Kassler, Michael; Corke, Peter I.; Wong, Paul C. (1 December 1993). "Automatic grading and packing of prawns". Computers and Electronics in Agriculture. 9 (4): 319–333. doi:10.1016/0168-1699(93)90049-7. ISSN   0168-1699.
  7. Collis, Brad (2002). Fields of Discovery: Australia's CSIRO. Allen&Unwin. p. 336. ISBN   978-1-86508-602-6.
  8. McCabe, Bruce (27 June 2006), "Profit from our big bots to go offshore", The Australian
  9. Hutchinson, S.; Hager, G.; Corke, P. (October 1996), "A tutorial on visual servo control", IEEE Transactions on Robotics and Automation, 12 (5): 651–670, doi:10.1109/70.538972, S2CID   1814423
  10. Corke, P.; Hutchinson, S. (August 2001), "A new partitioned approach to image-based visual servo control", IEEE Transactions on Robotics and Automation, 17 (4): 507–515, doi:10.1109/70.954764, S2CID   18899160
  11. Douglas, Jeanne-Vida (6 December 2005). "Developer keeps computing 'til the cows come home". The Age. Retrieved 24 June 2019.
  12. "Our people". Robotics and Autonomous Systems Group. Retrieved 24 June 2019.
  13. Corke, P.; Wark, T.; Jurdak, R.; Hu, W.; Valencia, P.; Moore, D. (November 2010), "Environmental wireless sensor networks", Proceedings of the IEEE, 98 (11): 1903–1917, doi:10.1109/JPROC.2010.2068530, S2CID   17865564
  14. Butler, Z.; Corke, P.; Peterson, R.; Rus, D. (April 2004), "Virtual fences for controlling cows", Proceedings of the IEEE Conference on Robotics & Automation: 4429–4436
  15. Douglas, Jeanne-Vida (6 December 2005), "Developer keeps computing 'til the cows come home", The Age
  16. McCosker, Amy (17 July 2013). "Farm robots soon to be a reality". ABC Rural. Retrieved 23 June 2019.
  17. "QUT researchers develop new surveillance robots – QUT News" . Retrieved 23 June 2019.
  18. "Online courses put life in robot pilot". www.theaustralian.com.au. 19 January 2015. Retrieved 23 June 2019.
  19. Corke, P. (March 2010). "First Experience as EiC [From the Editor's Desk]". IEEE Robotics & Automation Magazine. 17 (1): 2–119. doi:10.1109/MRA.2010.935800. ISSN 1070-9932.