Nello Cristianini

Born: 1968 (age 54–55)
Alma mater: University of Trieste, University of London, University of Bristol
Known for: Statistical learning, media content analysis, support vector machines, philosophy of artificial intelligence
Awards: Royal Society Wolfson Research Merit Award, ERC Advanced Grant
Fields: Artificial intelligence
Institutions: University of Bath
Website: researchportal.bath.ac.uk/en/persons/nello-cristianini/

Nello Cristianini (born 1968) is a professor of Artificial Intelligence in the Department of Computer Science at the University of Bath. [1]

Education

Cristianini holds a degree in physics from the University of Trieste, a master's degree in computational intelligence from the University of London, and a PhD from the University of Bristol. [2] He was previously a professor of Artificial Intelligence at the University of Bristol and an associate professor at the University of California, Davis, and has held visiting positions at other universities. [3]

Research

His research contributions encompass the fields of machine learning, artificial intelligence and bioinformatics. In particular, his work has focused on the statistical analysis of learning algorithms and on its application to support vector machines, kernel methods and other algorithms. Cristianini is the co-author of two widely known books in machine learning, An Introduction to Support Vector Machines and Kernel Methods for Pattern Analysis, and of a book in bioinformatics, Introduction to Computational Genomics.

Recent research has focused on the philosophical challenges posed by modern artificial intelligence, big-data analysis of newspaper content, and the analysis of social media content. Earlier research focused on statistical pattern analysis, machine learning and artificial intelligence, machine translation, and bioinformatics.

As a practitioner of data-driven AI and machine learning, Cristianini frequently gives public talks about the need for a deeper ethical understanding of the effects of modern data science on society. [4] His book The Shortcut is devoted to the philosophical foundations of artificial intelligence and its potential risks for individuals and society. [5]

Awards and honours

Cristianini is a recipient of the Royal Society Wolfson Research Merit Award and of a European Research Council Advanced Grant. In June 2014, Cristianini was included in a list of the "most influential scientists of the decade" compiled by Thomson Reuters (listing the top one per cent of scientists who are "the world's leading scientific minds" and whose publications are among the most influential in their fields). [6] [7] [8] In December 2016 he was included in the list of the top 100 most influential researchers in machine learning by AMiner. [9] In 2017, Cristianini was the keynote speaker at the Annual STOA Lecture at the European Parliament. [10] From 2020 to 2024 he was a member of the International Advisory Board of STOA (Panel for the Future of Science and Technology of the European Parliament). [11]

Books

An Introduction to Support Vector Machines (with John Shawe-Taylor)
Kernel Methods for Pattern Analysis (with John Shawe-Taylor)
Introduction to Computational Genomics
The Shortcut (CRC Press, 2023)

Related Research Articles

Binary classification is the task of classifying the elements of a set into two groups on the basis of a classification rule. Typical binary classification problems include medical testing to determine whether a patient has a disease, quality control in industry, and deciding whether an email is spam.
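
A minimal sketch of such a rule on made-up data, using a single-feature threshold chosen purely for illustration:

```python
# Minimal sketch of a binary classification rule: a single threshold on one
# measurement separates items into two groups (e.g. "spam" vs "not spam").
# The data, labels and threshold below are invented for illustration.

measurements = [0.2, 0.9, 0.4, 0.75, 0.1, 0.8]   # one feature per item
true_labels  = [0,   1,   0,   1,    0,   1]     # the two classes

threshold = 0.5
predicted = [1 if x > threshold else 0 for x in measurements]

# Fraction of items assigned to the correct group.
accuracy = sum(p == t for p, t in zip(predicted, true_labels)) / len(true_labels)
print(predicted, accuracy)
```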

Machine learning: Study of algorithms that improve automatically through experience

Machine learning (ML) is a field devoted to understanding and building methods that let machines "learn" – that is, methods that leverage data to improve computer performance on some set of tasks.

Text mining, text data mining (TDM) or text analytics is the process of deriving high-quality information from text. It involves "the discovery by computer of new, previously unknown information, by automatically extracting information from different written resources." Written resources may include websites, books, emails, reviews, and articles. High-quality information is typically obtained by devising patterns and trends through means such as statistical pattern learning. According to Hotho et al. (2005), three perspectives of text mining can be distinguished: information extraction, data mining, and the knowledge discovery in databases (KDD) process. Text mining usually involves structuring the input text, deriving patterns within the structured data, and finally evaluating and interpreting the output. 'High quality' in text mining usually refers to some combination of relevance, novelty, and interest. Typical text mining tasks include text categorization, text clustering, concept/entity extraction, production of granular taxonomies, sentiment analysis, document summarization, and entity relation modeling.
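
As an illustration of one typical task, text categorization, the hedged sketch below uses scikit-learn on a tiny invented corpus; the library choice, documents and labels are assumptions made only for the example:

```python
# Illustrative sketch of text categorization: structure raw text as a
# document-term matrix, learn patterns from it, then classify new text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["stock markets fell sharply today",
        "the team won the championship final",
        "shares rallied after the earnings report",
        "the striker scored twice in the match"]
labels = ["finance", "sport", "finance", "sport"]

vectorizer = TfidfVectorizer()          # structuring step: text -> vectors
X = vectorizer.fit_transform(docs)
model = MultinomialNB().fit(X, labels)  # pattern-learning step

print(model.predict(vectorizer.transform(["goals scored in the final"])))
```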

Computational sociology: Branch of the discipline of sociology

Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.

Bernhard Schölkopf is a German computer scientist known for his work in machine learning, especially on kernel methods and causality. He is a director at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, where he heads the Department of Empirical Inference. He is also an affiliated professor at ETH Zürich, honorary professor at the University of Tübingen and the Technical University Berlin, and chairman of the European Laboratory for Learning and Intelligent Systems (ELLIS).

The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.

Kernel method: Class of algorithms for pattern analysis

In machine learning, kernel machines are a class of algorithms for pattern analysis, whose best known member is the support-vector machine (SVM). These methods use linear classifiers to solve nonlinear problems. The general task of pattern analysis is to find and study general types of relations in datasets. For many algorithms that solve these tasks, the data in raw representation have to be explicitly transformed into feature vector representations via a user-specified feature map; in contrast, kernel methods require only a user-specified kernel, i.e., a similarity function over all pairs of data points computed using inner products. Although the feature map may be infinite-dimensional, the representer theorem guarantees that only a finite-dimensional matrix of kernel evaluations over the training data is needed. Kernel machines are slow to compute for datasets larger than a couple of thousand examples without parallel processing.
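
The sketch below illustrates the central idea on invented data: a Gaussian (RBF) kernel plays the role of the user-specified similarity function, and the Gram matrix of all pairwise kernel values is the only representation an algorithm such as an SVM would need; the data points and kernel parameter are chosen purely for illustration.

```python
# Sketch of the kernel idea: no explicit feature map, only a similarity
# function k(x, y) evaluated over all pairs of data points (the Gram matrix).
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel: an inner product in an implicit feature space."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])

# Gram matrix K[i, j] = k(X[i], X[j]) -- what a kernel machine consumes
# in place of explicit feature vectors.
K = np.array([[rbf_kernel(xi, xj) for xj in X] for xi in X])
print(np.round(K, 3))
```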

Vasant G. Honavar is an Indian-born American computer scientist and professor whose research spans artificial intelligence, machine learning, big data, data science, causal inference, knowledge representation, bioinformatics and health informatics.

John Shawe-Taylor

John Stewart Shawe-Taylor is Director of the Centre for Computational Statistics and Machine Learning at University College London (UK). His main research area is statistical learning theory. He has contributed to a number of fields ranging from graph theory through cryptography to statistical learning theory and its applications. However, his main contributions have been in the development of the analysis and subsequent algorithmic definition of principled machine learning algorithms founded in statistical learning theory. This work has helped to drive a fundamental rebirth in the field of machine learning with the introduction of kernel methods and support vector machines, including the mapping of these approaches onto novel domains including work in computer vision, document classification and brain scan analysis. More recently he has worked on interactive learning and reinforcement learning. He has also been instrumental in assembling a series of influential European Networks of Excellence. The scientific coordination of these projects has influenced a generation of researchers and promoted the widespread uptake of machine learning in both science and industry that we are currently witnessing. He has published over 300 papers with over 42,000 citations. Two books co-authored with Nello Cristianini have become standard monographs for the study of kernel methods and support vector machines and together have attracted 21,000 citations. He is Head of the Computer Science Department at University College London, where he has overseen a significant expansion and witnessed its emergence as the highest ranked Computer Science Department in the UK in the 2014 UK Research Excellence Framework (REF).

In statistical classification, the Fisher kernel, named after Ronald Fisher, is a function that measures the similarity of two objects on the basis of sets of measurements for each object and a statistical model. In a classification procedure, the class for a new object can be estimated by minimising, across classes, an average of the Fisher kernel distance from the new object to each known member of the given class.
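
In symbols (notation chosen here for illustration), writing the Fisher score of an object x under a generative model P(x | θ) as U_x, and the Fisher information matrix as I, the Fisher kernel is usually defined as:

```latex
% Fisher kernel: standard definition, with notation chosen here for illustration.
\[
  U_x = \nabla_{\theta} \log P(x \mid \theta), \qquad
  K(x_i, x_j) = U_{x_i}^{\top}\, \mathcal{I}^{-1}\, U_{x_j}, \qquad
  \mathcal{I} = \mathbb{E}_x\!\left[ U_x U_x^{\top} \right].
\]
```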

Zoubin Ghahramani: British-Iranian machine learning researcher

Zoubin Ghahramani FRS is a British-Iranian researcher and Professor of Information Engineering at the University of Cambridge. He holds joint appointments at University College London and the Alan Turing Institute, and has been a Fellow of St John's College, Cambridge, since 2009. He was Associate Research Professor at Carnegie Mellon University School of Computer Science from 2003 to 2012. He was also the Chief Scientist of Uber from 2016 until 2020. He joined Google Brain in 2020 as senior research director. He is also Deputy Director of the Leverhulme Centre for the Future of Intelligence.

Léon Bottou is a researcher best known for his work in machine learning and data compression. His work presents stochastic gradient descent as a fundamental learning algorithm. He is also one of the main creators of the DjVu image compression technology, and the maintainer of DjVuLibre, the open source implementation of DjVu. He is the original developer of the Lush programming language.

In machine learning and data mining, a string kernel is a kernel function that operates on strings, i.e. finite sequences of symbols that need not be of the same length. String kernels can be intuitively understood as functions measuring the similarity of pairs of strings: the more similar two strings a and b are, the higher the value of a string kernel K(a, b) will be.
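
One concrete example is the p-spectrum kernel, which scores a pair of strings by the contiguous length-p substrings they share; the sketch below is a minimal illustration, and the choice of this particular kernel and of the test strings are assumptions of the example:

```python
# Minimal sketch of a simple string kernel (the p-spectrum kernel): two
# strings are similar if they share many contiguous substrings of length p.
# The strings need not have the same length.
from collections import Counter

def spectrum_kernel(a, b, p=2):
    """Count matching length-p substrings between strings a and b."""
    counts_a = Counter(a[i:i + p] for i in range(len(a) - p + 1))
    counts_b = Counter(b[i:i + p] for i in range(len(b) - p + 1))
    return sum(counts_a[s] * counts_b[s] for s in counts_a if s in counts_b)

print(spectrum_kernel("machine", "marine"))  # shares "ma", "in", "ne" -> 3
print(spectrum_kernel("machine", "xyz"))     # no shared bigrams -> 0
```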

Sepp Hochreiter: German computer scientist

Josef "Sepp" Hochreiter is a German computer scientist. Since 2018 he has led the Institute for Machine Learning at the Johannes Kepler University of Linz after having led the Institute of Bioinformatics from 2006 to 2018. In 2017 he became the head of the Linz Institute of Technology (LIT) AI Lab. Hochreiter is also a founding director of the Institute of Advanced Research in Artificial Intelligence (IARAI). Previously, he was at the Technical University of Berlin, at the University of Colorado at Boulder, and at the Technical University of Munich. He is a chair of the Critical Assessment of Massive Data Analysis (CAMDA) conference.

mlpy is an open-source Python machine learning library built on top of NumPy/SciPy and the GNU Scientific Library, and it makes extensive use of the Cython language. mlpy provides a wide range of state-of-the-art machine learning methods for supervised and unsupervised problems and aims at a reasonable compromise among modularity, maintainability, reproducibility, usability and efficiency. mlpy is multiplatform, works with Python 2 and 3, and is distributed under the GPLv3.

Social machine

A social machine is an environment comprising humans and technology interacting and producing outputs or action which would not be possible without both parties present. It can also be regarded as a machine, in which specific tasks are performed by human participants, whose interaction is mediated by an infrastructure. The growth of social machines has been greatly enabled by technologies such as the Internet, the smartphone, social media and the World Wide Web, by connecting people in new ways.

Word embedding: Method in natural language processing

In natural language processing (NLP), a word embedding is a representation of a word. The embedding is used in text analysis. Typically, the representation is a real-valued vector that encodes the meaning of the word in such a way that words that are closer in the vector space are expected to be similar in meaning. Word embeddings can be obtained using language modeling and feature learning techniques, where words or phrases from the vocabulary are mapped to vectors of real numbers.
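
A minimal sketch of this property, assuming tiny invented 3-dimensional vectors (real embeddings are learned from data and typically have hundreds of dimensions):

```python
# Sketch of the key property of word embeddings: each word is a real-valued
# vector, and words with similar meanings should have nearby vectors.
# The vectors below are invented for illustration only.
import numpy as np

embedding = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine(u, v):
    """Cosine similarity: a standard measure of closeness in embedding space."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embedding["king"], embedding["queen"]))  # high: related meanings
print(cosine(embedding["king"], embedding["apple"]))  # low: unrelated words
```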

Klaus-Robert Müller is a German computer scientist and physicist, most noted for his work in machine learning and brain–computer interfaces.

Outline of machine learning: Overview of and topical guide to machine learning

The following outline is provided as an overview of and topical guide to machine learning. Machine learning is a subfield of soft computing within computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. In 1959, Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed". Machine learning explores the study and construction of algorithms that can learn from and make predictions on data. Such algorithms operate by building a model from an example training set of input observations in order to make data-driven predictions or decisions expressed as outputs, rather than following strictly static program instructions.

References

  1. Nello Cristianini publications indexed by Google Scholar
  2. "Nello Cristianini". Mathematics Genealogy Project. Retrieved 17 December 2022.
  3. "Nello Cristianini". www.cs.berkeley.edu. Archived from the original on 5 December 2001. Retrieved 19 April 2022.
  4. "The road to artificial intelligence: A case of data over theory". New Scientist. Retrieved 26 December 2016.
  5. Cristianini, Nello (2023). The Shortcut. CRC Press. ISBN   9781032305097.
  6. "Bristol University fab five among world's finest scientists | Bristol Post". www.bristolpost.co.uk. Archived from the original on 19 August 2014. Retrieved 19 April 2022.
  7. "Bristol University | News | August: World's leading scientific minds". www.bristol.ac.uk. Archived from the original on 19 August 2014. Retrieved 19 April 2022.
  8. "Home". highlycited.com.
  9. "AMiner".
  10. "How should we manage media in the age of artificial intelligence? | Epthinktank | European Parliament". epthinktank.eu. Archived from the original on 23 September 2021. Retrieved 19 April 2022.
  11. "STOA International Advisory Board 2020-2024". 20 May 2022. Archived from the original on 20 May 2022. Retrieved 20 May 2022.