Nello Cristianini

Nello Cristianini
Born: 1968 (age 55–56)
Alma mater: University of Trieste; Royal Holloway, University of London; University of Bristol
Known for: Statistical learning, media content analysis, support vector machines, philosophy of artificial intelligence
Awards: Royal Society Wolfson Research Merit Award, ERC Advanced Grant
Scientific career
Fields: Artificial intelligence
Institutions: University of Bath
Website: researchportal.bath.ac.uk/en/persons/nello-cristianini/

Nello Cristianini (born 1968) is a professor of Artificial Intelligence in the Department of Computer Science at the University of Bath.[1]

Education

Cristianini holds a degree in physics from the University of Trieste, a master's degree in computational intelligence from Royal Holloway, University of London, and a PhD from the University of Bristol.[2] He was previously a professor of Artificial Intelligence at the University of Bristol and an associate professor at the University of California, Davis, and has held visiting positions at other universities.[3]

Research

His research contributions encompass the fields of machine learning, artificial intelligence and bioinformatics. In particular, his work has focused on the statistical analysis of learning algorithms and its application to support vector machines, kernel methods and other algorithms. Cristianini is the co-author of two widely known books in machine learning, "An Introduction to Support Vector Machines" and "Kernel Methods for Pattern Analysis", and of a book in bioinformatics, "Introduction to Computational Genomics".

His recent research has focused on the philosophical challenges posed by modern artificial intelligence, on the big-data analysis of newspaper content, and on the analysis of social media content. Earlier research focused on statistical pattern analysis, machine learning and artificial intelligence, machine translation, and bioinformatics.

As a practitioner of data-driven AI and machine learning, Cristianini frequently gives public talks on the need for a deeper ethical understanding of the effects of modern data science on society.[4] His book "The Shortcut" is devoted to the philosophical foundations of artificial intelligence and its potential risks for individuals and society.[5]

Awards and honours

Cristianini is a recipient of the Royal Society Wolfson Research Merit Award and of a European Research Council Advanced Grant. In June 2014, Cristianini was included in a list of the "most influential scientists of the decade" compiled by Thomson Reuters, which lists the top one per cent of scientists, "the world's leading scientific minds", whose publications are among the most influential in their fields.[6][7][8] In December 2016 he was included in AMiner's list of the top 100 most influential researchers in machine learning.[9] In 2017, Cristianini was the keynote speaker at the Annual STOA Lecture at the European Parliament.[10] From 2020 to 2024 he was a member of the International Advisory Board of STOA (the Panel for the Future of Science and Technology of the European Parliament).[11]

Books

An Introduction to Support Vector Machines (with John Shawe-Taylor)
Kernel Methods for Pattern Analysis (with John Shawe-Taylor)
Introduction to Computational Genomics
The Shortcut (CRC Press, 2023)

Related Research Articles

Binary classification

Binary classification is the task of classifying the elements of a set into one of two groups. Typical binary classification problems include medical testing to determine whether a patient has a certain disease, quality control in industry, and deciding whether an email is spam.
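As a minimal illustration of the idea (not drawn from the article above), the sketch below trains a binary classifier on a toy one-dimensional dataset using scikit-learn's logistic regression; the data and thresholds are invented purely for the example.

```python
# Minimal binary-classification sketch (illustrative; assumes scikit-learn and NumPy).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: one feature, two classes (0 and 1).
X = np.array([[0.2], [0.5], [0.9], [1.4], [1.8], [2.3]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)   # learn a decision boundary from labelled examples
print(clf.predict([[0.3], [2.0]]))     # assigns each new point to one of the two groups
```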

Machine learning (ML) is a field of study in artificial intelligence concerned with the development and study of statistical algorithms that can learn from data and generalize to unseen data and thus perform tasks without explicit instructions. Recently, artificial neural networks have been able to surpass many previous approaches in performance.

Text mining, text data mining (TDM) or text analytics is the process of deriving high-quality information from text. It involves "the discovery by computer of new, previously unknown information, by automatically extracting information from different written resources." Written resources may include websites, books, emails, reviews, and articles. High-quality information is typically obtained by devising patterns and trends by means such as statistical pattern learning. According to Hotho et al. (2005) we can distinguish between three different perspectives of text mining: information extraction, data mining, and a knowledge discovery in databases (KDD) process. Text mining usually involves the process of structuring the input text, deriving patterns within the structured data, and finally evaluation and interpretation of the output. 'High quality' in text mining usually refers to some combination of relevance, novelty, and interest. Typical text mining tasks include text categorization, text clustering, concept/entity extraction, production of granular taxonomies, sentiment analysis, document summarization, and entity relation modeling.
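The three steps mentioned above (structuring the input text, deriving a pattern, and evaluating the output) can be illustrated with a small text-categorization sketch; the documents, labels, and the choice of TF-IDF features plus a Naive Bayes classifier below are assumptions made purely for illustration, assuming scikit-learn is available.

```python
# Text-categorization sketch (one typical text-mining task); assumes scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["stock markets rallied today", "the team won the final match",
        "shares fell after the earnings report", "the striker scored twice"]
labels = ["finance", "sport", "finance", "sport"]

# Step 1: structure the input text as a numeric (TF-IDF) matrix.
vec = TfidfVectorizer()
X = vec.fit_transform(docs)

# Step 2: derive a pattern (a Naive Bayes text categorizer) from the structured data.
clf = MultinomialNB().fit(X, labels)

# Step 3: apply the model to new text and inspect the output.
print(clf.predict(vec.transform(["goal in the last minute"])))  # predicted category for unseen text
```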

Computational sociology

Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena. Using computer simulations, artificial intelligence, complex statistical methods, and analytic approaches like social network analysis, computational sociology develops and tests theories of complex social processes through bottom-up modeling of social interactions.

Bernhard Schölkopf is a German computer scientist known for his work in machine learning, especially on kernel methods and causality. He is a director at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, where he heads the Department of Empirical Inference. He is also an affiliated professor at ETH Zürich, honorary professor at the University of Tübingen and Technische Universität Berlin, and chairman of the European Laboratory for Learning and Intelligent Systems (ELLIS).

In machine learning, kernel machines are a class of algorithms for pattern analysis, whose best known member is the support-vector machine (SVM). These methods involve using linear classifiers to solve nonlinear problems. The general task of pattern analysis is to find and study general types of relations in datasets. For many algorithms that solve these tasks, the data in raw representation have to be explicitly transformed into feature vector representations via a user-specified feature map; in contrast, kernel methods require only a user-specified kernel, i.e., a similarity function over all pairs of data points computed using inner products. By the representer theorem, the feature map, which may be infinite-dimensional, never needs to be computed explicitly: only the finite-dimensional matrix of kernel values between the data points is required. Without parallel processing, kernel machines become slow to train on datasets larger than a few thousand examples.
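A brief sketch of the central point, namely that a kernel method needs only pairwise similarities: an SVM can be trained on a precomputed Gram matrix without ever forming explicit feature vectors. The data and the Gaussian (RBF) kernel below are illustrative assumptions; scikit-learn and NumPy are assumed to be available.

```python
# SVM on a precomputed kernel (Gram) matrix: only pairwise similarities are needed.
import numpy as np
from sklearn.svm import SVC

X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.1], [1.2, 0.9]])
y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0.1, 0.0], [1.1, 1.0]])

def rbf_kernel(A, B, gamma=1.0):
    """User-specified similarity: k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

K_train = rbf_kernel(X_train, X_train)   # (n_train, n_train) Gram matrix
K_test = rbf_kernel(X_test, X_train)     # (n_test, n_train) similarities to training points

clf = SVC(kernel="precomputed").fit(K_train, y_train)
print(clf.predict(K_test))               # class labels computed from kernel values only
```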

Vasant G. Honavar is an Indian-American computer scientist and professor whose research spans artificial intelligence, machine learning, big data, data science, causal inference, knowledge representation, bioinformatics and health informatics.

John Shawe-Taylor (born 1953)

John Stewart Shawe-Taylor is Director of the Centre for Computational Statistics and Machine Learning at University College London (UK). His main research area is statistical learning theory. He has contributed to a number of fields ranging from graph theory through cryptography to statistical learning theory and its applications. However, his main contributions have been in the development of the analysis and subsequent algorithmic definition of principled machine learning algorithms founded in statistical learning theory. This work has helped to drive a fundamental rebirth of the field of machine learning with the introduction of kernel methods and support vector machines, including the mapping of these approaches onto novel domains such as computer vision, document classification and brain scan analysis. More recently he has worked on interactive learning and reinforcement learning. He has also been instrumental in assembling a series of influential European Networks of Excellence; the scientific coordination of these projects has influenced a generation of researchers and promoted the widespread uptake of machine learning in both science and industry. He has published over 300 papers with over 42,000 citations. Two books co-authored with Nello Cristianini have become standard monographs for the study of kernel methods and support vector machines, and together have attracted 21,000 citations. He was Head of the Computer Science Department at University College London from 2010 to 2019, where he oversaw a significant expansion and its emergence as the highest-ranked computer science department in the UK in the 2014 Research Excellence Framework (REF).

In statistical classification, the Fisher kernel, named after Ronald Fisher, is a function that measures the similarity of two objects on the basis of sets of measurements for each object and a statistical model. In a classification procedure, the class for a new object can be estimated by minimising, across classes, an average of the Fisher kernel distance from the new object to each known member of the given class.
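For concreteness, the following minimal sketch computes a Fisher kernel under a one-dimensional Gaussian model with known variance: the Fisher score of an observation is the gradient of the log-likelihood with respect to the mean, and the kernel is the Fisher-information-weighted inner product of two scores. The model and its parameters are illustrative assumptions.

```python
# Fisher-kernel sketch for a 1-D Gaussian with fixed variance (illustrative assumption).
import numpy as np

mu, sigma2 = 0.0, 1.0                 # parameters of the fitted generative model

def fisher_score(x):
    """Gradient of log N(x | mu, sigma2) with respect to mu."""
    return (x - mu) / sigma2

fisher_information = 1.0 / sigma2     # Fisher information for the mean parameter

def fisher_kernel(x, y):
    """K(x, y) = U_x * I^{-1} * U_y, where U is the scalar Fisher score."""
    return fisher_score(x) * (1.0 / fisher_information) * fisher_score(y)

print(fisher_kernel(0.5, -0.3))       # similarity of two observations under the model
```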

Zoubin Ghahramani

Zoubin Ghahramani FRS is a British-Iranian researcher and Professor of Information Engineering at the University of Cambridge. He holds joint appointments at University College London and the Alan Turing Institute, and has been a Fellow of St John's College, Cambridge since 2009. He was Associate Research Professor at the Carnegie Mellon University School of Computer Science from 2003 to 2012, and was the Chief Scientist of Uber from 2016 until 2020. He joined Google Brain in 2020 as senior research director. He is also Deputy Director of the Leverhulme Centre for the Future of Intelligence.

David Haussler

David Haussler is an American bioinformatician known for leading the team that assembled the first human genome sequence in the race to complete the Human Genome Project, and subsequently for comparative genome analysis that deepens understanding of the molecular function and evolution of the genome.

Léon Bottou is a researcher best known for his work in machine learning and data compression. His work presents stochastic gradient descent as a fundamental learning algorithm. He is also one of the main creators of the DjVu image compression technology, and the maintainer of DjVuLibre, the open source implementation of DjVu. He is the original developer of the Lush programming language.

In machine learning and data mining, a string kernel is a kernel function that operates on strings, i.e. finite sequences of symbols that need not be of the same length. String kernels can be intuitively understood as functions measuring the similarity of pairs of strings: the more similar two strings a and b are, the higher the value of a string kernel K(a, b) will be.
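As an illustrative sketch (one simple member of the family, not a specific kernel discussed above), the k-spectrum kernel counts shared length-k substrings, so more similar strings receive a higher kernel value.

```python
# Simple k-spectrum string kernel: K(a, b) = number of shared length-k substrings (with multiplicity).
# Illustrative sketch only; many other string kernels exist.
from collections import Counter

def spectrum(s, k=2):
    """Count all length-k substrings of s."""
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

def string_kernel(a, b, k=2):
    """Inner product of the two k-mer count vectors."""
    ca, cb = spectrum(a, k), spectrum(b, k)
    return sum(ca[g] * cb[g] for g in ca)

print(string_kernel("machine", "marine"))   # higher value: the strings share several 2-mers
print(string_kernel("machine", "kernel"))   # lower value: fewer shared 2-mers
```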

Sepp Hochreiter

Josef "Sepp" Hochreiter is a German computer scientist. Since 2018 he has led the Institute for Machine Learning at the Johannes Kepler University of Linz after having led the Institute of Bioinformatics from 2006 to 2018. In 2017 he became the head of the Linz Institute of Technology (LIT) AI Lab. Hochreiter is also a founding director of the Institute of Advanced Research in Artificial Intelligence (IARAI). Previously, he was at Technische Universität Berlin, at University of Colorado Boulder, and at the Technical University of Munich. He is a chair of the Critical Assessment of Massive Data Analysis (CAMDA) conference.

Computational social science is an interdisciplinary academic sub-field concerned with computational approaches to the social sciences. This means that computers are used to model, simulate, and analyze social phenomena. It has been applied in areas such as computational economics, computational sociology, computational media analysis, cliodynamics, culturomics, and nonprofit studies. It focuses on investigating social and behavioral relationships and interactions using data science approaches, network analysis, social simulation and studies using interactive systems.

Social machine

A social machine is an environment comprising humans and technology interacting and producing outputs or action which would not be possible without both parties present. It can also be regarded as a machine, in which specific tasks are performed by human participants, whose interaction is mediated by an infrastructure. The growth of social machines has been greatly enabled by technologies such as the Internet, the smartphone, social media and the World Wide Web, by connecting people in new ways.

In natural language processing (NLP), a word embedding is a representation of a word. The embedding is used in text analysis. Typically, the representation is a real-valued vector that encodes the meaning of the word in such a way that the words that are closer in the vector space are expected to be similar in meaning. Word embeddings can be obtained using language modeling and feature learning techniques, where words or phrases from the vocabulary are mapped to vectors of real numbers.
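A toy sketch of the "closer in vector space means closer in meaning" idea: cosine similarity between hand-made three-dimensional vectors. Real embeddings are learned from large corpora; the vectors below are invented for illustration.

```python
# Toy word-embedding sketch: nearby vectors are expected to be similar in meaning.
import numpy as np

embeddings = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.75, 0.20]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine(u, v):
    """Cosine similarity: 1 means same direction, 0 means orthogonal."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```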

Multiple kernel learning refers to a set of machine learning methods that use a predefined set of kernels and learn an optimal linear or non-linear combination of kernels as part of the algorithm. Reasons to use multiple kernel learning include a) the ability to select for an optimal kernel and parameters from a larger set of kernels, reducing bias due to kernel selection while allowing for more automated machine learning methods, and b) combining data from different sources that have different notions of similarity and thus require different kernels. Instead of creating a new kernel, multiple kernel algorithms can be used to combine kernels already established for each individual data source.
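A minimal sketch of the combination step described above: two Gram matrices reflecting different notions of similarity are merged as a weighted sum and passed to a kernel machine. A full multiple kernel learning method would also learn the weights, which are fixed here as an illustrative assumption; scikit-learn and NumPy are assumed to be available.

```python
# Combining two kernels as a weighted sum (the combination step of multiple kernel learning).
# Illustrative sketch; a full MKL algorithm would learn the weights rather than fix them.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [1.1, 0.9]])
y = np.array([0, 0, 1, 1])

K_linear = X @ X.T                                    # kernel from one notion of similarity
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-sq)                                   # kernel from another notion of similarity

w1, w2 = 0.3, 0.7                                     # fixed weights (MKL would optimise these)
K_combined = w1 * K_linear + w2 * K_rbf               # a valid kernel: convex combination of kernels

clf = SVC(kernel="precomputed").fit(K_combined, y)
print(clf.predict(K_combined))                        # predictions on the training points themselves
```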

An outline of machine learning provides an overview of and topical guide to the field.

References

  1. Nello Cristianini publications indexed by Google Scholar
  2. "Nello Cristianini". Mathematics Genealogy Project. Retrieved 17 December 2022.
  3. "Nello Cristianini". www.cs.berkeley.edu. Archived from the original on 5 December 2001. Retrieved 19 April 2022.
  4. "The road to artificial intelligence: A case of data over theory". New Scientist. Retrieved 26 December 2016.
  5. Cristianini, Nello (2023). The Shortcut. CRC Press. ISBN 9781032305097.
  6. "Bristol University fab five among world's finest scientists | Bristol Post". www.bristolpost.co.uk. Archived from the original on 19 August 2014. Retrieved 19 April 2022.
  7. "Bristol University | News | August: World's leading scientific minds". www.bristol.ac.uk. Archived from the original on 19 August 2014. Retrieved 19 April 2022.
  8. "Home". highlycited.com.
  9. "AMiner".
  10. "How should we manage media in the age of artificial intelligence? | Epthinktank | European Parliament". epthinktank.eu. Archived from the original on 23 September 2021. Retrieved 19 April 2022.
  11. "STOA International Advisory Board 2020-2024". 20 May 2022. Archived from the original on 20 May 2022. Retrieved 20 May 2022.