Sara Solla

Sara A. Solla is an Argentine-American physicist and neuroscientist whose research applies ideas from statistical mechanics to problems involving neural networks, machine learning, and neuroscience. She is a professor of physics and of physiology at Northwestern University.[1][2]

Education and career

Solla is originally from Buenos Aires, and earned a licenciatura in physics in 1974 from the University of Buenos Aires. She completed a Ph.D. in physics in 1982 at the University of Washington.

She became a postdoctoral researcher at Cornell University and at the Thomas J. Watson Research Center of IBM Research. Influenced to work in neural networks by a talk from John Hopfield at Cornell, she became a researcher in the neural networks group at Bell Labs. She took her present position at Northwestern University in 1997.

Recognition

Solla is a member of the American Academy of Arts and Sciences and a Fellow of the American Physical Society, Division of Biological Physics, "for applications of statistical physics to problems concerning learning, adaptation, and information coding in neural systems".

Related Research Articles

Artificial intelligence: Intelligence of machines or software

Artificial intelligence (AI) is the intelligence of machines or software, as opposed to the intelligence of humans or animals. It is also the field of study in computer science that develops and studies intelligent machines. "AI" may also refer to the machines themselves.

Artificial neural network: Computational model used in machine learning, based on connected, hierarchical functions

Artificial neural networks are a class of machine learning models built on principles of neuronal organization, studied in connectionism, that are inspired by the biological neural networks constituting animal brains.

Perceptron: Algorithm for supervised learning of binary classifiers

In machine learning, the perceptron is an algorithm for supervised learning of binary classifiers. A binary classifier is a function which can decide whether or not an input, represented by a vector of numbers, belongs to some specific class. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector.
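The linear predictor described above can be sketched in a few lines; this is a minimal illustration of the classic learning rule, with function names and data chosen here for the example rather than taken from any particular library:

```python
import numpy as np

def perceptron_train(X, y, epochs=100, lr=1.0):
    """Rosenblatt's perceptron rule: nudge the weights toward each
    misclassified example. Labels y are assumed to be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:  # misclassified (or on the boundary)
                w += lr * yi * xi
                b += lr * yi
    return w, b

def perceptron_predict(X, w, b):
    # Linear predictor: sign of the weighted sum of the features plus bias
    return np.where(X @ w + b > 0, 1, -1)

# Learn the logical AND of two binary inputs, a linearly separable problem
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
```

By the perceptron convergence theorem, this loop is guaranteed to find a separating hyperplane only when the data are linearly separable, which is why the AND function works here while XOR would not.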

Machine learning: Study of algorithms that improve automatically through experience

Machine learning (ML) is a field of study in artificial intelligence concerned with the development and study of statistical algorithms that can effectively generalize and thus perform tasks without explicit instructions. Recently, generative artificial neural networks have been able to surpass many previous approaches in performance. Machine learning approaches have been applied to large language models, computer vision, speech recognition, email filtering, agriculture, and medicine, where developing explicit algorithms for the needed tasks would be too costly.

John Joseph Hopfield is an American scientist most widely known for his invention of an associative neural network in 1982. It is now more commonly known as the Hopfield network.
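The associative network mentioned above is described here only in passing; as a minimal sketch of its standard formulation (Hebbian outer-product storage and sign-threshold updates), with illustrative names and data:

```python
import numpy as np

def hopfield_store(patterns):
    """Hebbian rule: W = (1/n) * sum over patterns of p p^T, zero diagonal.
    Each row of `patterns` is a stored pattern of +/-1 values."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)
    return W

def hopfield_recall(W, probe, steps=10):
    """Iterate s <- sign(W s); for a small number of stored patterns the
    dynamics settle into the nearest stored attractor."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s

# Store one 8-unit pattern, corrupt one bit, and recover the original
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hopfield_store(pattern.reshape(1, -1))
noisy = pattern.copy()
noisy[0] = -noisy[0]
recovered = hopfield_recall(W, noisy)
```

This content-addressable behavior, retrieving a complete stored pattern from a corrupted cue, is what makes the network "associative".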

Alberto Calderón: Argentine mathematician

Alberto Pedro Calderón was an Argentine mathematician. His name is associated with the University of Buenos Aires, but first and foremost with the University of Chicago, where Calderón and his mentor, the analyst Antoni Zygmund, developed the theory of singular integral operators. This created the "Chicago School of (hard) Analysis".

Terrence Joseph Sejnowski is the Francis Crick Professor at the Salk Institute for Biological Studies, where he directs the Computational Neurobiology Laboratory and is the director of the Crick-Jacobs Center for theoretical and computational biology. He has performed pioneering research in neural networks and computational neuroscience.

Neural network: Structure in biology and artificial intelligence

A neural network is a neural circuit of biological neurons, sometimes also called a biological neural network, or a network of artificial neurons or nodes in the case of an artificial neural network.

Frank Rosenblatt was an American psychologist notable in the field of artificial intelligence. He is sometimes called the father of deep learning for his pioneering work on neural networks.

Eduardo D. Sontag: Argentine-American mathematician

Eduardo Daniel Sontag is an Argentine-American mathematician and distinguished university professor at Northeastern University who works in the fields of control theory, dynamical systems, systems molecular biology, cancer and immunology, theoretical computer science, neural networks, and computational biology.

Juan G. Roederer is a professor of physics emeritus at the University of Alaska Fairbanks (UAF). His research fields are space physics, psychoacoustics, science policy, and information theory. He conducted pioneering research on solar cosmic rays, on the theory of Earth's radiation belts, and on neural networks for pitch processing, and currently works on the foundations of information theory. He is also an accomplished organist.

Yann LeCun: French computer scientist (born 1960)

Yann André LeCun is a Turing Award-winning French computer scientist working primarily in the fields of machine learning, computer vision, mobile robotics, and computational neuroscience. He is the Silver Professor of the Courant Institute of Mathematical Sciences at New York University and Vice-President, Chief AI Scientist at Meta.

Deep learning: Branch of machine learning

Deep learning is the subset of machine learning methods which are based on artificial neural networks with representation learning. The adjective "deep" in deep learning refers to the use of multiple layers in the network. Methods used can be either supervised, semi-supervised or unsupervised.

Google Brain was a deep learning artificial intelligence research team under the umbrella of Google AI, a research division at Google dedicated to artificial intelligence. Formed in 2011, Google Brain combined open-ended machine learning research with information systems and large-scale computing resources. The team has created tools such as TensorFlow, which allow for neural networks to be used by the public, with multiple internal AI research projects. The team aims to create research opportunities in machine learning and natural language processing. The team was merged into former Google sister company DeepMind to form Google DeepMind in April 2023.

Eric Xing

Eric Poe Xing is an American computer scientist whose research spans machine learning, computational biology, and statistical methodology. Xing is founding President of the world’s first artificial intelligence university, Mohamed bin Zayed University of Artificial Intelligence (MBZUAI).

Alicia Dickenstein: Argentine mathematician

Alicia Dickenstein is an Argentine mathematician known for her work on algebraic geometry, particularly toric geometry, tropical geometry, and their applications to biological systems. She is a full professor at the University of Buenos Aires, a 2019 Fellow of the American Mathematical Society, a former vice-president of the International Mathematical Union (2015–2018), and a 2015 recipient of The World Academy of Sciences prize.

Jean Marie Carlson is a professor of complexity at the University of California, Santa Barbara. She studies robustness and feedback in highly connected complex systems, which have applications in a variety of areas including earthquakes, wildfires and neuroscience.

Kanaka Rajan: Indian-American computational neuroscientist

Kanaka Rajan is a computational neuroscientist in the Department of Neurobiology at Harvard Medical School and founding faculty in the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard University. Rajan trained in engineering, biophysics, and neuroscience, and has pioneered novel methods and models to understand how the brain processes sensory information. Her research seeks to understand how important cognitive functions — such as learning, remembering, and deciding — emerge from the cooperative activity of multi-scale neural processes, and how those processes are affected by various neuropsychiatric disease states. The resulting integrative theories about the brain bridge neurobiology and artificial intelligence.

Isabelle Guyon is a French-born researcher in machine learning known for her work on support-vector machines, artificial neural networks and bioinformatics. She is a Chair Professor at the University of Paris-Saclay.

Eun-Ah Kim is a Korean-American condensed matter physicist interested in high-temperature superconductivity, topological order, strange metals, and the use of neural network based machine learning to recognize patterns in these systems. She is a professor of physics at Cornell University.

References

  1. Sara A. Solla, Northwestern University Department of Physics and Astronomy, retrieved 2021-11-16
  2. Sara A. Solla, Northwestern University Feinberg School of Medicine, retrieved 2021-11-16