IEEE Transactions on Neural Networks and Learning Systems

Related Research Articles

Artificial neural network: computational model used in machine learning, based on connected, hierarchical functions

Artificial neural networks (ANNs), usually simply called neural networks (NNs) or neural nets, are computing systems inspired by the biological neural networks that constitute animal brains.

Neuroevolution, or neuro-evolution, is a form of artificial intelligence that uses evolutionary algorithms to generate artificial neural networks (ANNs), their parameters, and rules. It is most commonly applied in artificial life, general game playing and evolutionary robotics. The main benefit is that neuroevolution can be applied more widely than supervised learning algorithms, which require a syllabus of correct input-output pairs. In contrast, neuroevolution requires only a measure of a network's performance at a task. For example, the outcome of a game can be easily measured without providing labeled examples of desired strategies. Neuroevolution is commonly used as part of the reinforcement learning paradigm, and it can be contrasted with conventional deep learning techniques that use gradient descent on a neural network with a fixed topology.
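The contrast with gradient-based training can be made concrete with a toy sketch: a fixed-topology network whose weights are evolved purely from a scalar fitness score rather than labeled gradients. Everything here (the 2-2-1 topology, the XOR task, population sizes) is an illustrative assumption, not a standard benchmark setup:

```python
import math
import random

# Toy task: evolve the 9 weights of a fixed 2-2-1 network to
# approximate XOR. Only a fitness score (negative squared error)
# guides the search; no per-example gradients are used.

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    # Two tanh hidden units, one sigmoid output unit.
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return 1 / (1 + math.exp(-(w[6] * h0 + w[7] * h1 + w[8])))

def fitness(w):
    return -sum((forward(w, x) - y) ** 2 for x, y in XOR)

def evolve(pop_size=50, generations=200, sigma=0.5, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 5]              # truncation selection
        # Refill the population with Gaussian mutations of elites;
        # elites are carried over, so the best fitness never worsens.
        pop = elite + [
            [w + rng.gauss(0, sigma) for w in rng.choice(elite)]
            for _ in range(pop_size - len(elite))
        ]
    return max(pop, key=fitness)

best = evolve()
```

A real neuroevolution system (e.g. NEAT) would typically also evolve the topology, not just the weights; this sketch keeps the topology fixed to isolate the fitness-driven search.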

Neuromorphic engineering, also known as neuromorphic computing, is the use of electronic circuits to mimic neuro-biological architectures present in the nervous system. A neuromorphic computer or chip is any device that uses physical artificial neurons to do computations. In recent times, the term neuromorphic has been used to describe analog, digital, mixed-mode analog/digital VLSI, and software systems that implement models of neural systems. The implementation of neuromorphic computing at the hardware level can be realized by oxide-based memristors, spintronic memories, threshold switches, and transistors, among others. Training software-based neuromorphic systems of spiking neural networks can be achieved using error backpropagation, e.g., using Python-based frameworks such as snnTorch, or using canonical learning rules from the biological learning literature, e.g., using BindsNET.
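The spiking neurons such systems model can be illustrated with a minimal discrete-time leaky integrate-and-fire (LIF) unit, the basic building block simulated by frameworks like those named above. The constants and function names below are illustrative, not any library's API:

```python
# Minimal discrete-time leaky integrate-and-fire (LIF) neuron.
# Each step, the membrane potential leaks (decays by beta),
# integrates the input current, and emits a spike when it
# crosses the threshold, after which it is reset by subtraction.

def lif_simulate(inputs, beta=0.9, threshold=1.0):
    """Return the spike train produced by a sequence of input currents."""
    v, spikes = 0.0, []
    for current in inputs:
        v = beta * v + current      # leak + integrate
        if v >= threshold:          # fire
            spikes.append(1)
            v -= threshold          # reset by subtraction
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold current produces a regular spike train.
spikes = lif_simulate([0.3] * 20)
```

The information is carried in the timing and rate of the binary spikes rather than in continuous activations, which is what makes these models amenable to low-power event-driven hardware.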

The expression computational intelligence (CI) usually refers to the ability of a computer to learn a specific task from data or experimental observation. Even though it is commonly considered a synonym of soft computing, there is still no commonly accepted definition of computational intelligence.

Transfer learning (TL) is a research problem in machine learning (ML) that focuses on storing knowledge gained while solving one problem and applying it to a different but related problem. For example, knowledge gained while learning to recognize cars could apply when trying to recognize trucks. This area of research bears some relation to the long history of psychological literature on transfer of learning, although practical ties between the two fields are limited. From the practical standpoint, reusing or transferring information from previously learned tasks for the learning of new tasks has the potential to significantly improve the sample efficiency of a reinforcement learning agent.
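The reuse described above is often realized by freezing a feature extractor learned on the source task and fitting only a small new head on the target task. The sketch below is a hypothetical stand-in: `pretrained_features` plays the role of frozen source-task layers, and only the linear head is trained:

```python
import random

# Illustrative transfer-learning sketch (not any specific library's
# API): a "pretrained" feature extractor is frozen and reused, and
# only a linear head is fit on the target task by plain SGD.

def pretrained_features(x):
    # Stand-in for layers learned on a source task; here, hand-picked
    # features that happen to transfer to the target task.
    return [x[0] + x[1], x[0] * x[1], 1.0]

def train_head(data, lr=0.1, epochs=200, seed=0):
    """Fit only the head weights; the extractor stays frozen."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in range(3)]
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)
            err = sum(wi * fi for wi, fi in zip(w, f)) - y
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]  # SGD step
    return w

# Tiny target task (y = x0 + x1), learnable from the frozen features
# with only four labeled examples -- the sample-efficiency benefit.
target = [((0.0, 1.0), 1.0), ((1.0, 0.0), 1.0),
          ((1.0, 1.0), 2.0), ((0.0, 0.0), 0.0)]
head = train_head(target)
```

Because only three head weights are trained, a handful of target examples suffices, mirroring the sample-efficiency argument in the paragraph above.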

A memetic algorithm (MA), in computer science and operations research, is an extension of the traditional genetic algorithm that combines global evolutionary search with a local search technique. The local refinement reduces the likelihood of premature convergence and may provide a sufficiently good solution to an optimization problem.
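A minimal sketch, assuming a toy bitstring problem (OneMax, maximizing the number of 1s) and a greedy bit-flip hill climb as the local search applied to each offspring:

```python
import random

# Illustrative memetic algorithm on bitstrings: a genetic algorithm
# (selection, crossover, mutation) with a greedy bit-flip local
# search refining every offspring -- the "memetic" step.

def fitness(bits):
    return sum(bits)                  # OneMax: count the 1s

def local_search(bits):
    # Greedy improvement: flip any bit whose flip raises fitness.
    bits = bits[:]
    for i in range(len(bits)):
        flipped = bits[:]
        flipped[i] ^= 1
        if fitness(flipped) > fitness(bits):
            bits = flipped
    return bits

def memetic(n=20, pop_size=10, generations=15, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.2:                # occasional mutation
                child[rng.randrange(n)] ^= 1
            children.append(local_search(child))  # memetic refinement
        pop = parents + children
    return max(pop, key=fitness)

best = memetic()
```

OneMax is deliberately easy (the local search alone solves it); the point of the sketch is the structure, where each genetically produced child is handed to a local improver before rejoining the population.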

Jacek M. Zurada

Jacek M. Zurada serves as a Professor in the Electrical and Computer Engineering Department at the University of Louisville, Kentucky. He received his M.S. and Ph.D. degrees from Politechnika Gdańska, ranked first among Polish universities of technology. He has held visiting appointments at the Swiss Federal Institute of Technology, Zurich; Princeton; Northeastern; Auburn; and at overseas universities in Australia, Chile, China, France, Germany, Hong Kong, Italy, Japan, Poland, Singapore, Spain, and South Africa. He is a Life Fellow of the IEEE, a Fellow of the International Neural Network Society, and Doctor Honoris Causa of the Częstochowa Institute of Technology, Poland.

Erol Gelenbe: Turkish computer scientist

Sami Erol Gelenbe is a Turkish and French computer scientist, electronic engineer and applied mathematician who pioneered the field of Computer System and Network Performance in Europe, and is active in many research projects of the European Union. He is Professor in the Institute of Theoretical and Applied Informatics of the Polish Academy of Sciences (2017-), Associate Researcher in the I3S Laboratory and Abraham de Moivre Laboratory. He has held Chaired professorships at University of Liège (1974-1979), University Paris-Saclay (1979-1986), University Paris Descartes (1986-2005), Nello L. Teer Professor and ECE Chair at Duke University (1993-1998), University Chair Professor and Director of the School of EECS, University of Central Florida (1998-2003), and Dennis Gabor Professor and Head of Intelligent Systems and Network, Imperial College (2003-2019). He invented the random neural network and the eponymous G-networks. He has served as a consultant to Thomson-CSF, IBM, BT, France Telecom, Huawei, and General Dynamics. His awards include the Parlar Foundation Science Award (1994), the Grand Prix France Telecom (1996) of the French Academy of Sciences, the ACM SIGMETRICS Life-Time Achievement Award, the Oliver Lodge Medal, the "In Memoriam Dennis Gabor Award", and the Mustafa Prize (2017).

IEEE Transactions on Evolutionary Computation is a bimonthly peer-reviewed scientific journal published by the IEEE Computational Intelligence Society. It covers evolutionary computation and related areas including nature-inspired algorithms, population-based methods, and optimization where selection and variation are integral, and hybrid systems where these paradigms are combined. The editor-in-chief is Carlos A. Coello Coello (CINVESTAV). According to the Journal Citation Reports, the journal has a 2021 impact factor of 16.497.

Long short-term memory: artificial recurrent neural network architecture used in deep learning

Long short-term memory (LSTM) is an artificial neural network used in the fields of artificial intelligence and deep learning. Unlike standard feedforward neural networks, LSTM has feedback connections. Such a recurrent neural network (RNN) can process not only single data points, but also entire sequences of data. For example, LSTM is applicable to tasks such as unsegmented, connected handwriting recognition, speech recognition, machine translation, robot control, video games, and healthcare.
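The feedback connections amount to carrying a cell state and a hidden state across time steps through gated updates. A single-unit, scalar sketch of one LSTM step (the weights are arbitrary illustrative values, not trained parameters):

```python
import math

# One LSTM cell step with scalar states, showing the gate equations:
#   f = sigmoid(...)        forget gate
#   i = sigmoid(...)        input gate
#   o = sigmoid(...)        output gate
#   c = f*c + i*tanh(...)   cell state: the additive "memory" path
#   h = o*tanh(c)           hidden output, fed back at the next step

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def lstm_step(x, h, c, W):
    f = sigmoid(W["f_x"] * x + W["f_h"] * h + W["f_b"])
    i = sigmoid(W["i_x"] * x + W["i_h"] * h + W["i_b"])
    o = sigmoid(W["o_x"] * x + W["o_h"] * h + W["o_b"])
    g = math.tanh(W["g_x"] * x + W["g_h"] * h + W["g_b"])
    c = f * c + i * g          # old cell state carried forward, gated
    h = o * math.tanh(c)
    return h, c

# Arbitrary illustrative weights; a real LSTM learns these.
W = {k: 0.5 for k in ("f_x", "f_h", "f_b", "i_x", "i_h", "i_b",
                      "o_x", "o_h", "o_b", "g_x", "g_h", "g_b")}
h = c = 0.0
for x in [1.0, 1.0, 1.0]:      # process a short input sequence
    h, c = lstm_step(x, h, c, W)
```

The additive update of `c` is what lets gradients flow across long sequences, which is the property that distinguishes LSTM from a plain recurrent network.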

Robert J. Marks II: American engineer and intelligent design advocate (born 1950)

Robert Jackson Marks II is an American electrical engineer, computer scientist and Distinguished Professor at Baylor University. His contributions include the Zhao-Atlas-Marks (ZAM) time-frequency distribution in the field of signal processing, the Cheung–Marks theorem in Shannon sampling theory and the Papoulis-Marks-Cheung (PMC) approach in multidimensional sampling. He was instrumental in defining the field of computational intelligence and co-edited the first book with computational intelligence in the title. A Christian and an old earth creationist, he is a subject of the 2008 pro-intelligent design motion picture, Expelled: No Intelligence Allowed.

Kumpati S. Narendra is an American control theorist who currently holds the Harold W. Cheel Professorship of Electrical Engineering at Yale University. He received the Richard E. Bellman Control Heritage Award in 2003. He is noted "for pioneering contributions to stability theory, adaptive and learning systems theory". He is also well recognized for his research on learning, including neural networks and learning automata.

Gail Alexandra Carpenter is a cognitive scientist, neuroscientist and mathematician. She is Professor Emerita of Mathematics and Statistics at Boston University. She was previously a Professor of Cognitive and Neural Systems at Boston University and director of the Department of Cognitive and Neural Systems (CNS) Technology Lab there.

Yann LeCun: French computer scientist (born 1960)

Yann André LeCun is a French computer scientist working primarily in the fields of machine learning, computer vision, mobile robotics and computational neuroscience. He is the Silver Professor at the Courant Institute of Mathematical Sciences at New York University and Vice President and Chief AI Scientist at Meta.

Deep learning: branch of machine learning

Deep learning is part of a broader family of machine learning methods based on artificial neural networks with representation learning. Learning can be supervised, semi-supervised or unsupervised.

The IEEE Transactions on Learning Technologies (TLT) is a peer-reviewed scientific journal covering advances in the development of technologies for supporting human learning. It was established in 2008 and is published by the IEEE Education Society. The current editor-in-chief is Minjuan Wang of San Diego State University. Formerly, the journal was edited by Wolfgang Nejdl of the University of Hannover (2008–2012), Peter Brusilovsky of the University of Pittsburgh (2013–2018), and Mark J.W. Lee of Charles Sturt University (2019–2022).

Amir Hussain: cognitive scientist

Amir Hussain is a cognitive scientist and the director of the Cognitive Big Data and Cybersecurity (CogBID) Research Lab at Edinburgh Napier University, where he is a professor of computing science. He is founding Editor-in-Chief of Springer Nature's internationally leading Cognitive Computation journal and the new Big Data Analytics journal, and founding Editor-in-Chief of two Springer book series, Socio-Affective Computing and Cognitive Computation Trends. He also serves on the editorial boards of a number of other world-leading journals, including as Associate Editor for the IEEE Transactions on Neural Networks and Learning Systems, IEEE Transactions on Systems, Man and Cybernetics: Systems, and the IEEE Computational Intelligence Magazine.

Frank L. Lewis is an American electrical engineer, academic and researcher. He is a professor of electrical engineering, Moncrief-O’Donnell Endowed Chair, and head of Advanced Controls and Sensors Group at The University of Texas at Arlington (UTA). He is a member of UTA Academy of Distinguished Teachers and a charter member of UTA Academy of Distinguished Scholars.

References

  1. "IEEE Transactions on Neural Networks and Learning Systems". 2021 Journal Citation Reports. Web of Science (Science ed.). Clarivate. 2022.