Australasian Language Technology Association

Founded: 2002
Type: professional organization
Focus: computational linguistics
Area served: Australia and New Zealand
Method: conferences, publications
Website: www.alta.asn.au

The Australasian Language Technology Association (ALTA) promotes language technology research and development in Australia and New Zealand. ALTA organises regular events for the exchange of research results and for academic and industrial training, and co-ordinates activities with other professional societies. ALTA is a founding regional organization [1] of the Asian Federation of Natural Language Processing (AFNLP).

Conferences

Every year in early December, ALTA organises a research workshop (commonly known as ALTW) that gathers the growing language technology community in Australia and New Zealand, from both academia and industry. The workshop welcomes original work on any aspect of natural language processing, covering both speech and text. Accepted papers are published in the ALTA proceedings, which are also included in the ACL Anthology. [2]

Since 2008, ALTA has been involved [3] in organising the Australian Computational and Linguistics Olympiad [4] (OzCLO), a contest for high school students in linguistics and computational linguistics.

Notes

  1. AFNLP Constitution Annex B Archived October 6, 2011, at the Wayback Machine
  2. ACL Anthology, A Digital Archive of Research Papers in Computational Linguistics
  3. OzCLO sponsors Archived September 14, 2009, at the Wayback Machine
  4. OzCLO home page


Related Research Articles

Computational linguistics is an interdisciplinary field concerned with the computational modelling of natural language, as well as the study of appropriate computational approaches to linguistic questions. In general, computational linguistics draws upon linguistics, computer science, artificial intelligence, mathematics, logic, philosophy, cognitive science, cognitive psychology, psycholinguistics, anthropology and neuroscience, among others.

Word-sense disambiguation (WSD) is an open problem in computational linguistics concerned with identifying which sense of a word is used in a sentence. Its solution affects other natural language processing tasks, such as discourse analysis, improving the relevance of search engines, anaphora resolution, coherence, and inference.
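As a concrete illustration, the classic Lesk algorithm disambiguates a word by choosing the sense whose dictionary gloss shares the most words with the surrounding context. A minimal sketch follows, using a hypothetical toy sense inventory; the glosses below are illustrative, not drawn from any real dictionary:

```python
# Simplified Lesk word-sense disambiguation.
# Toy sense inventory for "bank" -- the glosses are illustrative only.
SENSES = {
    "bank/finance": "a financial institution that accepts deposits and makes loans",
    "bank/river": "sloping land beside a body of water such as a river",
}

def lesk(context, senses):
    """Pick the sense whose gloss overlaps most with the context words."""
    context_words = set(context.lower().split())

    def overlap(gloss):
        return len(context_words & set(gloss.lower().split()))

    return max(senses, key=lambda s: overlap(senses[s]))

# "river" and "water" in the context overlap with the river gloss.
sense = lesk("she sat on the bank of the river and watched the water", SENSES)
```

Real WSD systems replace the bag-of-words overlap with richer signals (stemming, sense frequencies, supervised classifiers), but the sense-selection structure is the same.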

Michael John Witbrock is a computer scientist in the field of artificial intelligence. Witbrock is a native of New Zealand and is the former Vice President of Research at Cycorp, which is carrying out the Cyc project in an effort to produce a genuine Artificial Intelligence.

The Association for Computational Linguistics (ACL) is the international scientific and professional society for people working on problems involving natural language and computation. An annual meeting is held each summer in locations where significant computational linguistics research is carried out. It was founded in 1962, originally named the Association for Machine Translation and Computational Linguistics (AMTCL). It became the ACL in 1968.

AFNLP is the organization for coordinating the natural language processing related activities and events in the Asia-Pacific region.

David Nash is a prominent Australian field linguist, specialising in the Aboriginal languages of Australia. Brought up in Parkes, New South Wales, he received a BA in pure mathematics from the Australian National University followed by an M.A. in Linguistics. He then went to the Massachusetts Institute of Technology, where he studied with Ken Hale and received his PhD in Linguistics in 1980. Before returning to Australia, he worked on the Lexicon Project at MIT. In 2005 he was Ken Hale Professor at the Linguistic Society of America Summer Institute. He works as a consultant for various Aboriginal organisations. He is also a Visiting Fellow of the Australian Institute of Aboriginal and Torres Strait Islander Studies.

Joan Wanda Bresnan FBA is Sadie Dernham Patek Professor in Humanities Emerita at Stanford University. She is best known as one of the architects of the theoretical framework of Lexical-Functional Grammar.

Eugene Charniak is a Computer Science and Cognitive Science professor at Brown University. He has an A.B. in Physics from The University of Chicago and a Ph.D. from M.I.T. in Computer Science. His research has always been in the area of language understanding or technologies which relate to it, such as knowledge representation, reasoning under uncertainty, and learning. Since the early 1990s he has been interested in statistical techniques for language understanding. His research in this area has included work in the subareas of part-of-speech tagging, probabilistic context-free grammar induction, and, more recently, syntactic disambiguation through word statistics, efficient syntactic parsing, and lexical resource acquisition through statistical means.

The North American Chapter of the Association for Computational Linguistics (NAACL) provides a regional focus for members of the Association for Computational Linguistics (ACL) in North America as well as in Central and South America, organizes annual conferences, promotes cooperation and information exchange among related scientific and professional societies, encourages and facilitates ACL membership by people and institutions in the Americas, and provides a source of information on regional activities for the ACL Executive Committee.

Yorick Wilks FBCS is a British computer scientist and Emeritus Professor of Artificial Intelligence at the University of Sheffield, Visiting Professor of Artificial Intelligence at Gresham College, former Senior Research Fellow at the Oxford Internet Institute, Senior Scientist at the Florida Institute for Human and Machine Cognition, and a member of the Epiphany Philosophers.

Justine M. Cassell is an American professor and researcher interested in human-human conversation, human-computer interaction, and storytelling. Since August 2010 she has been on the faculty of the Carnegie Mellon Human Computer Interaction Institute (HCII) and the Language Technologies Institute, with courtesy appointments in Psychology, and the Center for Neural Bases of Cognition.

Co-training is a machine learning algorithm used when there are only small amounts of labeled data and large amounts of unlabeled data. One of its uses is in text mining for search engines. It was introduced by Avrim Blum and Tom Mitchell in 1998.
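The co-training loop can be sketched under simplifying assumptions: each example has two independent feature "views", and each view's classifier labels its most confident unlabeled example for the other view's training set. The word-overlap base learner below is a toy stand-in; real applications use stronger classifiers:

```python
# Minimal co-training sketch: two classifiers, one per feature "view",
# iteratively label unlabeled examples for each other.

def train(examples):
    """Build per-label word sets from (words, label) pairs."""
    model = {}
    for words, label in examples:
        model.setdefault(label, set()).update(words)
    return model

def predict(model, words):
    """Return (best_label, confidence) by word overlap with each label."""
    scores = {label: len(set(words) & vocab) for label, vocab in model.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

def co_train(labeled1, labeled2, unlabeled, rounds=2):
    """labeled1/labeled2: (view_words, label) lists; unlabeled: (view1, view2) pairs."""
    unlabeled = list(unlabeled)
    for _ in range(rounds):
        m1, m2 = train(labeled1), train(labeled2)
        if not unlabeled:
            break
        # View-1 classifier labels its most confident example; the label
        # (with the example's *view-2* features) augments the other learner.
        i = max(range(len(unlabeled)), key=lambda k: predict(m1, unlabeled[k][0])[1])
        v1, v2 = unlabeled.pop(i)
        labeled2.append((v2, predict(m1, v1)[0]))
        if unlabeled:
            j = max(range(len(unlabeled)), key=lambda k: predict(m2, unlabeled[k][1])[1])
            w1, w2 = unlabeled.pop(j)
            labeled1.append((w1, predict(m2, w2)[0]))
    return train(labeled1), train(labeled2)
```

The key design point, from Blum and Mitchell's formulation, is that each view must be informative on its own, so that confident labels from one classifier act as fresh training signal for the other.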

Jun'ichi Tsujii is a Japanese computer scientist specializing in natural language processing and text mining, particularly in the field of biology and bioinformatics.

SemEval is an ongoing series of evaluations of computational semantic analysis systems; it evolved from the Senseval word sense evaluation series. The evaluations are intended to explore the nature of meaning in language. While meaning is intuitive to humans, transferring those intuitions to computational analysis has proved elusive.

Dragomir R. Radev is a Yale University professor of computer science working on natural language processing and information retrieval. He previously served as a computer science professor at the University of Michigan and an adjunct professor of computer science at Columbia University. Radev serves as a member of the advisory board of Lawyaw.

Lauri Juhani Karttunen is an Adjunct Professor in Linguistics at Stanford and an ACL Fellow.

LEPOR is an automatic language independent machine translation evaluation metric with tunable parameters and reinforced factors.

The Australian Computational and Linguistics Olympiad is a linguistics and computational linguistics competition for high school students in Australia, and has been held annually since 2008. The competition aims to introduce students in Years 9-12 to language puzzles so they can develop problem-solving strategies and learn about the structures and diversity of the world's languages. The competition has grown each year, and now involves around 1500 students participating from schools around the country.

Bidirectional Encoder Representations from Transformers (BERT) is a Transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google. As of 2019, Google has been leveraging BERT to better understand user searches.

Mona Talat Diab is a computer science professor at George Washington University and a research scientist with Facebook AI. Her research focuses on natural language processing, computational linguistics, cross-lingual/multilingual processing, computational socio-pragmatics, Arabic language processing, and applied machine learning.