International Journal of Computer Processing of Languages

Abstracting and indexing

The journal is abstracted and indexed in Inspec and Linguistics and Language Behavior Abstracts.

Related Research Articles

Computational linguistics is an interdisciplinary field concerned with the computational modelling of natural language, as well as the study of appropriate computational approaches to linguistic questions. In general, computational linguistics draws upon linguistics, computer science, artificial intelligence, mathematics, logic, philosophy, cognitive science, cognitive psychology, psycholinguistics, anthropology and neuroscience, among others.

Natural language: Language naturally spoken by humans, as opposed to "formal" or "built" languages

In neuropsychology, linguistics, and philosophy of language, a natural language or ordinary language is any language that has evolved naturally in humans through use and repetition without conscious planning or premeditation. Natural languages can take different forms, such as speech or signing. They are distinguished from constructed and formal languages such as those used to program computers or to study logic.

Natural language processing: Field of linguistics and computer science

Natural language processing (NLP) is an interdisciplinary subfield of linguistics, computer science, and artificial intelligence concerned with the interactions between computers and human language, in particular how to program computers to process and analyze large amounts of natural language data. The goal is a computer capable of "understanding" the contents of documents, including the contextual nuances of the language within them. The technology can then accurately extract information and insights contained in the documents as well as categorize and organize the documents themselves.

Tree structure: Way of representing the hierarchical nature of a structure in a graphical form

A tree structure, tree diagram, or tree model is a way of representing the hierarchical nature of a structure in a graphical form. It is named a "tree structure" because the classic representation resembles a tree, although the chart is generally upside down compared to a biological tree, with the "stem" at the top and the "leaves" at the bottom.
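To make the idea concrete, here is a minimal Python sketch of such a tree: each node stores a label and a list of children, and a simple printer walks the structure with the "stem" at the top, as described above. The class and function names are illustrative only.

```python
# A minimal sketch of a tree structure: each node holds a label and a list
# of child nodes. The names ("Node", "pretty") are illustrative, not standard.
class Node:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

    def pretty(self, depth=0):
        """Print the tree with the root ("stem") at the top."""
        print("  " * depth + self.label)
        for child in self.children:
            child.pretty(depth + 1)

# Example: a tiny syntax tree for "the cat sleeps"
root = Node("S", [
    Node("NP", [Node("the"), Node("cat")]),
    Node("VP", [Node("sleeps")]),
])
root.pretty()
```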

In linguistics and related fields, pragmatics is the study of how context contributes to meaning. The field of study evaluates how human language is utilized in social interactions, as well as the relationship between the interpreter and the interpreted. Linguists who specialize in pragmatics are called pragmaticians. The field has been represented since 1986 by the International Pragmatics Association (IPrA).

Natural-language understanding (NLU) or natural-language interpretation (NLI) is a subtopic of natural-language processing in artificial intelligence that deals with machine reading comprehension. Natural-language understanding is considered an AI-hard problem.

Parsing, syntax analysis, or syntactic analysis is the process of analyzing a string of symbols, either in natural language, computer languages or data structures, conforming to the rules of a formal grammar. The term parsing comes from Latin pars (orationis), meaning part.
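As a hedged illustration of parsing against the rules of a formal grammar, the sketch below uses NLTK's chart parser (the toolkit described further down this list) with a toy context-free grammar; the grammar and sentence are invented for the example.

```python
# Parsing a short sentence with a toy context-free grammar via NLTK's
# chart parser. Grammar and input are illustrative only.
import nltk

grammar = nltk.CFG.fromstring("""
    S   -> NP VP
    NP  -> Det N
    VP  -> V NP
    Det -> 'the' | 'a'
    N   -> 'dog' | 'cat'
    V   -> 'chased'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the dog chased a cat".split()):
    print(tree)   # prints a bracketed parse tree conforming to the grammar
```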

In linguistics and semiotics, a notation is a system of graphics or symbols, characters, and abbreviated expressions, used in artistic and scientific disciplines to represent technical facts and quantities by convention. A notation is therefore a collection of related symbols that are each given an arbitrary meaning, created to facilitate structured communication within a domain of knowledge or field of study.

Max (software): Visual programming language

Max, also known as Max/MSP/Jitter, is a visual programming language for music and multimedia developed and maintained by San Francisco-based software company Cycling '74. Over its more than thirty-year history, it has been used by composers, performers, software designers, researchers, and artists to create recordings, performances, and installations.

Requirements engineering (RE) is the process of defining, documenting, and maintaining requirements in the engineering design process. It is a common role in systems engineering and software engineering.

Graphical Models is an academic journal in computer graphics and geometry processing published by Elsevier. As of 2021, its editor-in-chief is Bedrich Benes of Purdue University.

The Natural Language Toolkit, or more commonly NLTK, is a suite of libraries and programs for symbolic and statistical natural language processing (NLP) for English written in the Python programming language. It was developed by Steven Bird and Edward Loper in the Department of Computer and Information Science at the University of Pennsylvania. NLTK includes graphical demonstrations and sample data. It is accompanied by a book that explains the underlying concepts behind the language processing tasks supported by the toolkit, plus a cookbook.
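A minimal usage sketch of the toolkit, assuming NLTK and its tokenizer and tagger resources are installed (resource names can vary between releases):

```python
# Tokenize a sentence and tag its parts of speech with NLTK.
import nltk

nltk.download("punkt", quiet=True)                        # sentence/word tokenizer models
nltk.download("averaged_perceptron_tagger", quiet=True)   # part-of-speech tagger model

text = "NLTK was developed at the University of Pennsylvania."
tokens = nltk.word_tokenize(text)   # split the sentence into word tokens
tags = nltk.pos_tag(tokens)         # assign a part-of-speech tag to each token
print(tags)
```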

In linguistics, relexification is a mechanism of language change by which one language replaces much or all of its lexicon, including basic vocabulary, with the lexicon of another language, without drastically changing the relexified language's grammar. The term is principally used to describe pidgins, creoles, and mixed languages.

Patrick Juola is an internationally noted expert in text analysis, security, forensics, and stylometry. He is a professor of computer science at Duquesne University, where he has authored two books and more than 100 scientific publications and generated more than two million dollars in federal research grant funding. He works in the fields of computational linguistics and computer security, currently serving as Director of Research at Juola & Associates and Principal of the Evaluating Variations in Language Laboratory. He is credited with co-creating the original biometric word list. Juola has also created JGAAP (Java Graphical Authorship Attribution Program), a Java-based open-source authorship attribution suite, with several students at Duquesne University including David Berdik, Sean Vinsick, Amanda Kroft, and Michael Ryan.

The Hamburg Sign Language Notation System, or HamNoSys, is a transcription system for all sign languages, with a direct correspondence between symbols and gesture aspects, such as hand location, shape and movement. It was developed in 1985 at the University of Hamburg, Germany. As of 2020, it is in its fourth revision.

Linguistics is the scientific study of human language. It entails the comprehensive, systematic, objective, and precise analysis of all aspects of language: cognitive, social, environmental, and biological, as well as structural.

Textual entailment (TE), also known as Natural Language Inference (NLI), in natural language processing is a directional relation between text fragments. The relation holds whenever the truth of one text fragment follows from another text. In the TE framework, the entailing and entailed texts are termed text (t) and hypothesis (h), respectively. Textual entailment is not the same as pure logical entailment – it has a more relaxed definition: "t entails h" (t ⇒ h) if, typically, a human reading t would infer that h is most likely true. (Alternatively: t ⇒ h if and only if, typically, a human reading t would be justified in inferring the proposition expressed by h from the proposition expressed by t.) The relation is directional because even if "t entails h", the reverse "h entails t" is much less certain.
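The directionality can be illustrated with a pair of hand-written examples (not drawn from any benchmark dataset); the labels simply record the judgement a typical reader would make.

```python
# Hand-written text/hypothesis pairs illustrating that textual entailment
# is directional: t may entail h while h does not entail t.
examples = [
    # (text t, hypothesis h, does t entail h?)
    ("A dog is chasing a ball in the park.", "An animal is playing outside.", True),
    ("An animal is playing outside.", "A dog is chasing a ball in the park.", False),
]

for t, h, label in examples:
    print(f"t: {t!r}\nh: {h!r}\nt entails h: {label}\n")
```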

The following outline is provided as an overview of and topical guide to natural-language processing.

DisCoCat is a mathematical framework for natural language processing which uses category theory to unify distributional semantics with the principle of compositionality. The grammatical derivations in a categorial grammar are interpreted as linear maps acting on the tensor product of word vectors to produce the meaning of a sentence or a piece of text. String diagrams are used to visualise information flow and reason about natural language semantics.
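A toy numpy sketch of the compositional step follows: a transitive verb is modelled as an order-3 tensor and contracted with the subject and object word vectors to yield a sentence vector. The dimensions and random values are assumptions made for illustration, not part of any published DisCoCat model.

```python
# Toy illustration of DisCoCat-style composition: the grammatical reduction
# is a tensor contraction of the verb tensor with subject and object vectors.
import numpy as np

d = 4                            # toy dimension of the noun space
subject = np.random.rand(d)      # word vector, e.g. "dogs"
obj = np.random.rand(d)          # word vector, e.g. "cats"
verb = np.random.rand(d, d, d)   # order-3 tensor, e.g. "chase" (noun-sentence-noun)

# Contract the verb tensor with subject and object, leaving a sentence vector.
sentence = np.einsum("i,isj,j->s", subject, verb, obj)
print(sentence)
```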
