Workshop on Logic, Language, Information and Computation


WoLLIC, the Workshop on Logic, Language, Information and Computation, is an academic conference in the field of pure and applied logic and theoretical computer science. WoLLIC has been organised annually since 1994, typically in June or July; the conference is scientifically sponsored by the Association for Logic, Language and Information, the Association for Symbolic Logic, the European Association for Theoretical Computer Science and the European Association for Computer Science Logic.


Ranking

According to the Computer Science Conference Ranking 2010, the conference is ranked "C" among over 1,900 international conferences worldwide. It is also ranked "C" in the CORE Conference Ranking Exercise (CORE Portal, 2023). In the Microsoft Academic Search (MSAR) field ratings of 2014 it is rated with 7 bars (last 5 years) and a field rating of 1 in Algorithms & Theory. On Google Scholar, the conference has an h5-index of 11 and an h5-median of 15.

History

Future Venues

The meetings alternate between Latin America and US/Europe/Asia. The following locations are planned for future meetings:

Proceedings

Special Issues of Scientific Journals

Related Research Articles

In programming language theory and proof theory, the Curry–Howard correspondence is the direct relationship between computer programs and mathematical proofs.

In information science, formal concept analysis (FCA) is a principled way of deriving a concept hierarchy or formal ontology from a collection of objects and their properties. Each concept in the hierarchy represents the objects sharing some set of properties; and each sub-concept in the hierarchy represents a subset of the objects in the concepts above it. The term was introduced by Rudolf Wille in 1981, and builds on the mathematical theory of lattices and ordered sets that was developed by Garrett Birkhoff and others in the 1930s.
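As a rough illustration of the idea (the objects, attributes, and code below are made up for this sketch and are not taken from any FCA tool), every formal concept pairs a set of objects, its extent, with the set of attributes they all share, its intent; for a small context the concepts can be enumerated by brute force:

    # Illustrative sketch: enumerate the formal concepts of a tiny
    # object/attribute context by closing every attribute subset.
    from itertools import chain, combinations

    # Hypothetical toy context: objects and the attributes they possess.
    context = {
        "duck":  {"can_fly", "lays_eggs"},
        "eagle": {"can_fly", "lays_eggs", "predator"},
        "cat":   {"predator"},
    }
    attributes = set().union(*context.values())

    def extent(attrs):
        """Objects having every attribute in attrs."""
        return {o for o, a in context.items() if attrs <= a}

    def intent(objs):
        """Attributes shared by every object in objs."""
        return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

    concepts = set()
    for attrs in chain.from_iterable(combinations(attributes, r) for r in range(len(attributes) + 1)):
        objs = extent(set(attrs))
        concepts.add((frozenset(objs), frozenset(intent(objs))))

    for objs, shared in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(objs), sorted(shared))

Each printed pair is a node of the concept hierarchy: concepts with larger extents sit higher in the lattice, and their intents are correspondingly smaller.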

Computational semiotics is an interdisciplinary field that applies, conducts, and draws on research in logic, mathematics, the theory and practice of computation, formal and natural language studies, the cognitive sciences generally, and semiotics proper. The term encompasses both the application of semiotics to computer hardware and software design and, conversely, the use of computation for performing semiotic analysis. The former focuses on what semiotics can bring to computation; the latter on what computation can bring to semiotics.

Game semantics is an approach to formal semantics that grounds the concepts of truth or validity on game-theoretic concepts, such as the existence of a winning strategy for a player, somewhat resembling Socratic dialogues or medieval theory of Obligationes.

ObjVlisp is a 1984 object-oriented extension of Vlisp–Vincennes LISP, a LISP dialect developed since 1971 at the University of Paris VIII – Vincennes. It is noteworthy as one of the earliest implementations of the concept of metaclasses, and in particular explicit metaclasses. In the ObjVlisp model, "each entity is an instance of a single class. Classes are instances of other classes, called metaclasses. This model allows for extension of the static part of OOL, i.e. the structural aspects of objects considered as implementation of abstract data types".
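The metaclass idea carries over to other languages; the following Python sketch (illustrative only, it is not ObjVlisp) shows an explicit metaclass whose instances are themselves classes:

    # Illustrative Python sketch of the metaclass idea (not ObjVlisp):
    # every entity is an instance of exactly one class, and classes
    # themselves are instances of other classes, their metaclasses.

    class Counted(type):
        """A metaclass that counts how many instances each of its classes creates."""
        def __call__(cls, *args, **kwargs):
            cls.instances = getattr(cls, "instances", 0) + 1
            return super().__call__(*args, **kwargs)

    class Point(metaclass=Counted):
        def __init__(self, x, y):
            self.x, self.y = x, y

    p = Point(1, 2)
    q = Point(3, 4)
    print(type(p) is Point)        # True: p is an instance of the class Point
    print(type(Point) is Counted)  # True: Point is an instance of the metaclass Counted
    print(Point.instances)         # 2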

Kleptography is the study of stealing information securely and subliminally. The term was introduced by Adam Young and Moti Yung in the Proceedings of Advances in Cryptology – Crypto '96. Kleptography is a subfield of cryptovirology and is a natural extension of the theory of subliminal channels that was pioneered by Gus Simmons while at Sandia National Laboratory. A kleptographic backdoor is synonymously referred to as an asymmetric backdoor. Kleptography encompasses secure and covert communications through cryptosystems and cryptographic protocols. This is reminiscent of, but not the same as, steganography, which studies covert communications through graphics, video, digital audio data, and so forth.

Ruby is a hardware description language designed by Mary Sheeran in 1986 intended to facilitate the notation and development of integrated circuits via relational algebra and functional programming.

A hash chain is the successive application of a cryptographic hash function to a piece of data. In computer security, a hash chain is a method used to produce many one-time keys from a single key or password. For non-repudiation, a hash function can be applied successively to additional pieces of data in order to record the chronology of data's existence.
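A minimal sketch of the one-time-key use, assuming SHA-256 as the hash function (the choice of hash and all names are illustrative):

    # Minimal hash-chain sketch: the chain H(seed), H(H(seed)), ... yields
    # one-time keys that are revealed from the end of the chain backwards,
    # so a key that has already been revealed never exposes the next one.
    import hashlib

    def hash_chain(seed: bytes, length: int):
        """Return [H(seed), H(H(seed)), ...] of the given length."""
        chain, value = [], seed
        for _ in range(length):
            value = hashlib.sha256(value).digest()
            chain.append(value)
        return chain

    keys = hash_chain(b"secret password", 1000)
    # The verifier stores only keys[-1]; the user later reveals keys[-2],
    # keys[-3], ... and each revealed key is checked by hashing it once.
    assert hashlib.sha256(keys[-2]).digest() == keys[-1]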

The European Joint Conferences on Theory and Practice of Software (ETAPS) is a confederation of (currently) four computer science conferences taking place annually at one conference site, usually at the end of March or in April. Three of the four conferences are top ranked in software engineering and one (ESOP) is top ranked in programming languages.

Persistent homology is a method for computing topological features of a space at different spatial resolutions. Features that persist over a wide range of spatial scales are deemed more likely to represent true features of the underlying space rather than artifacts of sampling, noise, or a particular choice of parameters.

Ruy de Queiroz

Ruy J. Guerra B. de Queiroz is an associate professor at the Universidade Federal de Pernambuco and has made significant contributions to the research fields of mathematical logic, proof theory, foundations of mathematics and philosophy of mathematics. He is the founder of the Workshop on Logic, Language, Information and Computation (WoLLIC), which has been organised annually since 1994, typically in June or July.

Peter Ružička

Peter Ružička was a Slovak computer scientist and mathematician who worked in the fields of distributed computing and computer networks. He was a professor at the Faculty of Mathematics, Physics and Informatics of Comenius University, working in several research areas of theoretical computer science throughout his long career.

Cubesort is a parallel sorting algorithm that builds a self-balancing multi-dimensional array from the keys to be sorted. As the axes are of similar length, the structure resembles a cube. After each key is inserted, the cube can be rapidly converted to an array.

In automata theory, a self-verifying finite automaton (SVFA) is a special kind of nondeterministic finite automaton (NFA) with a symmetric kind of nondeterminism introduced by Hromkovič and Schnitger. Generally, in self-verifying nondeterminism, each computation path ends with one of three possible answers: yes, no, or I do not know. For each input string, no two paths may give contradictory answers; that is, the answers yes and no cannot both occur on the same input. At least one path must answer yes or no, and if it is yes then the string is considered accepted. SVFAs accept the same class of languages as deterministic finite automata (DFA) and NFAs but have different state complexity.
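A small illustrative simulation of these rules, with a made-up toy automaton (every DFA is trivially a valid SVFA, so a two-state example suffices for the sketch):

    # Purely illustrative sketch: simulate a self-verifying finite automaton.
    # Each run ends in a "yes", a "no", or a neutral state; on any input,
    # "yes" and "no" runs must never coexist and at least one run must decide.

    def svfa_run(transitions, start, yes_states, no_states, word):
        """Return 'yes' or 'no' for the word, enforcing the SVFA consistency rules."""
        current = {start}
        for symbol in word:
            current = {r for q in current for r in transitions.get((q, symbol), set())}
        answers = set()
        for q in current:
            if q in yes_states:
                answers.add("yes")
            elif q in no_states:
                answers.add("no")
        assert answers != {"yes", "no"}, "contradictory answers: not a valid SVFA"
        assert answers, "no run decided: not a valid SVFA"
        return answers.pop()

    # Toy SVFA over {0,1} deciding whether a word ends in 1.
    transitions = {
        ("ends0", "0"): {"ends0"}, ("ends0", "1"): {"ends1"},
        ("ends1", "0"): {"ends0"}, ("ends1", "1"): {"ends1"},
    }
    print(svfa_run(transitions, "ends0", {"ends1"}, {"ends0"}, "0101"))  # yes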

State complexity is an area of theoretical computer science dealing with the size of abstract automata, such as different kinds of finite automata. The classical result in the area is that simulating an n-state nondeterministic finite automaton by a deterministic finite automaton requires exactly 2^n states in the worst case.
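The blow-up can be observed directly with the textbook subset construction; in the illustrative sketch below, an (n+1)-state NFA for the classical witness language "the n-th symbol from the end is 1" determinizes into 2^n reachable DFA states:

    # Illustrative sketch of the n -> 2^n blow-up via the subset construction.

    def subset_construction(nfa, start, alphabet):
        """Determinize an NFA given as a dict (state, symbol) -> set(states);
        return the set of reachable DFA states (subsets of NFA states)."""
        start_set = frozenset({start})
        seen, frontier = {start_set}, [start_set]
        while frontier:
            subset = frontier.pop()
            for a in alphabet:
                nxt = frozenset(r for q in subset for r in nfa.get((q, a), set()))
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return seen

    # (n+1)-state NFA for "the n-th symbol from the end is 1":
    # guess the position of that 1, then count n-1 further symbols.
    n = 4
    nfa = {(0, "0"): {0}, (0, "1"): {0, 1}}
    for i in range(1, n):
        nfa[(i, "0")] = {i + 1}
        nfa[(i, "1")] = {i + 1}

    print(len(subset_construction(nfa, 0, "01")))  # 16 == 2**n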

The International Symposium on Experimental Algorithms (SEA), previously known as Workshop on Experimental Algorithms (WEA), is a computer science conference in the area of algorithm engineering.

In cryptography, a round or round function is a basic transformation that is repeated (iterated) multiple times inside the algorithm. Splitting a large algorithmic function into rounds simplifies both implementation and cryptanalysis.
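A deliberately toy, insecure sketch of this structure (nothing below corresponds to a real cipher): a simple round function, key mixing followed by a rotation, is iterated over a list of round keys:

    # Toy sketch of an iterated round structure (illustrative, not secure).

    def round_function(block: int, round_key: int) -> int:
        """One round on a 16-bit block: key mixing, then a rotation for diffusion."""
        block ^= round_key
        return ((block << 5) | (block >> 11)) & 0xFFFF  # rotate left by 5 bits

    def encrypt(block: int, round_keys) -> int:
        """Apply the same round repeatedly, once per round key."""
        for k in round_keys:
            block = round_function(block, k)
        return block

    print(hex(encrypt(0x1234, [0x0F0F, 0x3C3C, 0x5A5A, 0xA5A5])))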

Bailey's FFT algorithm

Bailey's FFT is a high-performance algorithm for computing the fast Fourier transform (FFT). This variation of the Cooley–Tukey FFT algorithm was originally designed for systems with the hierarchical memory common in modern computers. The algorithm treats the samples as a two-dimensional matrix and executes short FFT operations on the columns and rows of the matrix, with a correction multiplication by "twiddle factors" in between.
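A simplified in-memory sketch of this column/twiddle/row structure (using numpy for the short FFTs; the memory-hierarchy optimisations and explicit transposes of the real algorithm are omitted):

    # Illustrative sketch of the four-step structure behind Bailey's FFT.
    import numpy as np

    def four_step_fft(x, n1, n2):
        """FFT of a length n1*n2 vector via column FFTs, twiddle factors, row FFTs."""
        a = x.reshape(n2, n1)                    # view the samples as an n2 x n1 matrix
        a = np.fft.fft(a, axis=0)                # step 1: short FFT on each column
        k2 = np.arange(n2).reshape(-1, 1)
        j1 = np.arange(n1).reshape(1, -1)
        a = a * np.exp(-2j * np.pi * k2 * j1 / (n1 * n2))  # step 2: twiddle factors
        a = np.fft.fft(a, axis=1)                # step 3: short FFT on each row
        return a.T.reshape(-1)                   # step 4: transpose and flatten

    x = np.random.rand(1024)
    print(np.allclose(four_step_fft(x, 32, 32), np.fft.fft(x)))  # True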

In cryptography, the branch number is a numerical value that characterizes the amount of diffusion introduced by a vectorial Boolean function F that maps an input vector a to an output vector F(a). For the (usual) case of a linear F, the differential branch number is obtained by:

  1. applying nonzero values of a to the input of F;
  2. calculating for each input value a the Hamming weights W(a) and W(F(a)), and adding them together;
  3. selecting the smallest combined weight over all nonzero input values: BN_d(F) = min over a ≠ 0 of W(a) + W(F(a)), as illustrated in the sketch below.
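A small computation directly from this definition, for a made-up linear map on 4-bit values (purely illustrative):

    # Illustrative sketch: differential branch number of a toy linear map
    # on 4-bit vectors, computed directly from the definition.

    def hamming_weight(x: int) -> int:
        return bin(x).count("1")

    def f(a: int) -> int:
        """A toy linear (XOR-based) map on 4-bit values: XOR with a 1-bit rotation."""
        return (a ^ ((a << 1) | (a >> 3))) & 0xF

    branch_number = min(hamming_weight(a) + hamming_weight(f(a)) for a in range(1, 16))
    print(branch_number)  # 3 for this toy map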

References