Gordon Rugg

Gordon Rugg (born 1955) is a British academic, head of the Knowledge Modelling Group at Keele University and a visiting senior research fellow at the Open University, known for his work on the Voynich manuscript.[1]

Biography

Born in Perth, Scotland, Rugg has a first degree in French and Linguistics and a PhD in psychology, both from Reading University, UK.

His background includes working as a timberyard worker, a field archaeologist and an English lecturer. He became the focus of media attention in 2004 for his work on the Voynich manuscript.

He is co-author with Marian Petre of two books for students which focus on semi-tacit skills in research. He is head of the Knowledge Modelling Group at Keele University and a visiting senior research fellow at the Open University.

Research

His main research theme is elicitation methods – techniques for eliciting information from people, for purposes such as market research and requirements gathering for software development. His main work in this field includes the following.

This work formed one main strand of the Verifier method, which he developed with Joanne Hyde. Verifier is a method for critically re-assessing previous research into difficult problems. The initial stage uses a range of elicitation methods to build an accurate picture of the assumptions and normal working practices used in the previous work. The next stage draws on knowledge of experts' behaviour and a range of error taxonomies to identify the places where human error is most likely to have occurred. The final stage uses various formalisms to assess whether an error has actually occurred. There is no guarantee that the method will catch every error – it is not a method for proving the correctness of a piece of previous work – but it improves the chances of finding key errors.

Rugg's other work is multidisciplinary, including theoretical archaeology and teaching methods.

Voynich manuscript

The Voynich manuscript is written in an unknown script.

Rugg used an informal version of the Verifier method to re-assess previous work on the Voynich manuscript, a manuscript widely believed to be a ciphertext that had resisted decipherment since its rediscovery by Wilfrid Voynich in 1912. Previous research had concluded that the manuscript contained linguistic features too complex to be readily explicable as a hoax, and too strange to be explicable as a transliteration of an unidentified language, leaving an uncracked cipher as the only realistic explanation.

Rugg suggested that these assessments of complexity were not based on empirical evidence. He examined a range of techniques known in the late sixteenth century, and found that, by using a modified Cardan grille combined with a large table of meaningless syllables, it was possible to produce meaningless text that had qualitative and statistical properties similar to those of "Voynichese". Rugg replicated the drawings from a range of pages in the manuscript, accompanying each with the same quantity of text as found in the original page, and discovered that most pages could be reproduced in one to two hours, as fast as they could be transcribed. This suggested that a meaningless hoax manuscript as long and as apparently linguistically complex as the Voynich manuscript could be produced, complete with coloured illustrations, by a single person in between 250 and 500 hours.
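
A minimal sketch of this kind of table-and-grille generator may make the mechanics clearer. The syllable inventory, table size and aperture offsets in the Python sketch below are invented for illustration and are not taken from Rugg's published tables; the only point is that sliding a card with a few apertures over a table of meaningless syllables mechanically yields word-like nonsense.

    import random

    # Illustrative sketch only: a toy table of made-up syllables and an assumed
    # grille layout, standing in for Rugg's much larger published tables.
    random.seed(42)

    PREFIXES = ["qo", "o", "d", "ch", "sh", "y", "ot"]
    MIDS     = ["ke", "te", "ol", "ai", "ee", "ed", "a"]
    SUFFIXES = ["dy", "in", "iin", "y", "ol", "am", "s"]

    ROWS = 30
    # Column 0 holds word-initial syllables, column 1 mid-syllables, column 2 endings.
    table = [[random.choice(PREFIXES),
              random.choice(MIDS),
              random.choice(SUFFIXES)] for _ in range(ROWS)]

    # The grille card exposes one cell per column, each at a different row offset,
    # so moving it down the table one row at a time yields a new three-syllable word.
    APERTURE_OFFSETS = (0, 2, 5)   # assumed aperture positions, one per column

    def word_at(position):
        """Read the three syllables visible through the grille at this position."""
        return "".join(table[(position + offset) % ROWS][column]
                       for column, offset in enumerate(APERTURE_OFFSETS))

    # Generate a short run of meaningless "Voynichese-like" text.
    print(" ".join(word_at(p) for p in range(0, 27, 3)))

Varying the aperture offsets or the path taken over the table changes which syllable combinations appear, which is how such a method can produce superficially language-like variety without encoding any meaning.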

However, there is debate about features of the manuscript that Rugg's suggested method was not able to emulate. The two main points of contention are lines whose linguistic features differ from those of the bulk of the manuscript, such as the Neal keys, and the statistical properties of the generated text. Rugg argues that such lines are trivially easy to hoax using the same approach with a different set of tables, adding about five minutes to the time needed for each page; the counter-argument is that this makes the hoax too complex to be plausible. Regarding the statistics, Rugg points out that text produced from the same set of initial nonsense syllables but with different table structures shows widely differing statistical properties. Since there are tens of thousands of permutations of table design, he argues that finding a design that reproduced the statistical properties of "Voynichese" would simply be a matter of time. Whether this would prove anything useful is another question, since such a result could either be taken to support Rugg's argument or be dismissed as coincidence.
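
Rugg's point about table structure can be illustrated with a toy comparison: two tables built from the same pool of nonsense syllables but with differently skewed cell contents, read through the same grille, produce output with different word statistics. The syllables, table designs and statistics below are assumptions made for illustration, not Rugg's analysis.

    from collections import Counter
    import math, random

    # Sketch under assumptions: compare texts generated from two table designs
    # that share the same syllable pool but differ in how the cells are filled.
    SYLLABLES = ["qo", "ke", "dy", "ol", "ch", "ai", "in", "sh", "ee", "y"]
    random.seed(7)

    def make_table(rows, skew):
        """Build a rows x 3 table; `skew` controls how heavily the first few
        syllables are over-represented among the table's cells."""
        pool = SYLLABLES + SYLLABLES[:3] * skew
        return [[random.choice(pool) for _ in range(3)] for _ in range(rows)]

    def sample_text(table, n=3000):
        """Read n words at random grille positions (apertures at row offsets 0, 2, 5)."""
        rows = len(table)
        return ["".join(table[(r + offset) % rows][column]
                        for column, offset in enumerate((0, 2, 5)))
                for r in (random.randrange(rows) for _ in range(n))]

    def word_entropy(words):
        """Shannon entropy of the word-frequency distribution, in bits per word."""
        counts = Counter(words)
        total = len(words)
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    for skew in (0, 8):   # two different table designs
        text = sample_text(make_table(40, skew))
        print(f"skew={skew}: distinct words={len(set(text))}, "
              f"entropy={word_entropy(text):.2f} bits/word")

The skewed table repeats words far more often, so its distinct-word count and entropy are lower, while the uniform table gives a flatter distribution. The output statistics are thus a property of the table design rather than of the syllables themselves, which is the substance of Rugg's argument.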

Further counterarguments are:

The debate continues.

Selected publications

Articles, a selection:

References

  1. Schinner, Andreas. "The Voynich manuscript: evidence of the hoax hypothesis." Cryptologia 31.2 (2007): 95–107.