Typesetting is the composition of text for publication, display, or distribution by arranging physical types (or sorts) in mechanical systems, or glyphs in digital systems, that represent a language's characters (letters and other symbols). [1] Stored types are retrieved and ordered according to a language's orthography for visual display. Typesetting requires one or more fonts (which are widely, but erroneously, conflated with typefaces). One significant effect of typesetting was that the authorship of a work could be identified more easily, making it more difficult for copiers who had not obtained permission. [2]
During much of the letterpress era, movable type was composed by hand for each page by workers called compositors. A tray with many dividers, called a case, contained cast metal sorts, each bearing a single letter or symbol in reverse (so that it would print correctly). The compositor assembled these sorts into words, then lines, then pages of text, which were bound tightly together in a frame, making up a form or page. If done correctly, all letters were of the same height and a flat surface of type was created. The form was placed in a press, inked, and printed (an impression made) on paper. [3] Metal type reads backwards, from right to left, and a key skill of the compositor was the ability to read this reversed text.
Before the advent of computerized (or digital) typesetting, font sizes were changed by replacing the characters with type of a different size. In letterpress printing, individual letters and punctuation marks were cast on small metal blocks, known as "sorts", and then arranged to form the text for a page. The size of the type was determined by the size of the character on the face of the sort. To change the font size, a compositor had to physically swap the sorts for ones of a different size.
During typesetting, individual sorts are picked from a type case with the right hand, and set from left to right into a composing stick held in the left hand, appearing to the typesetter as upside down. As seen in the photo of the composing stick, a lower case 'q' looks like a 'd', a lower case 'b' looks like a 'p', a lower case 'p' looks like a 'b' and a lower case 'd' looks like a 'q'. This is reputed to be the origin of the expression "mind your p's and q's". It might just as easily have been "mind your b's and d's". [3]
A forgotten but important part of the process took place after the printing: after cleaning with a solvent, the expensive sorts had to be redistributed into the typecase (a task called sorting or dissing) so they would be ready for reuse. Errors in sorting could later produce misprints if, say, a 'p' was put into the 'b' compartment.
The diagram at right illustrates a cast metal sort: (a) face, (b) body or shank, (c) point size, (1) shoulder, (2) nick, (3) groove, (4) foot. Wooden printing sorts were used for centuries in combination with metal type. Not shown, and more the concern of the casterman, is the "set", or width of each sort. Set width, like body size, is measured in points.
In order to extend the working life of type, and to account for the finite number of sorts in a case of type, copies of forms were cast when subsequent printings of a text were anticipated, freeing the costly type for other work. This was particularly prevalent in book and newspaper work, where rotary presses required type forms to wrap around an impression cylinder rather than being set in the bed of a press. In this process, called stereotyping, the entire form was pressed into a fine matrix such as plaster of Paris or papier mâché to create a flong, from which a positive form was cast in type metal.
Advances such as the typewriter and computer would push the state of the art even further ahead. Still, hand composition and letterpress printing have not fallen completely out of use, and since the introduction of digital typesetting they have seen a revival as an artisanal pursuit. However, this remains a small niche within the larger typesetting market.
The time and effort required to compose text manually led to several efforts in the 19th century to produce mechanical typesetting. While some, such as the Paige compositor, met with limited success, by the end of the 19th century several methods had been devised whereby an operator working a keyboard or other device could produce the desired text. Most of the successful systems involved the in-house casting of the type to be used, hence the term "hot metal" typesetting. The Linotype machine, invented in 1884, used a keyboard to assemble the casting matrices and cast an entire line of type at a time (hence its name). In the Monotype System, a keyboard was used to punch a paper tape, which was then fed to control a casting machine. The Ludlow Typograph involved hand-set matrices, but otherwise used hot metal. By the early 20th century, the various systems were nearly universal in large newspapers and publishing houses.
Phototypesetting or "cold type" systems first appeared in the early 1960s and rapidly displaced continuous casting machines. These devices consisted of glass or film disks or strips (one per font) that spun in front of a light source to selectively expose characters onto light-sensitive paper. Originally they were driven by pre-punched paper tapes. Later they were connected to computer front ends.
One of the earliest electronic photocomposition systems was introduced by Fairchild Semiconductor. The typesetter typed a line of text on a Fairchild keyboard that had no display. To verify the correct content of the line, it was typed a second time. If the two lines were identical, a bell rang and the machine produced a punched paper tape corresponding to the text. With the completion of a block of lines, the typesetter fed the corresponding paper tapes into a phototypesetting device that mechanically set type outlines printed on glass sheets into place for exposure onto a negative film. Photosensitive paper was exposed to light through the negative film, resulting in a column of black type on white paper, or a galley. The galley was then cut up and used to create a mechanical drawing or paste-up of a whole page. A large film negative of the page was then shot and used to make plates for offset printing.
The next generation of phototypesetting machines to emerge were those that generated characters on a cathode-ray tube display. Typical of these were the Alphanumeric APS2 (1963), [4] IBM 2680 (1967), I.I.I. VideoComp (1973?), Autologic APS5 (1975), [5] and Linotron 202 (1978). [6] These machines were the mainstay of phototypesetting for much of the 1970s and 1980s. Such machines could be "driven online" by a computer front-end system or take their data from magnetic tape. Type fonts were stored digitally on conventional magnetic disk drives.
Computers excel at automatically typesetting and correcting documents. [7] Character-by-character, computer-aided phototypesetting was, in turn, rapidly rendered obsolete in the 1980s by fully digital systems employing a raster image processor to render an entire page to a single high-resolution digital image, now known as imagesetting.
The first commercially successful laser imagesetter, able to make use of a raster image processor, was the Monotype Lasercomp. ECRM, Compugraphic (later purchased by Agfa) and others rapidly followed suit with machines of their own.
Early minicomputer-based typesetting software introduced in the 1970s and early 1980s, such as Datalogics Pager, Penta, Atex, Miles 33, Xyvision, troff from Bell Labs, and IBM's Script product with CRT terminals, was better able to drive these electromechanical devices, and used text markup languages to describe type and other page formatting information. The descendants of these text markup languages include SGML, XML and HTML.
The minicomputer systems output columns of text on film for paste-up and eventually produced entire pages and signatures of 4, 8, 16 or more pages using imposition software on devices such as the Israeli-made Scitex Dolev. The data stream used by these systems to drive page layout on printers and imagesetters, often proprietary or specific to a manufacturer or device, drove development of generalized printer control languages, such as Adobe Systems' PostScript and Hewlett-Packard's PCL.
Computerized typesetting was so rare that BYTE magazine (comparing itself to "the proverbial shoemaker's children who went barefoot") did not use any computers in production until its August 1979 issue used a Compugraphics system for typesetting and page layout. The magazine did not yet accept articles on floppy disks, but hoped to do so "as matters progress". [8] Before the 1980s, practically all typesetting for publishers and advertisers was performed by specialist typesetting companies. These companies performed keyboarding, editing and production of paper or film output, and formed a large component of the graphic arts industry. In the United States, these companies were located in rural Pennsylvania, New England or the Midwest, where labor was cheap and paper was produced nearby, but still within a few hours' travel time of the major publishing centers.
In 1985, with the new concept of WYSIWYG (What You See Is What You Get) text editing and word processing on personal computers, desktop publishing became available, starting on the Apple Macintosh with Aldus PageMaker (and later QuarkXPress) and PostScript, and on the PC platform with Xerox Ventura Publisher under DOS as well as PageMaker under Windows. Improvements in software and hardware, and rapidly falling costs, popularized desktop publishing and enabled very fine control of typeset results much less expensively than dedicated minicomputer systems. At the same time, word processing systems, such as Wang, WordPerfect and Microsoft Word, revolutionized office documents. They did not, however, have the typographic ability or flexibility required for complicated book layout, graphics, mathematics, or advanced hyphenation and justification rules (H&J).
By 2000, this industry segment had shrunk because publishers were now capable of integrating typesetting and graphic design on their own in-house computers. Many found the cost of maintaining high standards of typographic design and technical skill made it more economical to outsource to freelancers and graphic design specialists.
The availability of cheap or free fonts made the conversion to do-it-yourself easier, but also opened up a gap between skilled designers and amateurs. The advent of PostScript, supplemented by the PDF file format, provided a universal method of proofing designs and layouts, readable on major computers and operating systems.
QuarkXPress had enjoyed a market share of 95% in the 1990s, but lost its dominance to Adobe InDesign from the mid-2000s onward. [9]
IBM created and inspired a family of typesetting languages with names that were derivatives of the word "SCRIPT". Later versions of SCRIPT included advanced features, such as automatic generation of a table of contents and index, multicolumn page layout, footnotes, boxes, automatic hyphenation and spelling verification. [10]
NSCRIPT was a port of SCRIPT to OS and TSO from CP-67/CMS SCRIPT. [11]
Waterloo Script was created later at the University of Waterloo (UW). [12] One version of SCRIPT was created at MIT, and the AA/CS at UW took over project development in 1974. The program was first used at UW in 1975. In the 1970s, SCRIPT was the only practical way to word process and format documents using a computer. By the late 1980s, the SCRIPT system had been extended to incorporate various upgrades. [13]
The initial implementation of SCRIPT at UW was documented in the May 1975 issue of the Computing Centre Newsletter, which noted some of the advantages of using SCRIPT.
The article also pointed out SCRIPT had over 100 commands to assist in formatting documents, though 8 to 10 of these commands were sufficient to complete most formatting jobs. Thus, SCRIPT had many of the capabilities computer users generally associate with contemporary word processors. [14]
SCRIPT/VS was a SCRIPT variant developed at IBM in the 1980s.
DWScript is a version of SCRIPT for MS-DOS, named after its author, D. D. Williams; [15] it was never released to the public and was used only internally by IBM.
Script is still available from IBM as part of the Document Composition Facility for the z/OS operating system. [16]
The Standard Generalized Markup Language (SGML) was based upon IBM's Generalized Markup Language (GML). GML was a set of macros on top of IBM Script. DSSSL is an international standard developed to provide a stylesheet language for SGML documents.
XML is a successor of SGML. XSL-FO is most often used to generate PDF files from XML files.
The arrival of SGML/XML as the document model made other typesetting engines popular. Such engines include Datalogics Pager, Penta, Miles 33's OASYS, Xyvision's XML Professional Publisher, FrameMaker, and Arbortext. XSL-FO compatible engines include Apache FOP, Antenna House Formatter, and RenderX's XEP. These products allow users to program their SGML/XML typesetting process with the help of scripting languages.
YesLogic's Prince is another such engine, based on CSS Paged Media.
During the mid-1970s, Joe Ossanna, working at Bell Laboratories, wrote the troff typesetting program to drive a Wang C/A/T phototypesetter owned by the Labs; it was later enhanced by Brian Kernighan to support output to different equipment, such as laser printers. While its use has fallen off, it is still included with a number of Unix and Unix-like systems, and has been used to typeset a number of high-profile technical and computer books. Some versions, as well as a GNU work-alike called groff, are now open source.
The TeX system, developed by Donald E. Knuth at the end of the 1970s, is another widespread and powerful automated typesetting system that has set high standards, especially for typesetting mathematics. LuaTeX and LuaLaTeX are variants of TeX and of LaTeX scriptable in Lua. TeX is considered fairly difficult to learn on its own, and deals more with appearance than structure. The LaTeX macro package, written by Leslie Lamport at the beginning of the 1980s, offered a simpler interface and an easier way to systematically encode the structure of a document. LaTeX markup is widely used in academic circles for published papers and books. Although standard TeX does not provide an interface of any sort, there are programs that do. These programs include Scientific Workplace and LyX, which are graphical/interactive editors; TeXmacs, while being an independent typesetting system, can also aid the preparation of TeX documents through its export capability.
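To illustrate this difference in emphasis, the following minimal sketch sets the same short passage twice; the heading and the sentence are invented for the example. The plain TeX version specifies appearance directly, while the LaTeX version names the structural role of the heading and leaves its appearance to the document class.

    % Plain TeX: appearance-oriented markup
    {\bf Introduction}\par
    The area of a circle of radius $r$ is $\pi r^2$.
    \bye

    % LaTeX: structure-oriented markup for the same passage
    \documentclass{article}
    \begin{document}
    \section{Introduction}
    The area of a circle of radius $r$ is $\pi r^2$.
    \end{document}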
GNU TeXmacs (whose name is a combination of TeX and Emacs, although it is independent from both of these programs) is a typesetting system which is at the same time a WYSIWYG word processor.
SILE borrows some algorithms from TeX and relies on other libraries such as HarfBuzz and ICU, with an extensible core engine developed in Lua. [17] [18] By default, SILE's input documents can be composed in a custom LaTeX-inspired markup (SIL) or in XML. With the addition of third-party modules, composition in Markdown or Djot is also possible. [19]
LaTeX is a software system for typesetting documents. LaTeX markup describes the content and layout of the document, as opposed to the formatted text found in WYSIWYG word processors like Google Docs, LibreOffice Writer and Microsoft Word. The writer uses markup tagging conventions to define the general structure of a document, to stylise text throughout a document, and to add citations and cross-references. A TeX distribution such as TeX Live or MiKTeX is used to produce an output file suitable for printing or digital distribution.
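A minimal sketch of these conventions follows; the section name, label, and bibliography entry are chosen only for illustration. Compiling the file with, for example, pdflatex from TeX Live or MiKTeX produces a PDF.

    \documentclass{article}
    \begin{document}
    % Structure: a numbered section with a label for cross-referencing.
    \section{Results}\label{sec:results}
    % Styling (\emph), a cross-reference (\ref) and a citation (\cite).
    The method is \emph{fast} (see Section~\ref{sec:results}) and follows Knuth~\cite{texbook}.
    \begin{thebibliography}{9}
    \bibitem{texbook} D. E. Knuth, \emph{The TeXbook}, Addison-Wesley, 1984.
    \end{thebibliography}
    \end{document}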
A markup language is a text-encoding system which specifies the structure and formatting of a document and, potentially, the relationships among its parts. Markup can control the display of a document or enrich its content to facilitate automated processing.
In computing, plain text is a loose term for data that represent only characters of readable material but not its graphical representation nor other objects. It may also include a limited number of "whitespace" characters that affect simple arrangement of text, such as spaces, line breaks, or tabulation characters. Plain text is different from formatted text, where style information is included; from structured text, where structural parts of the document such as paragraphs, sections, and the like are identified; and from binary files in which some portions must be interpreted as binary objects.
The Standard Generalized Markup Language is a standard for defining generalized markup languages for documents. ISO 8879 Annex A.1 states that generalized markup is "based on two postulates": markup should be declarative, describing a document's structure and other attributes rather than specifying the processing to be performed on it, and markup should be rigorous, so that techniques available for processing rigorously defined objects such as programs and databases can also be applied to documents.
TeX, stylized within the system with a lowered 'E', is a typesetting program which was designed and written by computer scientist and Stanford University professor Donald Knuth and first released in 1978. The term now refers to the system of extensions – which includes software programs called TeX engines, sets of TeX macros, and packages which provide extra typesetting functionality – built around the original TeX language. TeX is a popular means of typesetting complex mathematical formulae; it has been noted as one of the most sophisticated digital typographical systems.
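As a small illustration of that mathematical markup, the display formula below (an arbitrary example, not taken from the article) is written in the original plain TeX notation and can be processed directly by a TeX engine:

    % Plain TeX display mathematics; \over builds fractions in the original TeX language.
    $$\sum_{n=1}^{\infty} {1 \over n^2} = {\pi^2 \over 6}, \qquad
      \int_0^1 x^2 \, dx = {1 \over 3}$$
    \bye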
Desktop publishing (DTP) is the creation of documents using dedicated software on a personal ("desktop") computer. It was first used almost exclusively for print publications, but now it also assists in the creation of various forms of online content. Desktop publishing software can generate page layouts and produce text and image content comparable to the simpler forms of traditional typography and printing. This technology allows individuals, businesses, and other organizations to self-publish a wide variety of content, from menus to magazines to books, without the expense of commercial printing.
A typeface is a design of letters, numbers and other symbols, to be used in printing or for electronic display. Most typefaces include variations in size, weight, slope, width, and so on. Each of these variations of the typeface is a font.
GNU TeXmacs is a scientific word processor and typesetting component of the GNU Project. It originated as a variant of GNU Emacs with TeX functionalities, though it shares no code with either program; it does, however, use TeX fonts. It is written and maintained by Joris van der Hoeven and a group of developers. The program produces structured documents with a WYSIWYG user interface. New document styles can be created by the user. The editor provides high-quality typesetting algorithms and TeX and other fonts for publishing professional-looking documents.
A document file format is a text or binary file format for storing documents on storage media, especially for use by computers. There currently exist a multitude of incompatible document file formats.
Computer Modern is the original family of typefaces used by the typesetting program TeX. It was created by Donald Knuth with his Metafont program, and was most recently updated in 1992. Computer Modern, or variants of it, remains very widely used in scientific publishing, especially in disciplines that make frequent use of mathematical notation.
Phototypesetting is a method of setting type which uses photography to make columns of type on a scroll of photographic paper. It has been made obsolete by the popularity of the personal computer and desktop publishing which gave rise to digital typesetting.
In printing and typography, hot metal typesetting is a technology for typesetting text in letterpress printing. This method injects molten type metal into a mold that has the shape of one or more glyphs. The resulting sorts or slugs are later used to press ink onto paper. Normally the typecasting machine would be controlled by a keyboard or by a paper tape.
In graphic design, page layout is the arrangement of visual elements on a page. It generally involves organizational principles of composition to achieve specific communication objectives.
A whitespace character is a character data element that represents white space when text is rendered for display by a computer.
Scribe is a markup language and word processing system that pioneered the use of descriptive markup. Scribe was revolutionary when it was proposed, because it involved for the first time a clean separation of presentation and content.
Arbortext Advanced Print Publisher is a commercial typesetting software application sold by Parametric Technology Corporation.
The GSI C/A/T is a phototypesetter developed by Graphic Systems in 1972. This phototypesetter, along with troff software for UNIX, revolutionized the typesetting and document printing industry. Phototypesetting is most often used with offset printing technology.
TeXML [tɛχːml] is – as a process – a TeX-based alternative to XSL-FO.
An XML transformation language is a programming language designed specifically to transform an input XML document into an output document which satisfies some specific goal.