Typeface as Programme

Jürg Lehni

Technology, tools and the human condition.

Type Design in the Digital Ages

Like many disciplines dependent on technology for execution or production, type design has undergone a series of fundamental revolutions and transitions in the past century. Driven by technological advances, this process has completely changed the way people work with type, to the point where someone employed in the field had to adapt to significantly changing circumstances multiple times throughout a career. At the beginning of this transition stood the hot metal typesetting of the late 19th century, with its very complex and expensive mechanised equipment invented by Monotype and Linotype. A period of opto-mechanical photocomposition systems followed in the 1950s and 60s, in which printing from cast letterforms was replaced by optically exposing letterforms, carried on spinning glass disks, onto light-sensitive paper. This was soon replaced again by the digital simulation of similar processes, formulated in computer programs and executed first on huge room-filling installations and later on affordable home computers.

The advent of computer technology and the digital revolution had similar impacts on many other creative fields, such as graphic design, photography, film editing and audio recording, with changes often similar in nature. Highly expensive equipment was made redundant by computer technology running software that simulates the same processes. The software and its user interfaces often borrow metaphors from the field as it was known before the revolution, and the role of the computer is that of a machine simulating other machines or processes, a sort of meta-tool. Even today, software is largely defined in this way, and computers therefore function mostly as replacements for previously existing processes, the typewriter and the postal service being two of the most common examples.

Democratisation is another important part of these developments. The sudden general availability of processes through computerisation has continued to increase the number of people who have access to them and start engaging in them. In the creative sector, this also led to a change in the nature of the work being done, often to the disapproval of the previous specialists in the field. While type design in the 19th century was a craft accessible to a select few typographers, who together with punchcutters worked on designs for one of the companies producing typesetting equipment, it is now a discipline that anyone with access to a computer and a licence for type design software can engage in.

These are generally known aspects of this revolution that have been examined closely many times before. But the role of software is rarely analysed beyond this point. It appears that the general function of the computer is still accepted simply as that of a simulation machine, and the question of what software could or should provide in any given field is rarely raised. Instead, the status quo is often accepted as a given, a language we use in our daily work and have stopped questioning, since it is so ubiquitous that it is almost invisible.

Furthermore, in the historic discourse on digital typefaces, questions regarding their definition and nature are hardly raised, and the status quo is rarely questioned beyond the boundaries of the industry standards.

I believe in efficiency and economy of means, and in that sense I can live well with the flaws of current type standards and the lack of leverage of type designers.

— Dimitri Bruni, 2011

Fonts, Tools and Software

Traditionally, a font was a complete set of metal characters of a particular typeface in a given size and style. Etymologically, the word goes back to the French word fonte and the verb fondre, meaning to melt or to cast, referencing the way fonts were produced by type foundries. Fonts were one of the ingredients needed in the printing process in order to print text, and they were bought in full sets from the foundries. A set included more copies of some letters than others, depending on the statistical occurrence of each letter in a given language, a distribution reflected in the structure of the letter cases that held the type. A font was not a full, independent tool in itself, but rather part of a tool-based process which could not take place without it. Given its physical nature at that time, it is imaginable that fonts were perceived as tools in themselves. At the same time they could also be seen as an artwork, designed by a typographer and executed by a punchcutter.

Today, digital fonts are legally defined as software, once again the digital counterpart of a tool. This has broad consequences for the way fonts are distributed and sold, and for the way type designers earn their money, since licensing schemes similar to those of software applications are in place: End User License Agreements (EULAs) entitle the end users of fonts to install them on a defined number of computers within the same household or office. The degree of usage of the font has no impact on the price. Once users have bought the license, they own the right of usage within the defined boundaries and can therefore use the font as a tool as much as they like, as long as they do not infringe the terms of the agreement.

This can lead to absurd situations: in certain circumstances, a big newspaper may pay the same amount for a font that is printed in thousands or even millions of copies daily as a small graphic design office that uses the font once for a client’s job. Both buy the basic right to use the font as a tool for whatever they need it for, and the creative work embodied in the typeface is unaccounted for.

While some foundries have created complicated agreements for such special cases, the basic problem of unequal usage remains and is criticised by many type designers: the definition as a tool takes no account of the creative work, ignoring the fact that a typeface is also the artistic work of a creative individual.

An alternative way of defining typefaces is as a library or family of graphical shapes (glyphs), along with rules that describe how to assign these shapes to letters and symbols (character encoding) and how to adjust the space between them (letterspacing and kerning). If this definition were used legally, another system would suggest itself: one based on royalties, as in the music industry or applied photography, both fields in which an artwork or a composition is licensed for specific, media-based distribution. The licensing costs then depend mostly on the duration of the segment, the size of the image, visibility, distribution, and so on. Dedicated associations claim these royalties and distribute them among their members, enforcing copyright law and ensuring the authorship rights of the protected works.
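This “library of shapes plus rules” definition can be sketched in a few lines of code. Everything here, the placeholder contours, the unit values and the helper function, is a hypothetical illustration and not any real font format:

```python
# A minimal sketch of a typeface as a library of glyphs plus rules.
# All names and numbers are invented for illustration.

# Glyph outlines: each glyph is named geometry (placeholder contours here).
glyphs = {
    "a": [("moveTo", (50, 0)), ("lineTo", (400, 0))],
    "V": [("moveTo", (0, 700)), ("lineTo", (350, 0))],
}

# Character encoding: rules assigning glyphs to characters (code points).
encoding = {ord("a"): "a", ord("V"): "V"}

# Letterspacing: each glyph's advance width, in font units.
advance_widths = {"a": 450, "V": 700}

# Kerning: pairwise spacing adjustments for specific glyph combinations.
kerning = {("V", "a"): -80}

def line_width(text):
    """Total advance of a string, applying encoding, widths and kerning."""
    names = [encoding[ord(ch)] for ch in text]
    width = sum(advance_widths[n] for n in names)
    for left, right in zip(names, names[1:]):
        width += kerning.get((left, right), 0)
    return width

print(line_width("Va"))  # 700 + 450 - 80 = 1070
```

Notably, nothing in this sketch behaves like a program: it is data (shapes and tables) interpreted by external layout rules, which is exactly the distinction the royalty argument rests on.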

Such authorship-based systems are not necessarily a viable way forward for typefaces, as they have their own share of problems in the digital age, namely piracy and the limitations of the systems that try to prevent it. Digital Rights Management (DRM), the solution proposed by big corporations, is failing and is mostly being abandoned at the time of writing, since consumers are not willing to follow the rules it forces upon them. Nevertheless, it remains curious that the legal definition as software has become the standard for fonts, especially since there is little evidence that digital typefaces actually need to function as software.

It is important to note that technically this definition is correct: the technologies used today for the digital definition of typefaces, such as PostScript and TrueType, do have qualities of software and programming languages, adding to the complexity of this discussion. PostScript, for example, is a so-called page description language developed by Adobe Systems Inc. for the specific task of describing layouts consisting of images, graphics and text. In order to offer the greatest flexibility and future scalability, it was designed as a full-featured programming language. Type 1 defines the type-specific aspects of this language, and just like the rest of PostScript, Type 1 typefaces are formulated as sequences of program code.
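The idea that a glyph in such a system is a small program rather than static data can be illustrated with a toy interpreter. The command names below are simplified stand-ins, not actual Type 1 charstring operators, and the coordinates are arbitrary:

```python
# A loose sketch of a glyph as a program: a sequence of drawing commands
# executed by an interpreter to produce the outline. The operators are
# simplified stand-ins for, not copies of, Type 1 charstring operators.

glyph_program = [
    ("rmoveto", 100, 0),    # move the current point, relative
    ("rlineto", 0, 700),    # draw the left edge of a stem
    ("rlineto", 80, 0),     # draw across the top
    ("rlineto", 0, -700),   # draw the right edge back down
    ("closepath",),         # close the contour
]

def run(program):
    """Execute the glyph program, collecting the points it draws."""
    x, y, points = 0, 0, []
    for op, *args in program:
        if op in ("rmoveto", "rlineto"):
            x, y = x + args[0], y + args[1]
            points.append((x, y))
        elif op == "closepath":
            points.append(points[0])
    return points

print(run(glyph_program))
```

The outline only exists once the program has been run, which is the sense in which a PostScript font is software rather than a mere collection of shapes.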

Even professional users such as graphic designers often know very little about the underlying technology. Mostly they simply want to be able to use the right typeface immediately, at a low price, and without having to read complicated contracts first.

— Erik Spiekermann, 2011

Approaches to Typefaces as Software

The process of digitalisation and computerisation of type-oriented technology is probably a never-ending one, since innovative new approaches to drawing and producing type designs are continuously being found. Yet the most fundamental changes and revolutions in the field have already happened, and the process of software standardisation is largely complete.

At the beginning of this process stood the question of how typesetting is best represented in software and executed or output by printing devices. With the introduction of pixel-based display technology such as CRT monitors, there was also the problem of how to represent glyph outlines appropriately on such low-resolution devices without losing a font’s main characteristics. There were many different proposals, and through a slow process of selection and advancement, some were abandoned while others merged and became standards.
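The low-resolution problem can be made concrete with a small sketch: two stems of identical width, placed at different fractional positions on the pixel grid, can end up a different number of pixels wide once naively rounded, which is exactly the kind of distortion that hinting and grid-fitting schemes were devised to control. The numbers are arbitrary illustrations:

```python
# Why low resolutions distort letterforms: equal stems, unequal pixels.

def rasterised_width(left_edge, width):
    """Pixels turned on for a vertical stem, using naive edge rounding."""
    return round(left_edge + width) - round(left_edge)

stem = 1.4  # stem width in pixels at a small rendering size
print(rasterised_width(0.0, stem))  # this stem rounds to 1 pixel
print(rasterised_width(0.3, stem))  # the same stem, shifted, rounds to 2
```

One stem of the letter comes out twice as heavy as the other, even though the outline describes them identically.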

This exciting time of technical innovation led to many different efforts and resulting systems, but now, at the end of this process of standardisation, there is primarily one system on which the whole industry is focused: the previously mentioned OpenType, a standard coined by Microsoft together with Adobe Systems as a result of the “Type War” between Apple’s TrueType standard and Adobe’s PostScript. Microsoft, which had previously licensed the TrueType technology from Apple, decided in the early 1990s to move ahead and create its own standard based on TrueType, after negotiations to license Apple’s advanced typography technology, “GX Typography”, failed. Adobe Systems joined in 1996 and added support for glyph outline descriptions based on its PostScript Type 1 fonts. In 2005, OpenType started migrating to an open standard under the International Organisation for Standardisation (ISO), and the process was completed in 2007 when it was accepted as a free, publicly available standard.

This system has become the standard for type on most of today’s modern operating systems such as Mac OS X, Windows and Linux, and most typesetting applications support its special typographic features.

But there is one rather large niche in which another proposal from the period of early digital type technology has survived until today: the typesetting system TeX (with its spin-off project LaTeX, a collection of macros that simplify TeX) and its font system Metafont, used mostly in academia, especially in the mathematics, computer science and physics communities. Both TeX and Metafont were conceived and designed by the highly acclaimed computer scientist Donald E. Knuth as a solution to the problem of typesetting complex mathematical formulas and, more generally, scientific publications. TeX has been called one of the most sophisticated digital typographic systems in the world. TeX (and therefore LaTeX) has since adopted the more widespread font standards mentioned above. Nevertheless, Metafont remains relevant: it is largely unknown in the domain of type design, and its history is still of interest for more recent experiments in programmatic type design based on the principle of parametric variation.
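Metafont’s core principle of parametric variation, a single description plus parameters yielding a whole family of related designs, can be hinted at with a toy example. The shape and the parameter names here are invented for illustration; real Metafont programs describe curves traced by a pen, not rectangles:

```python
# A toy sketch of parametric type in the spirit of Metafont: one
# description, many designs. The shape and parameters are invented.

def capital_i(stem_weight, height=700):
    """Return a rectangle outline for a sans-serif 'I', as corner points."""
    half = stem_weight / 2
    return [(-half, 0), (half, 0), (half, height), (-half, height)]

light = capital_i(stem_weight=60)
bold = capital_i(stem_weight=140)  # the same programme, a heavier parameter
print(light)
print(bold)
```

Changing one number regenerates the design at a different weight; Knuth’s Computer Modern family applies the same idea with dozens of parameters per typeface.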

Making scripts or stand alone applications is an effort to move beyond the limitations of boxed software and provide frameworks to make fonts that can more flexibly react to the different requirements by every new project.

— Peter Biľak, 2011