Peter Wilson

[completed 2006-11-08]

Peter Wilson is the author of the well-known memoir class, has been involved with the development of several other classes, and has worked extensively with fonts. In the photo he is operating a Chandler & Price 1910 old style hand press and is in the process of inserting a sheet of paper to print on the second side.


Dave Walden, interviewer:     Please tell me a bit about yourself personally, outside the world of TeX.

Peter Wilson, interviewee:     I was born in Cambridge, England, due to that being where my mother was at the critical moment. After the war my family moved to Kenilworth (which was on the losing side in the Civil War) and I went to school in Warwick (which was on the winning side); one has a magnificent castle while the other has child-friendly castle ruins. The school claimed to be the third oldest in England, with a purported founding date of AD 914. I later found out that that was the date of the first mention of the town, and the school boosters made the heroic assumption that if there was a town there must have been a school.

I read Mechanical Sciences at Trinity College, Cambridge, and began employment with British Thomson-Houston, the then leading electrical engineering company in the UK, working on developing planar transistors (this was in the days well before microchips), first at Rugby and then in Lincoln, where my two children were born. Later I moved to the Lucas Research Centre in Birmingham and during this time I obtained a PhD in Semiconductor Physics from Nottingham University.

I was one of the few at the Research Centre using the computer, as I needed to do a lot of numerical calculations. The powers that be decided that if you could do one thing on a computer you could do anything, so I was ordered to start working on Finite Element Stress Analysis and, by the way, “You'll be teaching a course on it in a week's time to 25 engineers!” Misquoting Groucho Marx, “The secret is real knowledge. If you can fake that, you've got it made.” It appeared that I could fake it well enough to survive the week. Later I ran a group developing solid modeling CAD systems and, in spite of my best endeavors, was elected European Chairman of the CAM-I Geometric Modeling Project, CAM-I being an international consortium of major industrial companies supporting relevant R&D projects. One of the many good things about CAM-I was that it rotated its meetings around Europe and the United States, so I got to visit a lot of interesting places. On the personal side, I married for the second time in the middle of this.

One thing led to another and I was headhunted by GE in Schenectady, NY, to manage the solid modeling group at their R&D Center. At this point I got involved in a national project developing a data interchange standard called PDES for CAD/CAM/CAE systems. This later became an international effort to develop an ISO standard called STEP for the same purpose. There were some major concerns among the non-US countries that they were just going to be asked to rubber-stamp the PDES specification, and likewise some US representatives were suspicious of the others' motives. This was resolved by changing the underlying words of the PDES and STEP acronyms to “Product Data Exchange using STEP” and “STandard for data Exchange using PDES”.

Headhunters tend to hang heads out to dry and I became a Visiting Research Professor working on information modeling techniques at Rensselaer Polytechnic Institute, the “oldest continuously operating technical university in the United States”. I was invited to join the Editorial Board of the IEEE Computer Society “Computer Graphics” magazine and, shortly after I accepted, was asked if I would be willing to be put on the short list of 3 or 4 for the forthcoming election of a new Editor-in-Chief. Naively I agreed, and at that point the short list was closed as they had managed to get a name to go on it. I held the position for 4 years, serving the maximum of two 2-year terms.

While at RPI I was part of the founding team of a small company called “STEPtools”, started by Prof. Martin Hardwick of the Computer Science Faculty. The business plan was to commercialise results coming from the STEP endeavour and provide EXPRESS-based tools to support the ongoing development of the standard. Unlike many small companies, it is still in business after 15 years (http://www.steptools.com), even though it has been many years since I was involved in it.

Circumstances changed at RPI and I spent some 3 years in the Washington DC area with joint appointments as a Research Professor at the Catholic University of America and Visiting Researcher at the National Institute of Standards and Technology in Gaithersburg, still working in the information modeling area.

I am now living in Seattle, having retired after some years with the Boeing Company, where I remained involved in developing the STEP standard, in particular extending it to cater for fluid dynamics analysis systems and wind tunnel experimental data.

I'm interested in the development of languages and writing systems but find that there isn't as much time for those as I thought there would be, as my wife and I are doing as much traveling as we can before moving back to England in a couple of years' time. We have no family in the US but there are two children and five grandchildren in the UK. This year the major trips have been to New Zealand and South America — both easier to get to from Seattle than from the UK.

DW:     How and when did you first become involved with TeX and its friends?

PW:     During my CAM-I days my group at Lucas had a contract to develop and demonstrate a data exchange system between solid modeling systems. We had to provide reports on the work and also user-level and programmer-level manuals for the code (FORTRAN at that time, 1979). We were using a PR1ME computer and they had a simple but effective program to neatly print tagged text, which was my first exposure to “typesetting” and the joys of not having to involve a typist and typewriter. I think that the program was called RUNOFF.

At the time work on the STEP standard started, ISO wanted typescripts which they would then rekey into a form suitable for their publishing system. We knew that STEP would be large in ISO terms (it now runs to several thousand pages crammed with technical information) and we wanted to try and supply camera-ready copy, which would eliminate the rekeying and typo-adding aspects of ISO's traditional process. My major technical contribution to STEP was the development of the EXPRESS family of information modeling languages (ISO 10303-11:2004, Product data representation and exchange: description methods: The EXPRESS language reference manual; ISO/TR 10303-12:1997, Product data representation and exchange: description methods: The EXPRESS-I language reference manual; and ISO 10303-14:2005, Product data representation and exchange: description methods: The EXPRESS-X language reference manual). As a sideline I agreed to become co-editor and integrator of the various aspects of the standard.

LaTeX had just appeared and we decided to try it. It met all our needs, after we had learned how to write macros, and at the peak we produced a 1500-page document encompassing the standard. Since then it has been split into many individually published parts, although some of these are around the 1000-page mark. I developed a LaTeX class specifically for ISO standards and a corresponding package for the STEP series of standards in particular. These have proven very effective, as ISO kept changing their minds about the layout of their standards: for example, changing the margins and text area, changing the font size specification for figure and table captions, and switching from numbering notes continuously to numbering them per clause (chapter). A few updates to the class and all were catered for.
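To give a flavour of how a class can absorb that kind of change (a purely illustrative sketch: the command and counter names are invented here and this is not the actual isoe.cls code), keeping the decision in one place means that switching between per-clause and continuous note numbering is a one-line edit to the class rather than a change in every document:

    % Illustrative sketch only, not the actual isoe.cls code.
    % Assumes a report-like class where clauses are chapters.
    \newcounter{isonote}[chapter]   % reset per clause; drop the optional
                                    % [chapter] for continuous numbering
    \newcommand{\isonote}[1]{%      % hypothetical note command
      \refstepcounter{isonote}%
      \par\noindent NOTE~\theisonote\quad #1\par}

Documents then simply use the note command, and a change of ISO policy touches only the class.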

The initial developer of EXPRESS, Doug Schenck, and I felt that the language needed more explication on usage than was allowed by ISO rules, which limited the standard document to specification only. Consequently we wrote a book on it (Douglas A. Schenck and Peter R. Wilson, Information Modeling the EXPRESS Way, Oxford University Press, 1994), using LaTeX to prepare the camera-ready copy for the publisher. Again it was easy to make any global layout changes that they requested, such as lengthening each page by two lines, or using a different font and position for chapter headings.
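As a concrete illustration of the sort of global change involved (a sketch of one way it could be done, not the publisher's actual specification), lengthening every page by two lines comes down to a single preamble adjustment:

    % Illustrative sketch: make the text block two lines taller on every page.
    \addtolength{\textheight}{2\baselineskip}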

DW:     How did you go about learning TeX and LaTeX enough to use it for a large document and to be able to write your own classes, e.g., what learning resources did you draw on or did you mostly learn by trial and error?

PW:     I got Lamport's book, which I understood, but I found The TeXbook too confusing, so learning during the early years was mainly through trial and error. In 1989 I learned about TUG and became a member; TUGboat was a great source of ideas and code. Sometime later I signed on to the email-based precursor to the comp.text.tex newsgroup, which was another good resource. When it came out, the documented source code for LaTeX2e was, and still is, an enormous help. My progress is still basically one of trials, tribulations and errors.

DW:     What are the names of the two LaTeX classes you developed for ISO and STEP standards?

PW:     The general class for ISO standards is isoe.cls and the package is stepe.sty. Both are on CTAN: http://mirror.ctan.org/macros/latex/contrib/isostds. The distributions also include configuration files for TeX4ht to enable conversion to HTML. It is four years or so since I looked at all this, so my recollection has faded somewhat.

DW:     Until now you have been best known to me for the memoir class. Please tell me about your effort to develop that, and to what extent is it related to the two classes mentioned above?

PW:     ISO, and also the STEP management, kept altering the typographical requirements and it occurred to me that a class providing easy methods for a document designer to adjust things would be very useful. I started on that in the early 1990s but never got very far. However, over the years I did produce a bunch of packages that treated different aspects of document layout. Eventually I resurrected my “design” class ideas. The end result was the memoir class, which started out as a bundling of many of my packages. It has now taken on a life of its own and provides the functionalities of over 30 packages. At the moment I don't think that there is a need to add any further major extensions, and there appears to be nobody clamouring for more.

The STEP standards principally specify the information represented within CAD systems and standard means of representing data corresponding to this information. The EXPRESS language, which combines data specification, constraint specification, and regular procedural programming constructs, is used to formally define the information. A STEP standard document typically includes both text and EXPRESS code. The LaTeX sources were organised so that the whole document could be typeset and could also be fed to an EXPRESS compiler: the verbatim environment was used for the EXPRESS code and the other parts of a document were in the form of EXPRESS comments. I found this so useful that I started to document my normal C code in the same manner; I was aware of the various systems like web and noweb for literate programming, but they seemed overkill for what I wanted to do.

There is a graphical form of EXPRESS called EXPRESS-G, which is a member of what a colleague used to call the BLA class: Boxes, Lines and Annotations. This was also used in STEP to provide a graphical view of the information structures. I wrote a MetaPost package, called expressg (available from CTAN), to provide a basis for drawing pretty much any kind of BLA diagram. I had used Frank Mittelbach's dtx/ins system for documenting LaTeX packages, found it very useful, and wanted to use it similarly for MetaPost. To this end I wrote a package called docmfp that extended the doc system to languages other than LaTeX (docmfp roughly stands for DOCument MetaFont and MetaPost). From then on I used it for documenting all my other code, such as Java. One source document provided the commented code for typesetting, and the code itself could be extracted for compiling.
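The dual-purpose source arrangement mentioned above can be sketched roughly as follows. This only conveys the idea and does not reproduce the actual STEP sources: the schema is invented, the fragment would be \input by an ordinary LaTeX driver that is not fed to the EXPRESS compiler, and in this bare form the (* and *) delimiters would also appear in the typeset output, something the real setup presumably handled more tidily.

    (*
    % LaTeX prose sits inside an EXPRESS block comment, so the compiler
    % skips it; these % lines are LaTeX comments, so they are not typeset.
    \section{Widget schema}
    The entity below records the name and mass of a widget.
    \begin{verbatim}
    *)
    SCHEMA widget_schema;
      ENTITY widget;
        name : STRING;
        mass : REAL;
      END_ENTITY;
    END_SCHEMA;
    (*
    \end{verbatim}
    Further commentary on the schema would continue here.
    *)

LaTeX typesets the prose and shows the schema verbatim, while the EXPRESS compiler sees only the schema, with everything else hidden inside its (* ... *) comments.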

I think another major contribution has been the ledmac bundle of packages for critical editions, which has been used by the humanities community. I did this mainly as an intellectual exercise, having no interest in nitpicking the wording of old manuscripts. If someone would like to take it over, I'll gladly hand over the glory and responsibility.

DW:     Your answer leads me to follow-up questions in both areas you mentioned. First, the memoir class includes, as I remember, a big manual on good book design as well as instructions for using the class. Please tell me a bit more about your interests and motivations that led you to carry through with such a complete class development effort.

PW:     There were LaTeX books and manuals that effectively said “If you want the output to look like that, then do this”, but there was rarely, if ever, any mention of why it was, or was not, a good idea for the output to look like “that”. My ISO/STEP experiences showed how awful some design decisions could look, so I started to read up on typography and book design; Robert Bringhurst's The Elements of Typographic Style (Hartley & Marks, 2nd ed. 1999) is an excellent and elegant exposition. I felt that providing some background on book design would help users to, in Knuth's words, “create masterpieces of the publishing art”. However, in spite of this I don't think that the memoir manual falls into that category.

DW:     Second, I am unfamiliar with the ledmac bundle. Will you please tell me a bit more about it? Is this related to your paper on early scripts and fonts at http://tug.org/TUGboat/Articles/tb26-3/tb84wilson.pdf?

PW:     ledmac and friends have nothing to do with my work on scripts and fonts.

In 1990 John Lavagnino and Dominik Wujastyk created the edmac set of macros for TeX (http://mirror.ctan.org/macros/plain/contrib/edmac), designed for authors of “critical editions”. A critical edition appears to be one where various versions of a manuscript are compared to try and come to a definitive version of the work. This involves lots of footnotes describing variant readings, explaining allusions that a modern reader might not follow, and so on. The main requirements seem to be that every line is numbered and can be referred to by its number, and that there are at least four different series of footnotes with differing layouts (for example, set in double or triple columns, or run together as a single paragraph).

Around 2003 I happened to notice that someone on comp.text.tex was asking for a version of edmac that would work well with LaTeX, and I thought that it would be interesting to try and do that. My first ledmac version was a fairly simple conversion of edmac from TeX-ese to LaTeX-ese. Then came requests for it to be able to handle verse and tabular material, and I converted TeX code from Wayne Sullivan and Herbert Berger to do this. Later there were requests for a means of putting two different texts in parallel on facing pages, together with separate sets of line numbers, footnotes, etc. I eventually managed to produce the ledpar package as an add-on to the ledmac package to enable this. The bundle now includes ledarab for critical editions that include Arabic texts, which involve a mixture of left-to-right and right-to-left typesetting. ledmac is intrinsically complicated as it has to typeset everything twice for the line numbering mechanism. I must admit that I don't really understand exactly how it works; I have never even read a critical edition, and I know nothing about Arabic.
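A minimal sketch of the kind of markup ledmac expects may make this concrete (the text, the variant reading, and “MS B” are invented for illustration; real editions are of course far more elaborate):

    \documentclass{article}
    \usepackage{ledmac}      % critical-edition support: line numbers, layered notes

    \begin{document}
    \firstlinenum{1}         % number every line rather than every fifth
    \linenumincrement{1}
    \beginnumbering          % start the line-numbered edition text
    \pstart                  % ledmac paragraphs are bracketed by \pstart ... \pend
    It was the best of times, it was the
    \edtext{worst}{\Afootnote{MS B reads `werst'.}} of times.
    \pend
    \endnumbering
    \end{document}

Each line of the numbered text can then be referred to by its number, and further series of notes go into \Bfootnote, \Cfootnote, and so on.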

In answer to a next-to-be-asked question: reading books on typography one keeps coming across different fonts, and I got to wondering where the letter shapes originally came from; this got me interested in the origins of the alphabet and writing. There was nothing complicated about the shapes of the early letter forms and I got the idea to create Metafont versions of them, which was relatively straightforward. One thing leads to another, and there was a gap in my understanding of the development of the letter shapes from monumental forms to printed forms, so that led me to look at (reproductions of) manuscripts and calligraphy. I produced a Metafont series of fonts representing several of the main kinds of writing (called bookhands) used for formal purposes between roughly AD 1 and AD 1500, when printed books started to appear, thus completing the main thread of the Latin alphabet letter shape development from 1200 BC to the present day. I have converted the archaic scripts from Metafont to PostScript Type 1 format (http://mirror.ctan.org/fonts/archaic) and am in the slow process of extending the bookhands (http://mirror.ctan.org/fonts/bookhands) and converting them to PostScript Type 1 form.

DW:     You have written a regular column for TUGboat for a number of years, called “Glisterings”. The first installment was apparently in 2001 (http://tug.org/TUGboat/Articles/tb22-4/tb72wilson.pdf). Please tell me what “glisterings” means and what was your motivation and intention with this column.

PW:     For some years Jeremy Gibbons wrote a column for TUGboat called “Hey — it works!” This contained small pieces of (La)TeX code solving particular problems. When Jeremy wanted to stop writing his column I was, for some reason, asked to take it over. I agreed to do so, but I wasn't too confident that all the code I got or wrote would work properly, so I wanted to change the column's title, partly so that Jeremy could not be blamed for any of my wrongdoings. I eventually came up with “Glisterings”, based on the phrase “All that glisters is not gold” (Shakespeare, The Merchant of Venice, II, 7), to indicate that there might be some doubt about the quality of the contents. So far, though, there has been no fool's gold that I am aware of. The Editorial Board agreed to the title even though it is a word of my own invention. What I have tried to do with the column is to pick some frequently asked question from comp.text.tex and provide some kind of answer, usually based on one or several that have appeared there; sometimes, though, I reluctantly have to create my own solution.

DW:     You mentioned Bringhurst's book and there is of course your own manual on document “styling”. Do you have any other recommended books in this area?

PW:     The UK TeX FAQ has a section on books about typography. I think that if you only read one then it should be Bringhurst's (it is probably also the easiest to find in a good local bookshop). Others, saying nothing about their respective merits, in author order are:

DW:     Now that you are retired, do you plan to remain active with TeX et al., e.g., after you have finished converting the bookhands to Type 1?

PW:     I'll remain active; I need some intellectual challenges and TeX can be uniquely challenging. However, as I have plans to do quite a bit of travelling away from computer resources, my response time to any questions or requests will get longer and longer. Early in my career I found the method of “masterly inaction” in response to others' problems to be extremely useful. Given enough time, and urgency, the problem could usually be solved by the originator. (Rereading this, I must have started to become generally realistic, also known as cynical, at an earlier age than I thought.)

DW:     I'd appreciate hearing any thoughts you have on the future of TeX et al. and of how TUG and the other user groups can best help support TeX and TeX users.

PW:     (La)TeX wouldn't be where it is today without the TUGs. Fortunately there are several, as on occasion one or other of them flags somewhat. TeX Live and friends are a tremendous help in actually getting and installing a TeX system. I am disappointed that the TUGboat publishing schedule seems to be in a permanent state of slippage, but the online PracTeX Journal goes some way towards filling the gap. [Editor's note: since this interview was completed, the TUGboat publication schedule has caught up.]

(La)TeX still provides the best non-professional typesetting system for non-ephemeral works, but (a) it is not well known (it doesn't come on your MS laptop), (b) it requires some willingness to learn (it's not iconified WYSIWYG), and (c) there is still the “LaTeX look” about too many documents. In my last company there were very few LaTeX users, basically because everyone, and especially management, used MS and couldn't cope with anything outside that narrow window onto the computing world. Generating PDF is a great boon as everyone can at least read and print a PDF document. Perhaps with OpenOffice and ODF it will be easier to build bridges into the “regular” world. More examples of fine non-LaTeX-looking typesetting would help, together with how they were done. I don't think that LaTeX will ever conquer the world as too many are satisfied with “good enough” instead of “as good as possible”.

Font installation is exceedingly hard anywhere except, I gather, on a Mac. It would be a great boost to be able to push a button or type a single command and have everything happen automatically. New fonts are not just nice; if you are typesetting using a non-Latin alphabet or script, they are essential. [Editor's note: since this interview was completed, XeTeX and LuaTeX have been released, offering access to native fonts on all platforms; see the interviews with Jonathan Kew and Taco Hoekwater.]

DW:     More generally, you have spent a lot of time studying, thinking about, and working with aspects of printing, fonts, and related crafts. Do you have any general thoughts on the continuing development of these fields?

PW:     I'm glad you said “crafts”, as typography is a craft, not a science. You have to have an eye for what is appropriate and what is not, and often the distinction is subtle. I freely admit that I don't have such an eye. I look at typography magazines and they are full of attention-grabbing stuff, which is fine if you are trying to compete with the adjacent advertisement but not useful otherwise. You can find a lot of fonts on the web but there are exceedingly few that you would be comfortable reading for more than one line. Creating a new font seems a fun thing to do, especially if you are using a graphical program. Perhaps folk will, in time, learn to be more constrained.

It is good that increasingly people can, and do, design and create their own document styles. Some will undoubtedly become craftsmen.

DW:     Thank you for your participation in this interview. I knew your name before, and I'm glad to now have much more understanding of your long term work with TeX.

[Update from Peter Wilson: Over the past couple of years I have been fortunate enough to have been invited to spend a day or two each week at a local printshop, where I have produced a few small books and ephemera using the traditional methods of handsetting lead type and printing on a hand press. This has brought home to me more forcefully than before that typography and typesetting are skilled crafts, and I am filled with admiration, and even awe, for those who worked before the advent of Linotype or Monotype typecasters and especially before LaTeX, Adobe products, and offset lithography. Regarding the photo at the beginning of the interview: the press is motor driven with a cycle time of about 15 seconds, and if your fingers get caught then no fingers. In the foreground is a Heidelberg press, sometimes referred to as the “windmill”, that we use for foiling. There is a lot more to it than is in the photo. The Chandler & Price can do nasty things to fingers, but the Heidelberg with its rotating arm feeding mechanism can do horrible things to anything above the waist. You can tell it's old because the only safety guard is a small piece of sheet metal at head height. I haven't learnt how to use it; I only do small print runs and use a hand foiler instead.]

