Ben Mitchell's typo blog, charting the excitement, activities and challenges of my 12 months studying the MA in Typeface Design at Reading University.

Now with occasional ramblings about type-related things I find interesting.

Opinions are all my own.

This week, the annual ATypI conference is taking place in Hong Kong. Unfortunately I’m not going to be attending, but its theme, ‘Between black and white’, has prompted me to think in more depth about how the principles of notan can be implemented in typeface design, not just as a curiosity, but as a pragmatic tool to enhance readability.

One of the attractions of type design is the breadth of its influences, and a natural borrowing is the Japanese concept of notan. The term refers to the art of balancing the opposition of black and white in Japanese painting, and it has been somewhat hijacked of late by type designers to describe the interplay of the black (foreground) and white (background) elements we design with when creating type. Specifically, it alludes to the dissociation of the two, allowing the designer to intentionally mould the counterforms (white areas) as desired, rather than letting them be determined by the black lettershapes. The byproduct of this separation between letter and counterspace is that it forces the type designer to abandon the stroke model and instead adjust letters’ outlines in isolation. To me, how this can be put to pragmatic use is a hugely interesting area for the discipline of type design, not least because it is so little explored.

As I mentioned when starting out on my MA typeface, Lumen, my aim was to make the letters flow together, so that in dictionary settings, where things are often rather choppy, smooth reading would be enhanced by legato, connected word-images. The ultimate in connected type would of course be a script face, with joined-up writing, but for obvious reasons that is unsuitable for immersive reading or for setting a dictionary. How then can letters be linked together? How about making the white, rather than the black, the part that joins up? A perfect opportunity to employ original creative responses to foster the principles of readability.

This sort of ‘pure’ design had been tackled by Evert Bloemsma in his Legato typeface; as Kris Sowersby notes, the design is ‘free of stylistic conceits’ with the important decisions made in pursuit of the goal of connectedness. What Bloemsma had done was rotate (or skew) the inner counters in the opposite direction to the outer black forms, breaking up the letters’ rigid uprightness and giving the white space a direction, rather than letting it simply exist as a dead space or byproduct of the black letters. While I didn’t want to follow this logic to the same conclusion as Bloemsma, I found in his theory a seed for further development.
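To make Bloemsma’s move concrete, here’s a toy sketch in Python (the contours, values and names are all hypothetical, just to show the geometry): the outer contour and the counter are rotated in opposite directions about the glyph’s centre, so the white gains a direction of its own.

```python
import math

def rotate(contour, angle_deg, centre):
    """Rotate a contour's points about a centre."""
    a = math.radians(angle_deg)
    cx, cy = centre
    return [(cx + (x - cx) * math.cos(a) - (y - cy) * math.sin(a),
             cy + (x - cx) * math.sin(a) + (y - cy) * math.cos(a))
            for x, y in contour]

# Toy glyph: an outer contour and its counter. Rotating them in
# opposite directions breaks the rigid uprightness and gives the
# white space a direction instead of echoing the black.
centre  = (50, 50)
outer   = [(0, 0), (100, 0), (100, 100), (0, 100)]
counter = [(30, 30), (30, 70), (70, 70), (70, 30)]

black = rotate(outer,    2.0, centre)   # black leans one way
white = rotate(counter, -2.0, centre)   # counter leans the other
```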


Above: in most typefaces, based on a tool-and-stroke model, the white counterspaces are simply a byproduct of where the edges of the tool fall. In Zapf Chancery, the track of the pen (green) and the width and angle of the nib (yellow) determine the edges of the stroke at every point, and the stroke’s edges form the boundary between black and white. Type design can now be emancipated from the tool-and-stroke model, with interesting results.
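For the curious, here is the same idea in code: a minimal Python sketch (function and variable names are mine, not from any real font tool) of how a fixed nib expands a pen track into a black shape.

```python
import math

def broad_nib_stroke(track, nib_width, nib_angle_deg):
    """Expand a pen track into a stroke outline: a straight nib of
    fixed width, held at a fixed angle all along the track."""
    a = math.radians(nib_angle_deg)
    # Each edge point sits half a nib-width from the track,
    # offset along the nib's own direction.
    dx = (nib_width / 2) * math.cos(a)
    dy = (nib_width / 2) * math.sin(a)
    left  = [(x + dx, y + dy) for x, y in track]
    right = [(x - dx, y - dy) for x, y in track]
    # The black shape is bounded by one edge plus the other, reversed.
    return left + right[::-1]

# A quarter-circle track, roughly the bowl of a chancery 'o',
# drawn with a 30-unit nib held at 30 degrees:
track = [(100 * math.cos(t), 100 * math.sin(t))
         for t in (i * math.pi / 20 for i in range(11))]
outline = broad_nib_stroke(track, nib_width=30, nib_angle_deg=30)
```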

I took the idea of designing the whites semi-independently, but sought to make them connect across letters. Underpinning the architecture of every letterform was the theory that the white space should bend and flow to lead the eye smoothly along the reading line. The inner contours of letters mainly project outwards towards the adjacent letters, and employ curves of a lower frequency than the outside black edges. As well as helping the letters compose into harmonious images, this flattening of the counters emphasises the horizontal reading direction from left to right.


The image above shows that the countershapes in Helvetica are directed inwards (or at nothing), making the letters stand alone, referring only to themselves; the lack of activity between letters also creates dead space that breaks words apart. In Lumen, the countershapes have been designed to harmonise and lead the eye across intervening letterspaces.

These days, people increasingly ask whether there is any need for new typefaces and whether everything interesting hasn’t already been tried. I strongly disagree: type design, like architecture or musical composition, is a response to particular circumstances of time and place, and at its best it blends personal expression, original and critical thinking, an appreciation of type history and underlying theory, and skills polished through extensive practice. In our increasingly connected world, the interesting question is: what other concepts can type design borrow and benefit from?

Posted at 1:57pm and tagged with: one column, notan, type design, font, stroke model, Bloemsma.

Recently, we were visited by Will Hill, ex-Reading student and now Senior Lecturer in Graphic Design at Anglia Ruskin University. His lecture touched upon something that’s been bothering me for some time…

From printing’s beginnings, type took its cues from inscriptional lettering, handwriting and calligraphy. Over the next 500 years, type slowly diverged from these hand-tooled forms, becoming emancipated from its external sources and more standardised; new typographic environments and developments in technology both fuelled and fed off the evolving spectrum of typeforms.

But until the end of the 20th century, type designers were still constrained to using the traditional technologies of production: drawing letter patterns by hand, cutting punches and casting metal type. With the advent of digital type drawing, those technologies are slowly being left behind, with many type designers nowadays drawing letters, unmediated by paper, directly on screen.

In The Stroke, Gerrit Noordzij reduces typeforms to handwritten strokes: letter shapes are unavoidably composed of the strokes of our pen or pencil. The stroke is the unassailable basis (the ‘fundamental artefact’) of a shape. For Noordzij, outlines do not define a shape; they are simply the bounds of a shaped stroke. Unfortunately, this is only one way of seeing things, and it relies on drawing letters from the inside, as though tracking the ductus with a tool. It is not clear how his theory could apply to computer-generated outlines not conceived with penstrokes in mind.

However, Noordzij is right that most of what we read is based on models of how we write. Adobe’s Robert Slimbach states “It makes sense that type designers look to the established archetypes for inspiration…Because the familiar, traditional form — which grew out of centuries of handwriting practice — remains embedded in readers’ minds, it is crucial that designers of text typefaces work within its bounds.” (Quote from the Arno Pro specimen.)

But let’s step back and think about this: why should what we read and what we write be related? After all, the physiology of the eye and that of the hand do not in any way imply a logical connection. Are the letterforms that come out of our hands when we write the best possible forms for reading?

Some people seem to think so. So-called ‘infant’ typefaces with single-storey /ɑ/ and /ɡ/ are very popular among children’s book publishers. But perhaps these publishers have conflated reading and writing. Studies have shown that children do not find the ‘adult’ versions of these letters especially problematic, and understand that one version is for reading, the other for writing (Sue Walker, 2003). Adults generally don’t find variant forms problematic either (though some prefer their handwriting to use the typographic forms of /a/ and /g/), and letters in other scripts often differ between handwriting and type. Doesn’t this imply the connection between reading and writing is not as causal as we tend to think?

So here’s the question: if type is not writing, why has the influence of writing persisted for so long in type design?

Will Hill cast an interesting light over the matter in his lecture. He sees the stroke-and-tool paradigm as a model that ensures coherence in type design: it provides a set of ‘relational constraints’, a ‘behaviour pattern’, that makes all the letters in a design belong to each other. Our firmly entrenched and largely unquestioned conservatism in following this model acts as a kind of safety net, supplying design parameters that keep a typeface consistent.

If that’s the case, and with technology now at a stage where designers can work directly on screen, one would expect a quiet revolution in the way we think about type, with new models springing up.

Jeremy Tankard’s new Fenland typeface shows that this is indeed the case. Instead of basing Fenland’s ‘relational constraints’ on the stroke paradigm, the letters are formed by bending hypothetical steel tubes. In direct contradiction to Noordzij’s theory, Tankard abandons a stroke model and begins his drawings with outlines. The curves bend around the letterforms instead of following the shape of some internal ‘skeleton’. The curves really do unexpected things, collapsing in on themselves as they go around corners and throwing away the conventions of where thick and thin strokes appear.

Which brings us to a second reason why the stroke paradigm persists. All the questions the type designer needs to ask in designing letters can be answered by considering the stroke model, what tool is used and what logic is being applied to that stroke. Therefore, it is a paradigm that sets out sufficient parameters for designing type. Additionally, as Noordzij shows us, the model provides enough variability for different forms to emerge: expansion, translation, running and interrupted constructions can be freely combined to different degrees, generating a huge spectrum of possibilities.
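Here’s a rough Python sketch of that variability (my own toy code, covering just the translation and expansion constructions, not running or interrupted ones): the same pen track produces different stroke models simply by changing how the width and offset direction vary along it.

```python
import math

def stroke(track, width_fn, direction_fn):
    """Generic stroke: per-point width and offset direction are free
    parameters, so one function covers more than one stroke model."""
    left, right = [], []
    for i, (x, y) in enumerate(track):
        w = width_fn(i) / 2.0
        dx, dy = direction_fn(i, x, y)
        left.append((x + w * dx, y + w * dy))
        right.append((x - w * dx, y - w * dy))
    return left + right[::-1]

# A quarter-circle track, sampled as points.
track = [(100 * math.cos(t), 100 * math.sin(t))
         for t in (i * math.pi / 20 for i in range(11))]

# Translation: broad nib, constant width, constant angle.
nib = math.radians(30)
translation = stroke(track,
                     lambda i: 30,
                     lambda i, x, y: (math.cos(nib), math.sin(nib)))

# Expansion: pointed pen; width swells with 'pressure' while the
# offset follows the curve's normal (radial, for a circular track).
expansion = stroke(track,
                   lambda i: 6 + 24 * math.sin(i * math.pi / 10),
                   lambda i, x, y: (x / 100.0, y / 100.0))
```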

Much as Tankard’s tubular premise is fascinating and original, it isn’t quite sufficient to answer every question of how the letters should look. For example, he has also had to define a particular ‘stroke’ order, decide which strokes are primary, and choose whether they connect in a ‘running’ or ‘interrupted’ way: the tube model itself says nothing about these matters, and the answers have to be decided letter by letter. This doesn’t promote the consistency the stroke paradigm is so good at ensuring. The skill in Fenland lies in Tankard’s ability to reconcile the letters consistently without a sufficiently explicit behaviour pattern.

In my Mint typeface, started in 2009, I began to see the outlines as primary, rather than the strokes. Although the strokes are still very much apparent, conceiving things this way allowed some fresh thinking. The outlines alternate between shaping the black letterforms and locking in the white counterspaces. The interplay between black and white (similar to the Japanese design concept of ‘notan’) gives the white page a more active role in the typography of the text block, in a way the stroke model wouldn’t naturally elicit. But again here, the ‘outline’ model doesn’t provide exhaustive parameters to ensure consistency.
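For anyone who wants to quantify that black-and-white interplay: in an outline font the counters wind in the opposite direction to the outer contours, so a signed-area (shoelace) sum gives a glyph’s ink coverage. A toy Python sketch with made-up coordinates:

```python
def signed_area(contour):
    """Shoelace formula: the sign encodes the winding direction."""
    s = 0.0
    for (x0, y0), (x1, y1) in zip(contour, contour[1:] + contour[:1]):
        s += x0 * y1 - x1 * y0
    return s / 2.0

def ink_area(contours):
    """Outer (black) contours and counters wind opposite ways, so
    their signed areas cancel: the sum is the glyph's black area."""
    return abs(sum(signed_area(c) for c in contours))

# Toy 'o': outer square wound counter-clockwise, counter wound clockwise.
outer   = [(0, 0), (100, 0), (100, 100), (0, 100)]
counter = [(30, 30), (30, 70), (70, 70), (70, 30)]
print(ink_area([outer, counter]))   # 8400 black against a 1600 counter
```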



The MATDs have now submitted their typefaces (woo!) and are moving on to the next projects, but it’s definitely time to experiment with these questions and see what alternative models can offer.

Posted at 1:07pm and tagged with: typography, stroke, Noordzij, type design, handwriting, construction, type, reading, writing, design, Fenland, stroke model.