Bouma-Enhanced Latin

quadibloc's picture

Inspired by a comment by Hrant Papazian that several other scripts have advantages over the Latin script in readability, I've come up with an alternative to learning a new script in order to obtain the benefits sought.

I started with Linden Hill, compressing the capital letters vertically by 15%, while leaving the lower-case letters unchanged.

I gave the capital letters A, G, and L small ascenders, and C, G, P, T, V, W, and Y small descenders, by restoring their uncompressed forms and aligning them suitably. The letter J had a small ascender added in the same way, so its original large descender remained unchanged.

Expanding the lower-case letters a, v, and w by 25% gave the a a small ascender and the v and w small descenders. m, n, and r were given small descenders by adjusting individual elements of those letters, and u was given a small ascender in the same fashion.

In addition, the large ascender of d and the large descender of q were diminished in size (but not reduced to the size of small ascenders and small descenders) to avoid the mirror-reflection issue.
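The vertical manipulations described above are all simple transforms about the baseline. Here is a minimal sketch in plain Python, treating glyph outlines as schematic (x, y) point lists rather than using a real font library; the 15% and 25% figures are from the text, while the 700-unit cap height and everything else are illustrative assumptions.

```python
# Glyphs are modeled as lists of (x, y) points with y = 0 at the baseline,
# so scaling y by a factor compresses or expands a letter vertically while
# leaving the baseline fixed.

def scale_vertical(points, factor):
    """Scale y-coordinates about the baseline (y = 0)."""
    return [(x, y * factor) for x, y in points]

def shift_vertical(points, dy):
    """Translate a glyph up (dy > 0) or down (dy < 0)."""
    return [(x, y + dy) for x, y in points]

# Example: a capital with an assumed cap height of 700 units, compressed
# by 15% as described for the capitals.
cap_outline = [(0, 0), (350, 700), (700, 0)]    # a schematic 'A'
compressed = scale_vertical(cap_outline, 0.85)  # new cap height: 595

# Restoring the uncompressed form with its baseline at 0 gives a "small
# ascender" of 700 - 595 = 105 units above the new cap height.
small_ascender = 700 - 595

# A descender variant (e.g. C, G, P): shift the uncompressed form down so
# its top aligns with the new cap height and its bottom dips below baseline.
descender_form = shift_vertical(cap_outline, -small_ascender)
```

In a real workflow the same two operations would be applied with a font editor's transform tools (or a library such as fontTools), but the arithmetic is identical.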

So even words in all caps will now be more likely to have a "bouma", and lower-case words will have more frequent ascenders and descenders, but ones small enough not to be overly disturbing to the Latin alphabet with which everyone is comfortable.

rs_donsata's picture

It's an interesting exercise. I think this could work best in situations where fast, uncontextualized word recognition may be critical.

This could certainly improve the readability of isolated words, but I'm not so sure how it would work in a long text. The changes certainly give the typeface a playful character. Do you have samples of long running text set in this?

A thing to consider is that increasing the distinctiveness of characters may ultimately hinder the readability of longer texts, because it's not all about distinction (which may lead to distraction) but also about rhythm and unobtrusiveness. While reading, the mind is predicting and confirming predictions to make sense of the text: context and good writing can help readability more than the best type designer ever could.

quadibloc's picture

Incidentally, after posting this, thinking of the fact that Hrant mentioned Arabic and Korean, I thought of another way to add more bouma to the Latin script. Have the letters in each syllable on a baseline angled downwards slightly, perhaps descending in steps. Or just have the consonants before the vowel raised, those after the vowel lowered.
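The second variant of that idea (raise the consonants before the vowel, lower those after it) can be sketched as a toy offset assignment per syllable. Syllable boundaries are taken as given here (real syllabification is a hard problem), and the vowel set and offset values are arbitrary illustrative assumptions.

```python
# Toy sketch: within one syllable, letters before the first vowel are
# raised, the vowel sits on the baseline, and letters after it are lowered.
VOWELS = set("aeiou")

def syllable_offsets(syllable, rise=30, drop=-30):
    """Return a vertical offset (in font units) for each letter."""
    offsets = []
    seen_vowel = False
    for ch in syllable:
        if ch in VOWELS:
            seen_vowel = True
            offsets.append(0)       # the vowel sits on the baseline
        elif not seen_vowel:
            offsets.append(rise)    # consonants before the vowel: raised
        else:
            offsets.append(drop)    # consonants after the vowel: lowered
    return offsets

print(syllable_offsets("cat"))      # [30, 0, -30]
```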

Also: the illustration provided is simply a mockup of a thought I had, so as to get feedback from Hrant as to whether this kind of font would address his concerns or not. It may be that I have misunderstood what he finds to be lacking in the Latin script.

As well, while it is true that the baselines are perhaps less prominent when there are more ascenders and descenders, it isn't intended to lead to a ransom note effect. This diagram, incorporating some minor corrections as well, shows the lines:

1: Ascender height
2: Old cap height (now the cap secondary ascender height)
3: New normal cap height
4: Lowercase secondary ascender height
5: x-height
6: baseline
7: Secondary descender depth (upper and lower case)
8: Reduced full descender depth (q)
9: Descender depth

The reduced full ascender height, exemplified by the letter d, didn't get a horizontal red line in this diagram; it would have been just two pixels above the new normal cap height (3).
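The nine guide lines can be written down as an ordered metrics table. In this small sketch the unit values are illustrative assumptions (a 1000-unit em); only the ordering of the lines, and the 15% cap compression behind line 3, come from the description above.

```python
# Illustrative vertical metrics for the nine guide lines, top to bottom.
# All values are hypothetical; only the ordering reflects the scheme.
METRICS = {
    "ascender_height":                750,   # line 1
    "cap_secondary_ascender_height":  700,   # line 2 (the old cap height)
    "new_cap_height":                 595,   # line 3 (700 compressed by 15%)
    "lc_secondary_ascender_height":   550,   # line 4
    "x_height":                       500,   # line 5
    "baseline":                         0,   # line 6
    "secondary_descender_depth":      -60,   # line 7 (upper and lower case)
    "reduced_full_descender_depth":  -180,   # line 8 (q)
    "descender_depth":               -250,   # line 9
}
# The reduced full ascender height (d) would sit just above new_cap_height,
# between lines 2 and 3, as noted in the text.

# The scheme only makes sense if the lines keep this top-to-bottom order.
values = list(METRICS.values())
assert values == sorted(values, reverse=True)
```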

And here is a small, not extended, text sample:

rs_donsata's picture

I think the result is still too jumpy to be usable. Maybe having only full and half descender depths would work better, and choosing only a few critical characters to alter would also ease the jumpiness: one of the r, n, m group; either the r or the t; the b or the d; and the q or the p.

I think caps are distinct enough from lower case, and infrequent enough, to distinguish themselves. I don't think height variations above the x-line are helping, since that area already has more distinctiveness.

Thomas Phinney's picture

First, give up on the “bouma” thing, it has been pretty thoroughly debunked by science, decades ago. https://www.microsoft.com/typography/ctfonts/WordRecognition.aspx

Now, making letters more distinctive is a worthwhile goal and does contribute to legibility. But I expect that the differences need to be within the range of normal for those letters. Otherwise the reader has to essentially “learn” your specific typeface to become comfortable and speedy reading it. That seems like an awful lot to expect of a reader. Until that time, your typeface will be in effect harder to read, not easier. :(

quadibloc's picture

Until that time, your typeface will be in effect harder to read, not easier.

Worse yet, the easier it is for people to read texts in the modified script (these distortions can be applied to most text typefaces), the more important letter shapes (which aren't changed much) become, and the less important word shapes (which are what this changes) become.

In a way, that's not a cruel paradox; the less benefit it provides, the less effort it asks. But it could mean that if it can be done, it's not worth doing.

Only if such a script initially impairs readability, by no longer providing the word shapes that people have memorized through their reading, would the greater diversity of word shapes provide a benefit. I tried to minimize the problem by making the new features much smaller than previously existing ascenders and descenders, but I suspect that won't be sufficient to reduce the impact, since any contribution of word shapes to facile reading would not happen at a conscious level.

hrant's picture

Thomas, there are many indicators that Larson's conclusion is fatally flawed (and I'm not the only one who believes that). I have outlined them here and there. It's pretty straightforward once you really think about it. Also, it's entirely routine for an experimental psychologist to miss the point. One can harbor a bit more hope for experienced typophiles, however.

The reason many people want to believe in the letterwise-decipherment model might be that they instinctively realize that the alternative would put them on a road that might not end (the white being essentially indeterminate). It is mystical, and in this age we are conditioned to shun that in favor of the determinate.

BTW learning things is always harder. You benefit later. Expecting no [subconscious] effort from readers is not far removed from not asking them to read. :-)

---

John, please give me a bit more time to think about your interesting proposal...

hhp

quadibloc's picture

I have to admit that my personal bias is in favor of the primary mechanism in reading being letter by letter. However, I didn't find Larson's paper fully convincing, because proving that something is nonexistent is more difficult than proving that its effect, if present, must be small.

As I noted, I was surprised by Hrant's suggestion that certain other scripts, usually viewed by Latin script users as much less legible, were more readable. This makes me think that there is a missing piece in the puzzle - have there been studies of the visual mechanisms used by native readers of the Arabic script, for example?

Also, it's entirely routine for an experimental psychologist to miss the point.

That certainly happens, although it wasn't clear to me why this was likely to have been happening here.

And then it dawned on me that perhaps the criticism being aimed at the paper might be this: experimental tests of reading speed and the like are more reliable as tests of legibility than of readability. Efficiency and speed in reading short texts may depend primarily on legibility, while readability affects the effort required, and shows up in performance primarily when dealing with longer texts.

If, though, the definition of readability has retreated this far, I can see the aptness of Hrant's remarks about the mystical versus the determinate.

A potential experimental test has occurred to me. If subjects are tested on reading effectiveness using different typefaces, including ones known to be highly readable (Baskerville, Imprint, Jenson Oldstyle) and ones known to be unsuited to extended reading (Helvetica, Futura) and performance differences don't show up in tests of the kind used to debunk Bouma, then that would show the tests are missing readability, and thus we're not dealing with false subjectivity.

Also, I seem to recall that some experiments were described in Typographical Printing Surfaces that crossed the divide between legibility and readability.

William Berkson's picture

On the issue of readability: Peter Enneson and I wrote a long article on the scientific research of Matthew Luckiesh, who first introduced the concept of readability as ease of reading extended text. Luckiesh collaborated with Linotype in the late 30s, and we think his research is still relevant and important. He found that an increase in blink rate over time indicated lower readability: the text was more taxing to read in extended blocks.

Our paper was published in December in Typography Papers 9. (I see that the Princeton Architectural Press in the U.S. will also be publishing it.)

On the question of letter-by-letter reading, if I remember rightly, I think that Kevin has conceded that Peter Enneson's theory of reading by whole word patterns of sub-letter parts is not hit by most of the objections in his survey paper. The paper assumes that whole word theories are only looking at the 'envelope' of the outer outline of the word.

On the specific proposal here, I am skeptical that it will help ease readability. That is because the roman lower case, particularly, is pretty highly differentiated. So I'm doubtful that additional differentiation will be of much help.

quadibloc's picture

Hrant's comments on the readability of the Latin alphabet in another thread,

Armenian and Georgian are more readable than Latin. So is Arabic, by a big margin. Hebrew is less readable. Thai is probably slightly inferior. Korean? Makes Latin look like a village idiot.

led me to construct the sample alphabet shown above simply to ask Hrant if this was the sort of improvement the Latin alphabet could use. His full post was:

You're way too rosy-eyed towards Latin.

Actually blackletter (I mean structurally, not necessarily most instances of it) is inherently more readable* than Latin:
http://themicrofoundry.com/ss_fraktur1.html

* Not at all the same thing as "legible".
And virtually nothing to do with "literacy".

Armenian and Georgian are more readable than Latin. So is Arabic, by a big margin. Hebrew is less readable. Thai is probably slightly inferior. Korean? Makes Latin look like a village idiot.

The adoption/abandonment of a script has little to do with its readability, sadly. Especially in this age of idiotic democracy. And it has even less to do with familiarity.

Is it worth switching? Hmmm, is it worth learning math in school? It's so hard!...

So I was treating this as if it was my responsibility to find a way to save the Latin script; to improve its readability to match that of other, superior scripts, by whatever criteria Hrant is applying, in order to avoid the necessity of everyone in the Latin script world having to learn to read and write all over again. (Someone might ask: why am I taking this so seriously? Who has made Hrant Papazian emperor of the world? But while it is true there is no real deadline, if he has identified a real problem, it deserves to be examined.)

And while lower-case is quite differentiated in the Latin alphabet, it's not the case that every word has a unique overall shape; ascenders and descenders aren't that frequent. This is why I thought having more of them would improve the odds.

But it wasn't just ease of adoption that led me to make the new ones smaller; if almost all the letters have ascenders, the result is the same as if hardly any have ascenders: the words mostly have similar shapes. By giving the new ascenders a new height, I'm sure of increasing variety instead of decreasing it.

(If my approach is valid, this would also mean that the Cyrillic script, where so many lower-case letters have a small capital form, is somewhat less readable than Latin. But I am not going to attempt a reformed Cyrillic at this stage, as I don't yet know if I've understood Hrant correctly.)

So when you say,

On the specific proposal here, I am skeptical that it will help ease readability. That is because the roman lower case, particularly, is pretty highly differentiated. So I'm doubtful that additional differentiation will be of much help.

I am not disputing it. I thought that Latin was as differentiated as it needed to be too. But Hrant seems to be saying otherwise; so I am essentially here asking him if that was what he did say - or if it's something else entirely that is the limitation of the Latin script that he perceives. I'm not yet prepared to take a definitive position on whether script reform for Latin, such as the one I've tentatively proposed, or of a different nature, to improve readability is needed.

Rob O. Font's picture

But let's say you did take a definitive position on reform of Latin, a script used for say 50 languages, each with say 50,000 words, and your definition of reform is to make 50 batches of 50k boumas, each as unique as a baby.

So, "All that I neither praise nor blame...", seems to have no problems, to me, all quite different shapes, while "...has not got a lot of hot bod rot." is supposedly problematic, but I think Dr. Seuss developed reading exercises that alleviated the problem for my generation.

PabloImpallari's picture

Fun cool experiment!

It immediately reminded me of the "Relaxed" hand lettering style, so typical of the 50's advertising. For example:

We also did a fun font in a similar style some time ago:

http://www.impallari.com/projects/overview/life-savers-handlettered-stymie
Will it be more readable than... let's say... ITC American Typewriter? I don't know.

On readability/legibility:
"Legibility, in practice, amounts simply to what one is accustomed to." - Eric Gill
"The most popular typefaces are the easiest to read; their popularity has made them disappear from conscious cognition. It becomes impossible to tell if they are easy to read because they are commonly used, or if they are commonly used because they are easy to read." - Zuzana Licko

Maxim Zhukov's picture

I am not going to attempt a reformed Cyrillic at this stage

Whew… How do you like this shot at improving Cyrillic, by a certain Vladimir Obodowski? It was called Реформа печати ради глазъ (“A Printing Reform for the Sake of Eyes”), and was published in 1894 in Moscow, at The Russian Letter-Press and Lithography Printshop.

Maxim Zhukov's picture

It immediately reminded me of the "Relaxed" hand lettering style, so typical of the 50's advertising.

This just closed at the New York Public Library:


The typefaces used in the NYPL exhibit are Ed Interlock and Ed Roman, part of the “Ed Benguiat Fonts” collection (House Industries, 2004).

quadibloc's picture

It was called Реформа печати ради глазъ (“A Printing Reform for the Sake of Eyes”),

Oh, wow! Counter-free Cyrillic!

Initially, it seems to me that filling in all the counters removes the major differences between the individual letters, leaving only the more subtle details around their edges.

So my initial reaction is that this would cause a major decrease in legibility instead of an improvement.

froo's picture

So my initial reaction is that this would cause a major decrease in legibility instead of an improvement.

Indeed. I know Russian, but have had problems while reading:
ъ looks like а
н, ш and ж look nearly the same
т and и also.

If you don't know a word, or it's a long one, you read thrice.

Syndicate content Syndicate content