Critiques of the OpenType format?

ishamid's picture

Hi all,

This is my first message to this forum. I look forward to getting to know this community!

OpenType fonts are all the rage today. Are there any critiques of the format, or discussions of its limitations?

How many typesetting applications can actually take full advantage of OpenType fonts?

What are the chances that OpenType (at least some of its advanced features) will go the way of Multiple Master fonts?

Context: I am working on a Classical Arabic-script typesetting system with very high-quality typographical standards. I primarily use TeX and ConTeXt.

Best to all
Idris

John Hudson's picture

It seems to me that the majority of users are interested in having fonts that are relatively simple to use and which work consistently across a wide range of applications, and this ease of use and compatibility is more important, for most users, than having fonts that do lots of clever things but inconsistently and only if the user knows how to operate them properly. Having sliders to perfectly fine tune a typeface for a particular size and output configuration sounds wonderful, but you are expecting the user to have the same expertise as the type designer in determining what that optimised font should look like. It seems to me that most users would prefer to have a simple font labeled 'Use this for ten point text on your 600 dpi printer'.

Laurie7475's picture

Hello ChrisL, I'm new here. I was wondering if you would be able to describe the terms you are using. I've heard of Multiple Master fonts but I'm not sure what they are and how they fit into the scheme of things. Same scenario for OpenType fonts. I'm new to the tech side of design. Thanks, Laurie7475.

Rob O. Font's picture

"It seems to me that the majority of users are interested in having fonts that are relatively simple"

Bitmaps it is then! Whatever tuned you in to font format by democracy crippled your thought path. Sorry for ya!

"Having sliders to perfectly fine tune a typeface..."
You're stuck on this, I can see. If we had sliders for pair kerning, tracking and point size, would you run for cover from that complexity too? Who said all users would be "expected to have the same expertise as the type designer"? Who?

"It seems to me that most users would prefer to have a simple font labeled 'Use this for ten point text on your 600 dpi printer'..." Nice thought. Has ANYONE on this list ever made a 10 pt for 600 dpi? Any size for 600 dpi? Does anyone have plans to do so? Is it reasonable to expect founders to make fonts for the "general public" for a specific size/resolution? Have I suddenly walked into a we've-got-poop-for-DNA club? I hope not.

sander's picture

I at the very least plan to look at high-resolution instructing of fonts - for 300/600/1200 and inkjet resolutions. Whenever I manage to come up with a design that everybody else doesn't run away from screaming, that is.

John Hudson's picture

So what are you proposing, then, David? I'm having trouble following your argument. You want variations in fonts, yes? And I point out that variations imply users who are knowledgeable and competent enough to be able to select appropriate variations with some kind of user interface. Maybe there are enough such users to justify the development of some clever typefaces, but are there enough such users to interest mass-market software developers in supporting the technology? Are you suggesting a format specifically for professional, experienced typographers? If that's the world you wanted to inhabit, perhaps you shouldn't have started Bitstream and begun this whole democratisation of type mess :)

Given that the current companies developing and supporting font formats and related technology are mass-market software developers, I don't think we can be surprised or very critical of what they've done. They are supporting technologies for their user base, which is people who don't know much about typography. What's the point of hammering at them? The only way they would change is if a really big OEM client came along who wanted something more and better and had enough money to make it worthwhile.

It seems to me that, although addressing a different set of issues, the SIL folk had the right idea with Graphite: if what MS, Adobe and Apple are producing is too limited for you, design your own font format or format extension, build tools to make it, and then build applications to use it. That is the only way you are going to get away from the lowest common denominator of mass-market software.

But I'm still not sure exactly what you think this wonderful new font format should do or how it should do it, or what the specific benefits are. Do you have a spec? You are very good at making gnomic criticisms of OpenType -- I'm still not sure what the heck you were getting at with the Arabic comment --, and at telling other people that they just don't get it when they can't understand your oblique references, but what are the specifics? What should be in the great font format? And why are those features better located in the font file than in the development tool? And don't presume that I'm smart enough to follow anything vague: spell it out for me.

Heck, if I understood you, I might agree with you.

Nick Shinn's picture

Rather than a critique of OpenType, I'd like to offer an alternative, hypothetical model for a font format:

SPINE-BASED METAFONT

That's spine not spline.
Each glyph's "artwork" would be the basic, un-stroked path(s) that lie at the centre, or spine, of the glyph's stem(s).
These spines may be fleshed out by the type designer in a number of ways.
1. Most obviously, "expand stroke"; this would have sliders for width, contrast, and angle of contrast -- and these values could be varied for different sections of the spine.
2. Terminals. These could be selected from a variety of options for different serif types, including "no serif". Each of these options would be editable; for instance, "no serif" could be varied between perpendicular and round.
3. Outline Style. This would be applied to the outline eventually generated by the user. Like the Adobe Illustrator feature. This could make a rough, size-scalable texture, for instance.

These values would not be applied by the font tool and forgotten, but would be part of the font.

Then rather than calling up an outline for each glyph, the font tables would specify values for each of the "fleshing out" variables. Variations of these values could be included in the font, specifically for different output devices. End users, should they desire, could over-ride the built-in values. Compare with the Edit Kern Table feature of Quark XPress, or Multiple Masters.
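
A rough sketch of how the "flesh out" stage might work for one straight spine segment, in Python. The parameter names follow the sliders listed above -- width, contrast, angle of contrast -- but the maths here is only illustrative, not a proposal for how a real spine-based format would define them:

    import math

    def expand_segment(p0, p1, width, contrast=0.0, angle_deg=0.0):
        """Offset a straight spine segment into a four-point outline.

        width     -- full stroke width measured across the spine
        contrast  -- 0.0 centres the flesh on the spine; other values push
                     more of the weight to one side (a crude stand-in)
        angle_deg -- rotates the direction of expansion (angle of contrast)
        """
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        # unit vector perpendicular to the spine, turned by the contrast angle
        theta = math.atan2(dy, dx) + math.pi / 2 + math.radians(angle_deg)
        nx, ny = math.cos(theta), math.sin(theta)
        w_left = width * 0.5 * (1.0 - contrast)
        w_right = width * 0.5 * (1.0 + contrast)
        return [
            (p0[0] + nx * w_left,  p0[1] + ny * w_left),
            (p1[0] + nx * w_left,  p1[1] + ny * w_left),
            (p1[0] - nx * w_right, p1[1] - ny * w_right),
            (p0[0] - nx * w_right, p0[1] - ny * w_right),
        ]

    # e.g. a vertical stem whose spine runs from (100, 0) to (100, 700):
    stem_outline = expand_segment((100, 0), (100, 700), width=80, contrast=0.2)

Because the width/contrast/angle values would live in the font rather than being baked into outlines, the same spine data could be fleshed out differently per size or output device.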

The spine-based model of font design is rather like character-building in a video game (or the South Park website discussed at Typophile recently).

While it may not be as subtle as outlines in rendering display type, the spine-based format would be more amenable to joined cursive letterforms and ligatures.

It would be possible, for instance, for a display/output device to smoothly "force-join" spines that don't quite line up.

The creation of characters missing from encodings not supported by the font could be done intelligently by the user's application defaults -- such applications would have a database of spines for all encodings, compare the font with the required encoding, substitute missing characters/spines as necessary, and apply whatever the "flesh out" values of the font are.

This idea isn't new, but goes back to Donald Knuth's pioneering work of 30 years ago.

Also, it could be connected to live streaming data, as per the LettError "Twin" typeface.

This mutable basis for a font format is more appropriate to the digital era than the present one of fixed outlines.

I've just come across the SVG font format, which seems to be along these lines?

crossgrove's picture

"I’m having trouble following your argument"

Ditto. David, my issue is not that I am unimaginative or stubborn, or attached to a backlog of invested technology, but that I can't understand you. Like John, I might find that you are actually saying things I agree with and could respond to. I find your messages very cryptic. Nevertheless, I'm very interested to know about your ideas.

dezcom's picture

Laurie,
The Cliff notes version is:
Multiple Master fonts were originated in the 80s by Adobe as a way to generate variations of different fonts based on axis (or master). A pair of axis might change weight from light to bold. Another pair of axis might change condensed to extended. Users could dial in anything between the extremes designed into the axis pairs. The idea was to let the user select the degree of boldness or extendedness, or whatever. After several years of production, Adobe abandoned the concept as an end product. However, Multiple Master (MM) axis are used today by type designers in programs like FontLab to help generate variations of weight, etc. It is now mostly an interpolation tool and assumes some degree of cleanup by the designer afterwards, but much less than starting from scratch for each variation. I am sure there is a much better explanation on the Adobe site. You can also do a search on this site and find insightful dialogue including people who have had a role in MM development.

OpenType is a developed set of standards put together by a combination of groups, the two most influential being Adobe and Microsoft. The consortium worked hard to address the shortcomings of previous standards and came up with a cross-platform solution which addresses multiple language and script usage, along with including what used to be "Expert Sets": small caps, oldstyle figures, stylistic alternates, ligatures, and contextual alternates. The real info on OpenType is best read from the Adobe and Microsoft web sites.
A Google search will get you more resources than you ever dreamed of. Again, search this site as well, where you will find plenty of comments from Adobe and Microsoft staff as well as some very savvy indie font designers.

As a start, here is a wiki entry on OpenType:

http://typophile.com/wiki/OpenType

ChrisL

John Hudson's picture

A small correction, Chris: not a pair of axis (sic - the plural is axes) but a single axis between a pair of principal masters. So a single-axis MM font had two masters. A two-axes MM font had four masters. And so on. The maximum number of axes allowed by the format was four, i.e. sixteen principal masters. [I use the term principal masters, because in later versions of MM, Adobe introduced the idea of intermediate masters, i.e. a master part way along an axis between two primary masters, which would affect the interpolation on either side of it.]
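
For anyone following along, a minimal sketch of what interpolation along one such axis amounts to, in Python. The coordinates are invented; a real MM instance interpolates every point of every glyph (and every metric) between the masters in exactly this way:

    light = [(50, 0), (90, 0), (90, 700), (50, 700)]    # one stem of the light master
    bold  = [(40, 0), (140, 0), (140, 700), (40, 700)]  # the same stem in the bold master

    def instance(light_pts, bold_pts, t):
        """Linear interpolation at axis position t (0.0 = light, 1.0 = bold)."""
        return [((1 - t) * xl + t * xb, (1 - t) * yl + t * yb)
                for (xl, yl), (xb, yb) in zip(light_pts, bold_pts)]

    semibold_stem = instance(light, bold, 0.6)

With n axes there are 2**n principal masters and the same arithmetic runs axis by axis; intermediate masters, where present, bend the result part way along an axis.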

dezcom's picture

Thanks John! I hope Laurie reads your correction too.

ChrisL

PS: I was hoping the Axis was not evil :-)

Rob O. Font's picture

No. These are the Axis of Owl.

"So what are you proposing, then, David? I’m having trouble following your argument."

Progress. Do you John, Simon, Thomas, others, take the OT format to be your legally wedded format, 'till death do you part? (Not a rhetorical question).

"That is the only way you are going to get away from the lowest common denominator of mass-market software."

I had no idea that was the goal, making "the lowest common denominator of mass-market software" font format. All this time, I thought: all those scripts, all that resolution, all those customers, devices, and all that type design, were going to require the highest common denominator. /\/\/\:-o (head Scratch.)

"But I’m still not sure exactly what you think this wonderful new font format should do or how it should do it, or what the specific benefits are. Do you have a spec?"

Spec? I'd need to see, or hear, source code for OT-TT & AT&T & the rasterizers, again?

ishamid's picture

Thoughts:

Reading everyone's comments, it seems to me that, as far as the long-term (maybe long-long-term->) future is concerned, we need something like an OOo (OpenOffice.org) approach. Something as important as a universal font format should be an open source as well as openly developed thing. For example, the state of Mass. (following an international trend) has just mandated switching to open document formats for electronic communication.

Between the work of people like the LettError crew, creative ideas like those of Nick Shinn and SIL, and the cooperation of font creation developers/managers like Adam and George Williams, there is no reason why the type-developer community cannot or should not take the lead here. If the community acts proactively, and unites around a common approach, the commercial interests will follow its lead in the end, and help to make it successful.

A nice thing about OOo is that, since it's open-), developers can use it as a real life testing ground for ideas; there is no commercial interest to stop you. For truly high-end typesetting there is TeX, also open to hack and develop.

If we get an open standard going, meeting the needs of both the average user and the high-end typesetter, commercial interests will not wait long before jumping on the bandwagon...

Idris

paul d hunt's picture

is this the type of thing you're alluding to?:
OpenType, now more open

sander's picture

umm... dberlow, can you explain some things? Like who that new spec would be useful for? How one would decide which parts of OpenType are the insufficient ones? What would convince the app writers who have so far hardly embraced OpenType that something new (in fact something new with hardly any font support) was going to be worth their while? What would convince font foundries (doesn't matter if small or large) to support it?

ishamid's picture

Hi Paul,

is this the type of thing you’re alluding to?:

That is a start, but I was also referring to a concern that has come up in this thread, including dberlow's comments, viz., that the format was designed largely with commercial interests in mind. If so, then typographers and high-end typesetters should take the lead on the Next Big Thing (tm), and not merely follow the lead of the commercial giants. OOo and TeX provide free, open source platforms for testing new ideas in both the mundane word-processing world and the high-end typesetting world, so we don't have to "wait" for the commercial interests to act first; let them follow the community's lead.

Best
Idris

ishamid's picture

Hi Sander,

Caveat: Obviously, I do NOT speak for dberlow;-)

who that new spec would be useful for?

high-end typesetters perhaps? average users who need more flexibility?

How one would decide which parts of OpenType are the insufficient ones?

Open Source community style, focusing on the needs of typesetters and interested users.

What would convince the app writers who have so far hardly embraced OpenType that something new (in fact something new with hardly any font support) was going to be worth their while?

Use OOo and TeX for testing and implementation, and the commercial app writers will follow suit, if only to compete.

What would convince font foundries (doesn’t matter if small or large) to support it?

If a few foundry owners are involved in designing the spec, others will come along. The present format was developed by "companies who are overwhelmingly not type founders", to quote David. OK, then get the foundries involved!

Best
Idris

billtroop's picture

Coming late to this thread, I would like to make a few corrections and observations and apologize in advance for length:

John Hudson writes,

'FontLab only supports primary master interpolation (but as a design tool it also supports extrapolation, thanks to Adam Twardoch, which is a nice feature that MM didn’t have).'

John, a moment's reflection will make you realize that this statement is untenable. All MMs are extrapolable. Think about it. There is nothing in an MM font that prevents this. All you need is a 'Creator' app that will allow the user to extrapolate. Adobe did, indeed, have a version of ATM (could it have been 2.4?) that permitted extrapolation. The feature was taken out because it was felt that the instances were unpleasing and did not reflect Adobe's quality image - always the risk with extrapolation. But any app developer that wished to could write a 'Creator' that did extrapolation. In other words, this is not an issue of the font format, it is an issue of the software used to support it.

'What should be in the great font format? And why are those features better located in the font file than in the development tool? And don’t presume that I’m smart enough to follow anything vague: spell it out for me.'

Let me explain why David is so impatient, John, with the thinking behind this remark. This goes back to the late 1980s and Sumner Stone's reign at Adobe. It had long been realized that multiple axes were a desirable design tool, thanks to the pioneering work of the late Stephen Harper (among others - and have I got his name right?). But it was Sumner who had the idea of putting variable type in the user's hands, and not just the type designers, and then went on to sell John Warnock on the idea. Warnock did indeed eventually embrace it, and of course MM is still a critical component of Acrobat and for that matter any Acrobat-alike, such as OS X Preview. In other words, it is already 20 years ago that someone had the idea of putting variations and other features into users' hands. Having lived through this 20 years ago, David is naturally surprised that the issue should still be up for discussion.

Mark Simonson writes 'There were only a small number of MM fonts built, mostly by Adobe, and you needed ATM to use them.' Mark, I am unaware of any MM font that cannot be used by any app in XP, OS 9 and OS X. ATM is 'hard wired' into XP, OS 9 and, roughly speaking, OS X.

On the other hand, it was as difficult as you like to use the fonts to their fullest capabilities. In this sense you are quite correct.

Thomas writes,

"Yes. Unfortunately, our experience with MMs was that applications were not very interested, and users did not easily “get it.” Of course, one could put this down to insufficient evangelism (my own explanation of it)."

I disagree with this. One program and one program alone ever had the MM interface down to the point where anyone could enjoy using MMs: Lari Software's Lightning Draw GX which worked on System 7.5 through 8.1 and perhaps beyond. Unfortunately, although the revelatory LightningDraw feature set has driven the bulk of Adobe Illustrator improvement from versions 7 through 9, Adobe itself never got the interface right. The problem wasn't evangelism. The problem was that Adobe never showed app developers how to do it right, and never helped font editor developers to make MM development easy. The reason it didn't help font editor developers was because it wanted to keep all good MMs to itself. But Adobe really shot itself in the foot by not committing interface engineers to showing everyone, especially its own app developers, how to make MMs both easy and fun to use. This is really a pity. And of course, as David points out, MMs are hopelessly primitive compared to Variations and other successor technologies.

Paul D Hunt writes,

"dream format: multimaster opentype fonts"

Paul, way back in the infancy of time, say 1997-2000, this was the format that everyone expected to have. MM was withdrawn from the OT spec because Microsoft believed it would be too much trouble for its app developers to support. This is right in the sense that it would be difficult for app developers to support well, in the absence of built-in OS support such as Apple had demonstrated with GX. But it is not right in the sense that the beauty of MMs is that the app doesn't have to know it is looking at an MM. As far as the app is concerned, an MM can be any other font. And in spite of all of that, MS deserves credit for being the first app developer to support the optical axis in MM fonts automatically (as in mid-1990s versions of MS Word). Another word for your dream format is Apple's GX, from the mid-1990s, which did just about everything in MM and OT, and quite a bit more. GX almost became a dual Mac/Win standard, but there were some last-minute hitches over money. This time around, instead of getting USB2, which really works, we got OT, which really doesn't.

What David is really talking about is that there is no provision in type technology for optimally implemented type. And that's because, as he instances, few here have ever had to design a 10 pt type for 600 dpi rasterization. Few here have any concept of what a type can, and should, do beyond its basic unitary design. Type needs to be better, and at the same time it needs to be more fun.

I think it is unnecessary to spend time on the issue of MMs and OT, because I think it is clear, as some at Adobe and other companies believe, that MMs will rejoin OT as soon as OS's and apps run out of other features and need some novelty. In two years, five years, or ten years, we will see MMs in OT again. But, as David would point out, what is the point of doing something so old-fashioned when so many more advanced things could be done? It should be happening now. The painfully simple technology we have today has a bad effect on designers because it doesn't challenge them to go to the extremes of their craft.

Rob O. Font's picture

I was asked to critique the OpenType format, and I did. Now, I've been asked to tell you who would use this stuff, and I already did. You should r e a d it again if you didn't get it, but I firmly believe that real Chinese and fancy Arabic CANNOT BE MADE for TEXT & DISPLAY, or for SCREEN AND PRINT, to the standards of even metal, with OT. I pointed this out several ways, including to say that I think anyone who believes they are doing the same for Latin is a Liar. Even the great Verdana, e.g., is not suitable for anything but screen and printer agate.

Should a "Standard" be so limited? I have asked a dozen questions here and gotten no answers, just more baby questions :)

billtroop's picture

I have a question. Earlier, Thomas spoke of how big an advance OT is, how it shouldn't be surprising that apps can't deal with it yet (10 years later), how 'difficult' everything is.

Wait a minute, Thomas. Apple did this all with GX in the mid 1990s, and they had the code for both Windows and Mac. So where's the difficulty? Where's the 'advance'? In another ten years we might have a bit more OT progress, a bit of MM, but what is that compared to what Apple coded for both platforms in 1995? Something is wrong with this picture, because here we are ten years after OT and neither MS nor Adobe yet supports it fully in their own apps. Under these circumstances, why would, and how could, anyone else support it? And as David points out, if it really is useless for Chinese and Arabic ... what's the point?

Here's my dummy's question. Simplifying the issues to an extreme, is it roughly true to say that the philosophy of GX was to make advanced typography features easily available to app developers whereas by contrast, with OT, the burden is on the app developer to implement advanced typography features himself?

And if OT really is a cock-up, as it appears it is, isn't it better to admit that now and try to fix it before any more app developer time is lost? We've had ten years of excuses. That isn't good enough.

And finally, isn't it time for everyone - Adobe, MS, and even Apple - to look at GX again as a model? GX was not just about type, after all. The point of GX was to provide built-in code for all advanced graphical application development so that, for example, an Illustrator-quality app could be built to version 2.0 quality in six months and to version 8.0 quality in 12 months, and this by a tiny team. I give just one example. LightningDraw was the first bezier curve program to offer transparency. That's because GX had it built in and all the app had to do was call it. That happened around 1996. It took Corel, Adobe, and Macromedia another three years or so of phenomenal effort to get transparency into Illustrator, Draw, and Freehand. What a waste of intellect and shareholder dollars.

Now look at Adobe and MS and their investment in their own complex apps which now enjoy near-monopoly status. And you wonder why they don't really want to make things better for other app developers.

Now look at Berlow's comment, roughly to the effect that foundries have, to all intents and purposes, been excluded from OT development.

Now ask yourself: if you're going to stealth in a closed type situation, what better thing to call it than _Open_ Type?

What a complicated situation this is. Where do we go from here? Where are the people of intellectual calibre adequate to lead us out of this mess?

John Hudson's picture

Hello, Bill. I don't think anyone is going to look at GX again, least of all Apple. But this relates back to what I said earlier in the thread, that you can't divorce type technology from the strategic and economic interests of OS developers unless you take full responsibility for the entire rendering model, plugging into the system resources only when you need to or find it efficient without compromising what you want to do. There are good examples of software that does this, mostly specialised software dealing with e.g. music layout or mathematics typesetting. I mentioned SIL's Graphite as another example: a model that allows SIL to provide support for scripts and languages without waiting for support from MS or Adobe and without being limited to the Mac OS and AAT support.

The great thing about the sfnt file format is that it is so readily extensible and is a great basis on which to build new stuff. There is no reason at all why one couldn't spec new tables to support things one wanted and no reason why one couldn't make tools to build such tables. And now that OpenType is becoming an ISO standard (as part of the MPEG standard) there is a mechanism for having such tables formally recognised as part of the OpenType format without relying on MS or Adobe (although the international standards process is not for the impatient or easily frustrated). But getting something into a font spec, or even getting the tables into a font, is not the same thing as getting the layout behaviour supported in applications. This is where you run up against the strategic and economic interests of the software developers, and so long as you are relying on them for the software to actually use the font you will have to deal with this problem.
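
To make the "readily extensible" point concrete, here is roughly what adding a private table to an sfnt looks like with the fontTools library; the table tag, payload and file names are made up, and of course nothing will consume the new table until some layout engine is taught to:

    from fontTools.ttLib import TTFont, newTable

    font = TTFont("SomeFont.ttf")      # any TrueType- or CFF-flavoured OpenType font
    custom = newTable("DEMO")          # unregistered tags come back as raw-data tables
    custom.data = b"whatever experimental layout model you want to prototype"
    font["DEMO"] = custom
    font.save("SomeFont-extended.ttf") # every existing table passes through untouched

Getting such a table recognised by a spec, and then supported by applications, is the hard part described here, not getting it into the file.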

So perhaps the answer is to consider high quality typography as a specialist software goal (which is basically what TeX did, although the results were not always so high quality), and not something we should expect from mass market software.

ishamid's picture

This is where you run up against the strategic and economic interests of the software developers, and so long as you are relying on them for the software to actually use the font you will have to deal with this problem.

This is exactly where alternatives like OOo and TeX can play a very fruitful role. Free and open source.

So perhaps the answer is to consider high quality typography as a specialist software goal (which is basically what TeX did, although the results were not always so high quality)

In the case of TeX, I think we have to distinguish the capabilities of TeX from the limitations of its users, who are not generally typographers. Also, LaTeX makes high-end typography difficult because many things are hard-wired in that format (though the relatively recent memoir package alleviates this).

Any high-end software can be used to make mediocre output. Yet few if any layout programs can match what a competent typographer/typesetter can do with TeX-) particularly in some areas like critical editions and similar complex layout, including control of orphans, widows, typesetting on a grid, etc.

The newer format ConTeXt gives non-programming users a degree of access to typographical sophistication with TeX that was previously accessible only to expert TeX programmers. When added to the microtypography features of pdfTeX, there is little that ConTeXt cannot do when it comes to high-quality typography; it is truly an amazing engine and perhaps the best-kept open secret in the high-end typesetting world.

Here is a draft manual explaining and illustrating some high-end typesetting control capabilities of TeX:
http://www.pragma-ade.com/general/manuals/style.pdf

See especially pages 29--38

It's not your grandfather's (or even your father's) TeX anymore;-)

Best
I

billtroop's picture

'I said earlier in the thread, that you can’t divorce type technology from the strategic and economic interests of OS developers unless you take full responsibility for the entire rendering model, plugging into the system resources only when you need to or find it efficient without compromising what you want to do. There are good examples of software that does this, mostly specialised software dealing with e.g. music layout or mathematics typesetting.'

John, I want to reread carefully what you've said earlier - perhaps tomorrow. For now I just want partially to parse the passage here. Today, two principal app makers - MS and Adobe - have indeed taken full responsibility for the entire rendering model. MS, because it's in full control of the rendering model for Windows, and Adobe, because of its no-longer-so-top-secret plan, Project Whatever-it-is-called, which has now been fully implemented, whereby each Adobe app (e.g. Ill, PShop, InD) has its own entire rasterizing system built in. Yet in spite of all this control, their own apps don't have anything near full OT support, and indeed, far from full support, each app seems to support a different and, remarkably, exclusive subset, if I at all understand what is going on. People are being remarkably careless with their claims here. For instance, earlier Thomas claimed something to the effect that when OT came out, more apps supported it than supported MMs when they were discontinued. But partial support is not full support. It is also possible to say, and I did, that every app on Mac and Windows supports MMs to this day. And they do: but partially, not fully.

I would also like to say that nothing can be more confusing to the user - certainly not slider bars - than any OT font, because there is no standard character set. You know the argument: what do I get new when I buy Adobe Bodoni in OT? Answer: not a single new glyph. Ah, but wait a minute. Answer me this: why did I have to buy a new set of Utopia OT because I couldn't get Utopia Type 1 to work in InDesign CS?

uh - er - umph. Gotta run.

John Hudson's picture

DB: I firmly believe that real Chinese and fancy Arabic CANNOT BE MADE for TEXT & DISPLAY, or for SCREEN AND PRINT, to the standards of even metal, with OT

I don't know enough about Chinese typography to comment on that, and I'm not sure how you compare SCREEN typography in OT with what was available in metal type -- a singular case of apples and kumquats --, but I do know quite a lot about Arabic typography. What can be done with OpenType for Arabic is better than 99% or more of the metal typesetting of Arabic ever produced. In fact, there is really only one example of metal typesetting that surpasses what has currently been done with OpenType, and that is found in one half of one book printed in the Ottoman Empire near the turn of the 19th-20th centuries. It is only found in one half of the book presumably because the system was so difficult to get right that the compositors seem to have stopped paying attention in the second half; perhaps they had a deadline to meet and couldn't spend so long selecting the appropriate special sort to use in each instance. I have not analysed this edition, but I would be surprised to find that there was anything in it that could not be modelled in OpenType. As long as you are dealing with a discrete set of fixed-shape sorts that are positioned relative to one another, there isn't a lot you can't do in OpenType with some ingenuity in the layout lookups. What OpenType cannot do is the kind of thing Tom Milo is doing with his Arabic engine, but then no previous Arabic typesetting system has been able to do what Tom is doing, which is to directly model both the rules and the order of the classical calligraphic styles. This involves dynamically adjusting the glyph shapes on the fly and breaking the text string into stratified layers and writing them successively, anticipating what will be in the subsequent layers. Neither of these things can be done in OpenType, but then they couldn't be done in any other typesetting system either.
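
For readers unfamiliar with how far "fixed-shape sorts plus layout lookups" gets you: here is a toy Python model of the positional-form selection that a shaping engine driving init/medi/fina/isol features performs. The glyph names and joining classes are illustrative only; real shaping is driven by Unicode joining data and the font's own lookups:

    DUAL_JOINING  = {"beh", "seen", "hah"}          # connect on both sides
    RIGHT_JOINING = {"alef", "dal", "reh", "waw"}   # connect only to the preceding letter

    def joins_forward(g):   # can this letter connect to the next one?
        return g in DUAL_JOINING

    def joins_backward(g):  # can this letter connect to the previous one?
        return g in DUAL_JOINING or g in RIGHT_JOINING

    def positional_form(prev, curr, nxt):
        """Pick the isolated, initial, medial or final glyph for curr."""
        to_prev = prev is not None and joins_forward(prev) and joins_backward(curr)
        to_next = nxt is not None and joins_forward(curr) and joins_backward(nxt)
        if to_prev and to_next:
            return curr + ".medi"
        if to_next:
            return curr + ".init"
        if to_prev:
            return curr + ".fina"
        return curr + ".isol"

    def shape(glyphs):
        return [positional_form(glyphs[i - 1] if i else None, g,
                                glyphs[i + 1] if i + 1 < len(glyphs) else None)
                for i, g in enumerate(glyphs)]

    print(shape(["beh", "seen", "alef", "beh"]))
    # ['beh.init', 'seen.medi', 'alef.fina', 'beh.isol']

Everything beyond this -- ligatures, special sorts for particular neighbours, mark positioning -- is more lookups over a fixed glyph set, which is exactly the boundary drawn above: what OpenType cannot do is reshape the glyphs themselves on the fly.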

Tom's technology is another good example of a specialised typesetting system, appropriate for a high end publishing system employing the classical script with all bells and whistles. And it is crucially important that such technology exists, but you wouldn't necessarily want to use it for your Arabic e-mail. There are technologies and levels of implementation that are appropriate to different circumstances, and the level of complexity that you want for one purpose might simply be an inefficient processing overhead in another situation.

Rob O. Font's picture

"The great thing about the sfnt file format is that it is so readily extensible and is a great basis on which to build new stuff."

:) @ack

"I don’t know enough about Chinese typography to comment on that, and I’m not sure how you compare SCREEN typography in OT with what was available in metal type — a singular case of apples and kumquats"

Yes...Well, here's the way it works. The keys tinkle different, to begin where all such discussions of "universal" font formats should, with the users. First, the users who input, and then the users that read the result. (Assuming that the mill of commerce has taught the former to obey the latter even before being told to do so by the Central Party;)).

In this alternate key-tinkling universe, the individual characters (40,000 or so), representing what we call words or phrases, are built up of a small number of strokes (18 or so), representing the not-so-long-ago abandoned manual strokes produced with the hand. This small number of strokes plays out into hundreds of variants as they are combined into radicals (parts of whole characters closest in Latin to something like a letter, syllable, or word), and again slight variations occur in strokes and in radicals as they are combined to form complete characters. But back at the input, there are a number of ways to accomplish it: from hunting for the complete character, to typing a phonetic translation in another script, to building radicals up into characters.

The broadest method coming into general use is the input of radicals to form complete characters, and the future trend is towards the most intelligent computer-aided input possible, which would allow several parallel and instantly selectable systems (i.e. the user can float amongst hunt and peck, input by phonetic, or by radicalanji). And what the readers want is fine Chinese, which is to say as much refinement to the strokes as the designer can provide to "even out" the great variety of density and complexity in the Characters. That is the market for text, where I think all "universal font format" critiques rest, but those ideas extend to nearly all applications. The important message I took away from learning this is/was that the best and future input method, and the best output result, are both achieved by careful preparation of fonts to make them able to react through the underlying strokes and radicals of the characters,,, to the input and to the output conditions — then making a font by combining the parts in the font tools and removing overlap is obviously the wrong solution for the long term (a long term which happens to be now relative to when Apple published their Eastern Scripts the first time...).

Why? "I firmly believe that real Chinese and fancy Arabic CANNOT BE MADE for TEXT & DISPLAY, or for SCREEN AND PRINT" — this would be obvious to anyone who's designed, or claimed to design, baby scripts like Latin for multiple-size use (and/or low resolution). In addition, when you throw the resolution problem at stroke- and radical-unaware Chinese fonts, well, it's worse than having an elephant squat upwind of you in a hurricane... e.g. I once was asked to estimate where it might be safe to rasterize the simplest Chinese design without hints, grey scale, or dropout control, so I proofed the first two levels of the Japanese Industry Standard character set (14,000 or so) and located a character with 33 transitions, e.g. 16 black strokes and 17 white spaces. So the answer was 67, and that is what I still give as an answer when anyone asks about that sort of number for any script... 67.
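
For what it's worth, here is one reading of how that 67 falls out of the 33; the two-pixels-per-feature rule is an assumption on the editor's part, not something stated above:

    black_strokes = 16
    white_gaps = 17
    features = black_strokes + white_gaps   # 33 alternating black/white bands
    min_pixels = 2 * features + 1           # ~2 px per band to survive any pixel phase
    print(min_pixels)                       # 67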

This is why I critique this format so harshly: A. Without the use of more intelligent composites and the capability of variations at input/output, not fontput(!), OTKaput 20?? in China. B. Without the employment of more intelligent people in the driver's seat, or intelligent and talkative passengers, we only have the best "Latin First!" font format in OT.

And...Metal is not the object, it is a Marker.

John Hudson's picture

Okay, so it turns out I know more about Chinese typography than I thought, since none of this was new to me.

The broadest method coming into general use is the input of radicals to form complete characters, and the future trend is towards the most intelligent computer-aided input possible, which would allow several parallel and instantly selectable systems.

If this is true, then it will presumably drive technologies to support such input, including font technologies.

Rob O. Font's picture

"Okay, so it turns out I know more about Chinese typography than I thought, since none of this was new to me."
:) I know; you ignore this wisdom in Latin type too.

"What OpenType cannot do is the kind of thing Tom Milo is doing with his Arabic engine, but then no previous Arabic typesetting system has been able to do what Tom is doing, which is to directly model both the rules and the order of the classical calligraphic styles"

:) In some parts of the world, and I know I'm not saying anything new here, there is a great deal of interest in skipping metal typographic influence, going straight from calligraphy to computer composition. I see this need in Every Culture. The problem for Arabic is one that you don't see, even though you're staring Milo's stuff in the face?

ISO (aka I Surrender, Okay?) didn't get hints, features, interpreters or rasterizers, did they? Did they get anything more than Kontours and Kerning? (ISO gets hand-me-downs that no one thinks will afford more competitive efforts.) These "assets" are then frozen in a bureaucratic Tar Pit, while the important things, mentioned as not in ISO's sticky lit'le hands, will continue to divide and retard. The boat has been rocked biannually, as I promised long ago.

William Berkson's picture

Nothing to do with the topic of the thread, but a correction:

>not-so-long-ago abandoned manual strokes produced with the hand

>coming into general use is the input of radicals to form complete characters, and the future trend is towards the most intelligent computer-aided input possible

AFAIK, neither of these statements is accurate. I was recently in China. Everybody is trained to do decent looking characters--which are not easy--and I was struck that every broken-down shop has hand-written signs and posters that generally look quite competent, and sometimes excellent. The tradition of writing characters is strong, and alive. (They generally don't understand that roman characters look wrong when written with a round brush, but that is another matter.)

The current method of inputting Chinese characters into the computer is through the romanized phonetic Chinese, Pin Yin. You type the phonetic pronunciation in roman characters, hit the tone number (1-4), and the character pops into your text. Everyone in China is now taught Pin Yin writing along with Pu Tong Hua (Mandarin), so all young people can use the computer in this manner, if they have gone through elementary school. There is no reason to think that inputting through strokes or radicals is the wave of the future.
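
A toy sketch of the input flow just described -- syllable plus tone number selects a character. The three-entry table is only an illustration; a real IME offers a candidate list, since many characters share a syllable and tone:

    CANDIDATES = {
        ("ma", 1): "妈",   # mā, mother
        ("ma", 3): "马",   # mǎ, horse
        ("ma", 4): "骂",   # mà, to scold
    }

    def input_char(syllable, tone):
        return CANDIDATES.get((syllable.lower(), tone), "?")

    print(input_char("ma", 3))   # 马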

The traditional education involved simultaneously teaching water color painting and calligraphy. I thought I heard that this is being dropped, and the use of romanization for the computer in this way may threaten the general excellence of Chinese writing. By the way, you can't judge Chinese graphic design by the horrible examples you see in China Towns in US cities. The general level of graphics in Chinese cities is much better. How they handle words in roman text and logos, which are fashionable, is often clumsy, though.

John Hudson's picture

The problem for Arabic is one that you don’t see, even though you’re staring Milo’s stuff in face?

Our difference is that I see the need as being particular to circumstance, and I'm not convinced that the technology required to do what Tom does is the same technology as I need for other kinds of Arabic text processing. This is not simply because his technology is slower than OpenType, but because even OpenType itself is too slow when it tries to emulate (imperfectly) calligraphic models. The problem is in using processing models of great complexity where a simpler model is appropriate to the particular circumstances. You might be willing to wait for a page of the Qur'an to be slowly rendered in immaculate emulation of the calligraphic norms, but you don't want to wait the same amount of time for an email using a much simpler style of the script to be displayed. This is why I favour multiple technologies tailored to different circumstances, rather than a one-size-fits-all technology, whether that is OpenType or something else.

[UPDATE 1 May 2007: the above comments about the relative speed of OpenType vs Tom Milo's ACE layout engine were based on secondhand claims. Tom challenges them, and I have not had the opportunity to make a direct comparison myself. Tom's new Tasmeem plug-in for InDesign ME will hopefully provide such an opportunity, and I'll be more than happy to withdraw the above comments. For now, they should be taken as reportage of what I had been told, with full acknowledgement given to Tom's disagreement: 'ACE beats OT hands down. It was already fast on a 4Mhz computer in 1986. In Tasmeem there is no appreciable difference in rendering speed between ACE and OT - yet the ACE tasks are far more elaborate.']

I recently made an OpenType font that supports correct display of the Hebrew Bible text. In order to handle the complexities of the Biblical text, my font ended up being many times slower to process than a typical Hebrew font designed for use in modern Hebrew. The trouble is that my font remains that much slower even when used for modern Hebrew text. This is an example of what I am talking about: complexity that is appropriate for the circumstances of typesetting the Hebrew Bible is not appropriate for reading Hebrew email. In this case, the complexity is within the font, but in Tom's case the complexity is in the rendering engine. In either case, the complexity is excellent and necessary for its intended purpose, but is an inefficient overhead when the technology is employed for other purposes.

Si_Daniels's picture

>ISO, (aka, I Surrender, Okay?) didn’t get hints, features, interpreters or rasterizers did they?

ISO gets the complete OpenType specification 1.4 (which is already part of MPEG) with some implementation-specific text and company names and trademarks removed - that's their rules. Once it's an ISO standard I wouldn't be surprised if Apple proposes the addition of their variations stuff. I look forward to seeing what you, Erik & Just, Bill and others propose too.

sander's picture

Is an ever more perfect rendition of calligraphic Arabic according to centuries-old rules really the future (and end-all) of Arabic typesetting?

ishamid's picture

Hi Sander:

Is an ever more perfect rendition of calligraphic Arabic according to centuries-old rules really the future (and end-all) of Arabic typesetting?

There are two schools of thought on this. One school, perhaps the dominant school, says, "Hey, let's adjust Arabic-script typography to fit Western digital typography technology; forget the manuscript tradition". The other says, "No, let's create a technology that can bring the manuscript tradition into the age of digital typography just as it was brought into the age of metal-type at the turn of the 20th century".

In my own opinion, there has been very little that comes out of the first school that is aesthetically pleasing at all. The first school represents a defeatist attitude that is unfortunately quite common. It is also unfortunate that the only available book on Arabic typography promotes this defeatism.

There is a perhaps hidden subtext to your question that needs fleshing out: the relation of calligraphy to manuscript writing in Arabic is a bit different from that relationship as it exists in European history. Although Islamic civilization was aware of printing technology before Europe, it was hardly accepted for serious works until the 20th century. It was never a matter of technology but of aesthetics. In the context of a Muslim civilization that maintained its territorial and intellectual independence, there was the spine to accept metal press technology only on its own terms. As Imperialism took root and the new, coopted intellectual classes began to worship everything European, that backbone gave way to the defeatist attitude that dominates so much of Arabic typography to this day.

I cringe when I read Arabic books today, just like Knuth cringed when he saw the first printing sample of one of his books using the then nascent digital typography. But instead of being defeated, Knuth spent ten years designing a system that can be programmed to implement the bulk of traditional typography plus be ready for future developments.

That is exactly what Arabic needs today, so that books can be enjoyable to read again. And so I can get on with my own critical editions...

Thomas Milo takes the second approach; OpenType capabilities help as well.

By the way, one can take a proactive approach to traditional Arabic-script typography without being reactionary. Once the proper rendering algorithms, etc. are available, we can build on tradition and move into the future from a truly authentic foundation, as opposed to one based on the limitations (wrt Arabic-script) of Western digital typography technology.

All the best
I

Rob O. Font's picture

There is something called Adjacency. The issues of Adjacency manifest themselves in different ways in each of the major scripts. Having separate technologies to deal with them is as stupid as I've ever heard, but Cutting Edge Luddites have their entertaining moments, don't they!? :) Is there a type word, beyond more kerning pairs, for typographic adjacency,,, please?

With Latin, and with the exception of connecting scripts and a few other type styles, in our everyday working fonts we've herded our adjacency issues into Kerning Pairs, and, standing by our love of long-necked "f's", we do all sorts of things. This had been a nontrivial exercise from Gutenbaby's time until mechanical composition nearly killed even the f ligatures. Later, just to make sure we didn't find ourselves behind in an adjacency crisis, we let the beast spread from the simple problem of adjacent characters in a font to the far more profitable and complex problem of adjacent style issues. Chinese solves its adjacency "up front", in a sense, caching all the combinations of a myriad of parts into each character, and solving the issues well enough to set characters in any direction and dimension or on any curve, without kerning or substitution.

Arabic has a few styles without adjacency issues, mostly simplified, scratched and modern, I think. It also has some with moderate, or easily predictable, adjacency that I'm sure are solved from metal to OT. But the further back (or forward!;)) you go, perhaps in relation to some "golden age" of leisure time (or a future age of super computing power in a pack of gum), you find what I'd call roiling adjacency, where, yeah, it'd take a little while for the final forms to settle as issues resolve themselves among the input (BUT SO WHAT!). It could be unique when it finally stood still, depending on that input. But we'll not know for a while yet; we're still making up our dumb "a**" logos. ;)

The important think (!) to remember, as this seems to cause such great shirking in the name of progress, is that when you put a trailer hitch onto the back of your car it does not automatically slow your car down. In fact, you have to hook it up to something for that to happen. Another important think to remember, even though you probably didn't know it before, is that this problem could be defeated much like the towing problem, by adding a hook that allowed composite parts to be stacked (like trailers rollin' down the highway), instead of being processed independently, and all existing fonts'd still work JUST FINE. In addition, real designers would also be able to make parts react to composition the way the brain sometimes wants to read 'em.

"That (stack-based composites and variations), is exactly what Arabic needs today, so that books can be enjoyable to read again. And so I can get on with my own critical editions…"

nadine_chahine's picture

Sorry for joining in late... I'll try to be quick as there's a lot to be said and never enough time.

>There are two schools of thought on this. One school, perhaps the dominant school, says,
>“Hey, let’s adjust Arabic-script typography to fit Western digital typography technology;
>forget the manuscript tradition”. The other says, “No, let’s create a technology that can
>bring the manuscript tradition into the age of digital typography just as it was brought into the age of metal-type at the turn of the 20th century”.

There's another approach to this, and it fits neither school: we are here to solve problems. If the design problem is a new one, then the solution will be new. If it's an old problem, then old solutions would work. I refer here to the aesthetics and not the technology, as rendering calligraphic designs is still a work in progress and still unavailable to a large audience.

Example 1: An Arabic signage face for an airport is a new problem (i.e. 20th century and later) and as such, when you design it, you don't think: hmm, what would the Ottomans do? Rather, you look at what the problem is (legible from a distance, etc.) and then see what style of the script fits best and how you can make sure that it fits the design brief. Obviously, not a Thuluth or a Nastaliq.

Example 2: An Arabic face for the setting of a poetry book. Well, there's a large repertoire of script styles for that, wonderful! Call Tom Milo and you'll be happy.

I think we miss out on a lot if we say that one or the other is a better school of thought. The important thing is that the type needs to function.

A slightly off-topic note: one does not need complex technology to make good Arabic typefaces. The heritage of "unhappy" design solutions is not just because of the technology. It has a lot to do with the designers as well. Take a look at Tim Holloway's Markazi font. It can be easily represented with OT today, and it is one of the most avant-garde and fresh designs ever created for Arabic, absolutely wonderful. It is obvious that he is a talented designer who understands two important things: 1) the nature of the script, 2) what a typeface needs to do.

ishamid's picture

Hi Nadine,

You're right, of course; we are here to solve problems. The "two schools" thing is a meta issue wrt the best way to approach solving the problems. What some of us believe is that, even in designing e.g. an airport logo, technological access to the tradition is important, and new trends can be developed on that basis. And one need not be reactionary, as Mohammad Zakariya points out.

Could you point me to a sample of Tim Holloway's Markazi font? I'd like to see it-)

I do not deny that good work can be done even with limited technology. But with appropriate (as opposed to "complex") technology one is even better situated to solve problems: one has more choices.

BTW: I am sure I've seen Nastaliq airport logos before, in Iran, Pakistan, maybe even Kuwayt...
-))

Thank you for your comments!

Best
I

ishamid's picture

Hi David,

Could you flesh out your adjacency/stack-based-composites-and-variations idea a bit more? I have forwarded your ideas to the people working on the next generation of the TeX engine; would you be willing to consult on making these ideas more precise for us?

Feel free to contact me off-list to discuss this. It is all open source but there may be a funding option-)

Best
I

John Hudson's picture

David, yes. What you are saying now is much less vague (thank you), and I agree.

Idris, you can see a sample of Tim's Markazi type in the awards section of Language Culture Type. I rate Tim as one of the greatest living type designers, but his work is often overlooked because it has been exclusively in the realm of non-Latin scripts. If you have the Adobe Acrobat 7.0.5 upgrade, which implements support for Arabic, Hebrew and Thai, you should have the new Adobe Arabic fonts somewhere on your system. This new design by Tim is a good example of what Nadine is talking about: Arabic type design starting from a question of purpose (in this case the quite broad application of modern business communications in a wide range of languages -- the sample below is Persian).

Rob O. Font's picture

Hi Idris,

"Could you flesh out your adjacency/stack-based-composites-and-variations idea a bit more?"

I can try. Once you have the composites on a stack, you have two sorts of opportunities to deal with. One is the additional design issues: how and when the different parts are called and what else you can do to them in combination with x-y movement, scaling, rotation, location in variation space and hints, all at your disposal to best solve the "local" marking and adjacency issues and improve to either a wider range, or a more precise target, of quality for output. There's a lot more to say about that to make it clear in detail, but the single most important thing is that without the right APIs, the founder must precache all the "arrows" he or she wants to "target" at "users". The second opportunity is of course in the output: how does the stack with interacting components and variations get the best possible transformed image to the rasterizer, and then to the user, who presumably asked for it in the first place, completing a loop with the type designer, if they're good? Today's font APIs don't dare ask much of the user, OS or beyond, and therein lies the barrier: a sparsely populated loop of possibilities that practically no one can breach for the masses of PC users.
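
A sketch of what "composites on a stack" might look like as data, in Python; the field names and the example parts are hypothetical, and the point is only that each part carries its own placement, scale, rotation and location in variation space rather than being flattened into a single outline:

    from dataclasses import dataclass, field

    @dataclass
    class Component:
        glyph: str                     # the part: a stroke, radical, or letter piece
        offset: tuple = (0.0, 0.0)     # x-y movement
        scale: tuple = (1.0, 1.0)
        rotation: float = 0.0          # degrees
        axes: dict = field(default_factory=dict)  # location in variation space

    @dataclass
    class CompositeGlyph:
        name: str
        stack: list                    # components, applied bottom to top

    kaf = CompositeGlyph("kaf.medi", stack=[
        Component("stroke.base", offset=(0, 0), axes={"wght": 0.7}),
        Component("stroke.flag", offset=(120, 430), rotation=-4.0, axes={"wght": 0.7}),
    ])

A renderer that understood such a stack could transform and adjust the parts against their neighbours and the output conditions before rasterizing, completing the loop back to the designer described above; today's tools flatten the stack and remove overlap at font-production time instead.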

TeX, bless its heart, has been applied to every single script, application and occupation, but has never made a mass audience happy. I will gladly help you in and out of list, but not for hire. Nadine, John, and many others have said that the spec comes from the users. I think that the spec "they" need (i.e. for what could be advertised as a "World Wide Operating System") is what we've got now, plus a "few minor tweaks": a new old table or three, and an API adjusted here or there to "ask more." The major problem lies in the will of three companies. Yes, I can arrange some sort of thing with ISO all by myself to change the Written Spec., but it'd be so much more effective if the companies who were making billions of users wear the typographic equivalent of cardboard shoes to formal occasions actually did it. They are the API Barons, and without really getting them overboard, as I'm sure SII can attest if the Seahawk Bowl hasn't completely consumed his reason, canoeing without a paddle comes to mind.

And John, (thank a higher deity of your choice), agrees with me fully, or at least I've rocked enough so he can see over the side,,,;) which means I can call my brain surgeon off defcon 4, where he was fully prepared to downgrade me to a clam shucker, or pack me off to a grape stomping school, he's not quite sure, my brain surgeon having received a C- in the phrenology of type from the University of Southwestern Kamchatka at Murlké. This Arabic Specimen is absolutely gorgeous! Crisp, Clean and Precise. For numerous applications, I'd think it'd be perfectly suitable. But on the other hand, I could easily imagine applications where it would be cold, monotonous and inappropriate. But I can also see a relative of this type squirming and twitching around slightly into 37 different word spaces, 4 different 2-dot-accent variations, 246 versions of the 75 glyphs present, not just for better line endings, but then it'd be appropriate for something else, and then something else...for something else.

John Hudson's picture

This Arabic Specimen is absolutely gorgeous!

I'm very pleased to report that today we received word that Tim's Adobe Arabic design was selected in the TDC type design competition.

ishamid's picture

Hi David,

I can try. Once you have the composites on a stack,

Thnx for the clarifications. I also got some feedback from the pdfTeX team on your original suggestions, which I will forward soon.

TeX, bless its heart, has been applied to every single script, application and occupation, but has never made a mass audience happy.

With ConTeXt there is now hope-)

I will gladly help you in and out of list, but not for hire.

That's fine, it's all open source anyway-) Thank you very much for offering your help. I turned my email setting on so you can contact me directly (and I don't need to invite spam by posting it publicly):

http://typophile.com/user/10590/contact

We are preparing for heavy movement in the pdfTeX team so now is the best time to get things right-))

Thank you again David.

Best
I

sander's picture

Idris: There are two schools of thought on this. One school, perhaps the dominant school, says, “Hey, let’s adjust Arabic-script typography to fit Western digital typography technology; forget the manuscript tradition”. The other says, “No, let’s create a technology that can bring the manuscript tradition into the age of digital typography just as it was brought into the age of metal-type at the turn of the 20th century”.

But do you see these two sides as the only two possible sides? Besides, aren't these technological sides, not really related to font design and looks as such? Is part of what you are saying about bringing the manuscript tradition into the 20th century akin to being able to properly spell naïve? Or am I confused?

I'm also still mystified by the lack of evidence, in the form of additional features (which can do pretty much anything[1] with just the existing lookup types), of a push in the direction of change. After all, if it is being held back, surely pressure for it, with accompanying patches for pango / qt, would have caused some to exist by now?

[1] and by anything I mean that you can build a virtual machine equivalent to a Turing machine
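
To make [1] a little less abstract, here is a rough Python sketch, with invented glyph names and rules: chained contextual substitutions are just rewrite rules over a glyph string, which is the sense in which you can push them toward arbitrary state machines.

# Rules and glyph names are made up; this is not any real shaping engine.
def apply_contextual_rules(glyphs, rules, max_passes=10):
    """Apply (context_before, target, context_after, replacement) rules
    repeatedly until nothing changes or max_passes is reached."""
    for _ in range(max_passes):
        changed = False
        out = []
        i = 0
        while i < len(glyphs):
            for before, target, after, repl in rules:
                if (glyphs[i] == target
                        and (not before or (out and out[-1] == before))
                        and (not after or (i + 1 < len(glyphs)
                                           and glyphs[i + 1] == after))):
                    out.append(repl)
                    changed = True
                    break
            else:
                out.append(glyphs[i])
            i += 1
        glyphs = out
        if not changed:
            break
    return glyphs

# e.g. swap to a contextual form of 'beh' when it precedes 'reh'
rules = [("", "beh.medi", "reh.fina", "beh.medi.alt")]
print(apply_contextual_rules(["seen.init", "beh.medi", "reh.fina"], rules))
# ['seen.init', 'beh.medi.alt', 'reh.fina']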

ishamid's picture

But do you see these two sides as the only two possible sides? Besides, aren't these technological sides, not really related to font design and looks as such?

The two are related. Without the adequate technology, font design suffers. Why design a serious font that implements fine traditional typography if there is no engine to render it?

Again, I don't deny that there are works of beauty out there with the limited technology, but even those designs could be better if there were a technological paradigm appropriate for Arabic out there.

The ligature approach is extremely inefficient as far as traditional Arabic is concerned, and puts far too much burden on the type designer. With the correct technology one can do really fine Arabic typography with no ligatures, and type designers would spend less time on any given advanced font.
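
To make the contrast concrete, here is a rough Python sketch; the glyph names, offsets and the simplified position logic are all invented, and real Arabic shaping also depends on which letters can join.

# Ligature approach: every frequent letter combination gets its own drawn glyph.
LIGATURES = {
    ("lam", "meem"): "lam_meem",
    ("lam", "meem", "hah"): "lam_meem_hah",
    # ...hundreds or thousands more, each a separate drawing
}

# Contextual approach: a handful of joining forms per letter, plus
# per-pair attachment offsets an engine could look up or compute.
JOINING_FORMS = {
    "lam":  {"init": "lam.init",  "medi": "lam.medi",  "fina": "lam.fina",  "isol": "lam"},
    "meem": {"init": "meem.init", "medi": "meem.medi", "fina": "meem.fina", "isol": "meem"},
}
CURSIVE_ATTACH = {  # (left glyph, right glyph) -> vertical offset of the joined pair
    ("lam.init", "meem.fina"): -60,
}

def shape_contextually(letters):
    """Pick a joining form per letter from position alone (a simplification)."""
    shaped = []
    for i, letter in enumerate(letters):
        if len(letters) == 1:
            pos = "isol"
        elif i == 0:
            pos = "init"
        elif i == len(letters) - 1:
            pos = "fina"
        else:
            pos = "medi"
        shaped.append(JOINING_FORMS[letter][pos])
    return shaped

print(shape_contextually(["lam", "meem"]))   # ['lam.init', 'meem.fina']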

Is part of what you are saying about bringing the manuscript tradition into the 20th century akin to being able to properly spell naïve?

Yes, that is part of it. For example, there is no application that makes it easy/painless to implement vowels and diacritics. Try editing/correcting anything more than a couple of vowelized sentences in Word or anything else and you'll see what I mean.

Best
I

ishamid's picture

Hi David,

The following is from Taco Hoekwater, one of the pdfTeX core developers:

``I have to read through that thread more carefully, but I think I understand what David is talking about. What he has written so far is not very helpful in the practical sense, so I am looking forward to some more detailed elaboration from him. Saying that glyph resolution is a matter of 'Adjacency' is like stating that hyphenating words is a matter of 'Syllable recognition'. Definitely correct, but not very practical (yet).

``I find it intriguing that this is very close indeed to the Knuthian concept of a ligature&kern-program. I suspect this also brings the greatest flaw in his proposition to the spotlight: for the average user it will be too hard to program such a stack.

``Perhaps the solution for character-to-glyph mapping could be something like this: Assume a 'script' definition as in Unicode. Add a 'language' setting, and a 'typographic style'. Call that a 'Script environment'. All Adjacency within such a Script environment can be written down independent of actual fonts. You have to make some combinations in a Script environment optional, of course, so that font families do not have to be perfect.

``The font-specific information then comes down to a quantification of the current 'Script environment's parameters, and that is something that should be doable for a non-programming typophile.''
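
Here is how I read Taco's 'Script environment' idea, as a minimal Python sketch; every class name, rule and number below is invented for illustration.

from dataclasses import dataclass

@dataclass
class ScriptEnvironment:
    script: str             # as in Unicode, e.g. "Arab"
    language: str           # e.g. "fa" for Persian
    style: str              # typographic style, e.g. "naskh"
    # Adjacency behaviour described independently of any font:
    # rule name -> whether a conforming font must support it.
    rules: dict

@dataclass
class FontQuantification:
    """The font-specific part: numbers for the environment's parameters."""
    values: dict            # rule name -> font units (missing if unsupported)

    def supports(self, env: ScriptEnvironment) -> bool:
        return all(name in self.values
                   for name, required in env.rules.items() if required)

naskh_fa = ScriptEnvironment(
    script="Arab", language="fa", style="naskh",
    rules={"kashida_min": True, "dot_clearance": True, "swash_final": False},
)
my_font = FontQuantification(values={"kashida_min": 120, "dot_clearance": 40})
print(my_font.supports(naskh_fa))   # True: optional rules may be missing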

Let us continue this discussion-)

Best
Idris

Thomas Phinney's picture

Various comments:

coming into general use is the input of radicals to form complete characters, and the future trend is towards the most intelligent computer-aided input possible....

Even if this is true, it is essentially independent of fonts. Each collection of radicals needs to be mapped to a character, and this can happen before or after mapping to glyphs in the font. Or contrariwise, a character could be mapped down to a collection of radicals for drawing purposes. You still need to know the character regardless.

Incidentally, there is more than one proprietary Asian font development environment that uses a stroke-based system to assemble glyphs from collections of radicals. But that's a separate question from what format the font ships in. It's akin to the fact that we can use multiple master tech as a design tool regardless of what format we ship the final fonts in.

There is something called Adjacency. The issues of Adjacency manifest themselves in different ways in each of the major scripts. Having separate technologies to deal with them is as stupid as I’ve ever heard....

I agree. It's a good thing that OpenType uses the same technologies of GPOS, GSUB, and contextual layout regardless of the script. Sometimes the feature tags differ, but that's pretty trivial, really. (Of course, one could say something similar about GX/AAT as well.)

Is there a type word beyond more kerning pairs, for typographic adjacency,,, please?

Contextual layout behaviors. This encompasses kerning and more. Note btw that kerning in OpenType is not restricted to pairs....
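
For instance, here is a rough Python sketch of what class-based plus contextual kerning amount to as data; the classes, glyph names and values are invented, not taken from any real font.

# GPOS allows class kerning and contextual positioning, not just flat pairs.
LEFT_CLASSES = {"T": "T_LIKE", "Y": "T_LIKE", "V": "V_LIKE", "W": "V_LIKE"}
RIGHT_CLASSES = {"a": "ROUND", "o": "ROUND", "e": "ROUND", "A": "DIAG"}

CLASS_KERNING = {          # (left class, right class) -> adjustment in font units
    ("T_LIKE", "ROUND"): -90,
    ("V_LIKE", "DIAG"): -120,
}

def kern(left, right, context=()):
    """Class kerning plus one toy contextual rule (tighten after a period)."""
    value = CLASS_KERNING.get(
        (LEFT_CLASSES.get(left, left), RIGHT_CLASSES.get(right, right)), 0)
    if context and context[-1] == "period":
        value -= 20        # contextual extra adjustment, purely illustrative
    return value

print(kern("T", "o"))               # -90
print(kern("V", "A", ("period",)))  # -140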

BTW, David, Chinese does not solve the adjacency problem by grouping radicals into characters. Chinese solves the adjacency problem by then making those characters monospaced, at least in metal and AFAIK film composition systems. However, Chinese writing is not necessarily monospaced; the typesetting systems imposed new constraints upon the pre-existing script. (All of this is true for Japanese as well, btw.)

WRT GX/AAT, whether or not Apple ever developed a version for Windows is irrelevant - they never deployed it on Windows, and they refused to license it to Microsoft. Result: GX/AAT is and has always been a Mac-only technology in any way that matters.

For multiple master fonts, ATM Light is required on Windows 2000 and XP to get them to work. (Perhaps the default instance will show up without ATM, I forget how that worked.)

I'm not going to respond to Bill's comments and speculations, regardless of the truth or falsehoods in them. I did a big public presentation at ATypI a while back on why MM fonts went away, and I was there when the decisions were made. Bill can believe what he wants.

Cheers,

T

Rob O. Font's picture

"more than one proprietary Asian font development environment that uses a stroke-based system to assemble glyphs from collections of radicals. But that’s a separate question from what format the font ships in"

I'm sorry Thomas, but this is not correct. Imagine for a second that you were in charge of the corporate annual report, but the font your company uses does not come with the Warnock glyph. This is to say: it may be true for Latin that there can be a clean and complete break between the technology of the founder and that of the user (though I, for one, and apparently the only one, do not believe it), but the universal requirement for where that break goes is broader. Broad enough, in fact, that I believe some users need access to composite controls, and even more need it for variation control, or, as I said before, foundries of quality are forced to ship hundreds of styles per family.

"BTW, David, Chinese does not solve the adjacency problem by grouping radicals into characters. Chinese solves the adjacency problem by then making those characters monospaced,"

Or not. I don't think I understand your point, but that may be because you don't understand mine? "Chinese does not solve the adjacency problem by grouping radicals into characters" could only mean that you don't consider the Chinese character to be a word, or what? They solve the problem by caching the word into a uniform cell, and then using those cells uniformly, for the most part.

Idris,

I'm a bit confused by Taco.
"I suspect this also brings the greatest flaw in his proposition to the spotlight: for the average user it will be too hard to program such a stack."

Does he mean the average type-street user, or the average main-street user? There is no intent for the average user to ever even know the stack exists, or that a stack could or should exist for them to look for... So maybe the issue in discussing this is that I have no idea what common ground Taco, you and I have on anything from hinting to semantics. I mean, it took me two or three weeks to convince a few people around here, and they and I should be, more or less, on the same patch of intellectual ground. ;)

ishamid's picture

Hi David,

I’m a bit confused by Taco.

It seems to me that the point is to put a mechanism in place where the typical user can, at a high level, define how TeX uses a given font for the kinds of advanced capabilities that we have been discussing, and do it in a way that fits your ideal of

and all existing fonts’d still work JUST FINE

I don't think TeX is concerned about hinting and related font technology (should it be?). The main idea I get from your comments is a mechanism to implement advanced-script needs with at least a reasonable degree of font format agnosticism.

What Taco is trying to say (I think) is that, the way Knuth did it, the "ligature&kern-program" is done at a level to which the normal user would not have high-level access. But for issues of adjacency some high-level access may be required because the various script-issues involved may not be reducible to a single paradigm, as in the case of traditional Western typography. TeX may implement, for a given advanced script, a generalized model, and users can give the parameters for a given font.
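
A rough Python sketch of the distinction as I understand it, with all names and numbers invented: (a) a Knuth-style ligature&kern program lives inside one font and stays opaque to the user, while (b) a generalized per-script model lives in the engine and a font only supplies values for its parameters.

# (a) per-font program: pairs of glyphs -> action
LIGKERN_PROGRAM = {
    ("f", "i"): ("LIG", "fi"),
    ("A", "V"): ("KRN", -80),
}

# (b) script-level model with font-supplied parameters
ARABIC_ADJACENCY_MODEL = ["kashida_min", "kashida_max", "dot_clearance"]

def resolve(model_params, font_values):
    """Fill the script model with one font's numbers, defaulting to 0."""
    return {p: font_values.get(p, 0) for p in model_params}

font_a = {"kashida_min": 100, "kashida_max": 600, "dot_clearance": 35}
print(resolve(ARABIC_ADJACENCY_MODEL, font_a))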

But perhaps this makes font format agnosticism difficult, because you need something like an AFM file to provide the needed parameters for generalized kerning, a.k.a. adjacency. Anyway, the engagement of typophiles like you is a great asset as the right course of action is plotted.

Note that I prefer to use 'advanced' instead of 'complex' or 'problem' with respect to scripts like traditional Arabic and the associated technology. The latter two adjectives have pejorative connotations that feed an incorrect framing of the overall problem. To say that ``this problem/complex script requires complex technology'' basically frames the issue in a way that lets font designers and software companies off the hook...
Best
I

PS. The font that typophile.com uses has a right single quote that is out of harmony with the left single quote. I think that a site like this one should look into that-)

John Hudson's picture

The font that typophile.com uses has a right single quote that is out of harmony with the left single quote. I think that a site like this one should look into that

Idris, ` is not a left quote, it is a grave accent (U+0060). You should use the ' (U+0027) key for both left and right quotes, and let the clever algorithm figure out how to display them 'thus'.

ishamid's picture

Thank you John. I fixed it but the algorithm is still inconsistent: ``'' works for dbl-grave+dbl-apostrophe but the analogous situation does not work for sngl-grave+sngl-apostrophe. Indeed, I cannot input a single quote pair at all without a context. More consistent would be to force the use of the dbl-quote symbol twice, which does work.

So the typophile algorithm should either remove the dbl-grave+dbl-apostrophe or add support for sngl-grave+sngl-apostrophe==>left-right quotes.

-)

Best and Thnx
Idris

William Berkson's picture

Hmm. David and Thomas, are you saying that Chinese language programs assemble Chinese characters on the fly from radicals?

I don't think this is true. Also it seems mind-boggling to think of a program that could do this, given that there are none, it seems, that automate kerning and bolding that well. Assembling the radicals would also involve rescaling as well as bolding and thinning, and the characters splash around more than roman, so the fitting problems would be greater.

In 'Now Read This,' on the new Microsoft ClearType fonts, they say that the Kanji characters (= Han Zi = Chinese characters) for the font Meiryo were drawn by a team. The first 3000 were drawn, and "carefully analyzed as models to make subsequent production of the rest as automated as possible." The others were produced by the Japanese firm C & G, "using their in-house automated production method working with the analyzed data of kanji radicals." I gather from this that they needed additional information on each radical--perhaps for just this typeface--as to how to scale it, and perhaps also needed to look and correct.

In any case, all the Kanji characters are in the font, and not assembled on the fly. As I said, Chinese is input using the Pinyin romanization, and I would guess that Japanese is input using the Japanese syllabary, hiragana, rather than radicals.

Rob O. Font's picture

" TeX may implement, for a given advanced script, a generalized model, and users can give the parameters for a given font."

I see. I think this should say:
TeX may allow the implementation of a generalized model, for a given enriched-opportunity script ;). Then typographers can create the fonts and the typography required for this opportunity, and users can interact, if they need to, with the parameters for a given font's typography, or in cases of extremely rare opportunities, with individual characters within the font. But in general, the users are specifying no more than line length, leading, point size and other "last opportunity" variables, and the font/TeX takes the other cares away...

"Hmm. David and Thomas, are you saying that Chinese language programs assemble Chinese characters on the fly from radicals?"

Hmm? There are some, for sure, that assemble on the fly, which is to say: when the character is called by the font manager, little parts recipes and interpreters whirl out Chinese glyphs. My guess is that when the MS and Apple fonts were being specified, the imagination of the specifiers was to ask the Japanese partner for "The" definition of "Quality". That definition was of fonts made in digital format to mimic a size of a font made for metal or film. Narrow specification complete, for wide application; figure out the input yourselves... Other font makers "over there" have other standards of quality more appropriate to applications outside of print, where vast glyph armies that don't scale, rotate, skew, contour, shadow or color well are... less useful to the market.

billtroop's picture

Thomas writes,

'I’m not going to respond to Bill’s comments and speculations, regardless of the truth or falsehoods in them. I did a big public presentation at ATypI a while back on why MM fonts went away, and I was there when the decisions were made. Bill can believe what he wants.'

Thomas, I think you should hold the heat. This is a pretty fact-oriented discussion, and any time someone says something that someone else doesn't understand or disagrees with, there is appropriate give and take until the issue is worked out. We can all understand if you just want to vent ire, but if you have a specific problem with something I've said, tell us what it is, and back up your position with some verifiable facts. Do not waste everyone's time with 'I forget how that worked' kinds of remarks. 1. You should know. 2. You should spend the two minutes it takes to find out before you waste the bandwidth. Everyone realizes that you are not one of the towering intellectual giants of type development. But tidier mental habits would bring you a lot closer to that goal, and your intellect, functioning at its possible best, is desperately needed in the type industry. In the meantime, let's focus on what is so importantly emerging here: a potential synergy between Berlow and these TeX people.
