So: Why did it take ten years to get ClearType turned on by default?

joeclark:

According to Dick Brass, a former Microsoft VP, it was internal political sabotage:

“[O]ther Microsoft groups… felt threatened by our success.

“Engineers in the Windows group falsely claimed it made the display go haywire when certain colors were used. The head of Office products said it was fuzzy and gave him headaches. The vice president for pocket devices was blunter: He’d support ClearType and use it, but only if I transferred the program and the programmers to his control. As a result, even though it received much public praise, internal promotion and patents, a decade passed before a fully operational version of ClearType finally made it into Windows.”

(Via Vince Connare’s Facebook.)

Si_Daniels:

> So: Why did it take ten years to get ClearType turned on by default?

I'd put it closer to 8 and a half years, and the answer in my opinion is "because of CRTs and Vista". The technology, announced towards the end of 1998, missed Windows 2000 (it came in a bit too late) and shipped with Windows XP in 2001 - that's three years. With CRTs still in the majority it didn't make sense to make it the default, although many laptop OEMs, including all tablet makers, did turn it on. I think it was always the plan to make it the default setting for Longhorn (based purely on LCD panel penetration), but the schedule slipped - Vista shipped in January 2007, about 8 years after ClearType was announced.

Richard Fink:

I like the irony in the article's title: "Microsoft's Creative Destruction".
It's an allusion to economist Joseph Schumpeter's phrase "creative destruction", coined to describe how the process of entrepreneurial innovation destroys what came before.
Schumpeter was being laudatory. "Creative Destruction" is a good and necessary thing.

Brass - or an editor at the NYTimes - cleverly uses the term to describe another kind of destruction: one that frustrates and defeats exactly the kind of innovative birthing that Schumpeter's phrase was coined to describe.

Cool. A double entendre. Somebody's on the ball.

henrypijames:

The more fundamental question is: Why should sub-pixel rendering be turned on by default at all? Despite all the arguments for it and against aggressive hinting (as laid out in extensive detail by the Anti-Grain Geometry project), it remains a subjective and personal decision whether "the trade-off between sharpness and functionality" is worth it.

I've tried ClearType many times over the years, and every time I end up turning it off after a few days because I continue to find it unpleasant. Has anyone ever done a scientific study (with a large sample size) on whether the "common" computer user actually prefers one over the other, and by what margin?

quadibloc:

Except for processor speed issues, the case for ClearType would seem clear. After all, compare how easily one can read real typefaces in movie credits or titles on an ordinary TV set with how hard computer-rendered text is to read at a similar resolution.

It is true, of course, that if sub-pixel rendering is allowed to turn hinting off, you will get situations where a lowercase "e" looks like a dot. So one does need a high enough screen resolution before it is useful.
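The trade-off being debated above can be sketched in a few lines. This is a hypothetical, simplified illustration (not Microsoft's actual ClearType algorithm, which also applies a color-fringe filter): sub-pixel rendering treats each LCD pixel's R, G, B stripes as three independent horizontal samples, so a thin stem that grayscale anti-aliasing would smear across a whole pixel can instead light a single color stripe.

```python
def grayscale_render(coverage, width):
    """Conventional anti-aliasing: average 3 sub-samples into one intensity per pixel."""
    return [sum(coverage[3 * i:3 * i + 3]) / 3 for i in range(width)]

def subpixel_render(coverage, width):
    """ClearType-style idea: map 3 adjacent sub-samples to one pixel's (R, G, B) stripes."""
    return [tuple(coverage[3 * i:3 * i + 3]) for i in range(width)]

# A vertical stem one-third of a pixel wide, landing on sub-sample 4
# of a 3-pixel-wide scanline (9 sub-samples in total).
coverage = [0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0]

gray = grayscale_render(coverage, 3)  # stem diluted to a 1/3-gray pixel
sub = subpixel_render(coverage, 3)    # stem lands fully on one green stripe
```

Running the sketch, the grayscale pass yields `[0.0, 1/3, 0.0]` while the sub-pixel pass yields `[(0, 0, 0), (0, 1, 0), (0, 0, 0)]`, which is exactly quadibloc's point: the extra horizontal resolution preserves a thin stroke, but at the cost of rendering it in color rather than black, hence the sharpness-versus-fringing trade-off (and the "e" collapsing to a dot when hinting is disabled at low resolutions).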
