Redesigning the human eye

John Hudson's picture

There's been a lot of discussion about readability over the past few years, on Typophile and elsewhere, much of it focused on the question of whether readability can be significantly improved at the type design level, or whether we're looking at diminishing returns with no significant gains in reading speed and comprehension ahead of us.

Today, while reading a book on animal perception and behaviour (Animals in Translation by Temple Grandin), it struck me that we're going at this from the wrong end. Rather than designing new typefaces to try to gain some small improvement in readability of text, we should look at redesigning the human eye to make it much better at reading. I was prompted to this thought by the observation that, while human eyes have a concentration of cones in the round fovea at the back of the eye giving us sharp focused vision in a small circular area, many animals -- especially predators and prey in open plains -- have a horizontal concentration of cones, giving them a vertically shallow but much greater horizontal sharpness of focus. [This is why you often see e.g. sheep dogs lower their heads to observe a field of sheep: they are aligning their 'streak' of high focus sight.]

It seems to me that if human eyes were redesigned to have a similar horizontal concentration of cones instead of a round fovea, we would be able to massively extend the length of our saccadic jumps, and perhaps even read entire lines of text in a single fixation. Of course, this presumes we are dealing with horizontal text; Chinese and Mongolian readers may wish to have their new eyes fitted with vertical concentrations of cones. Actually, the ideal situation would be to have the arrangement user-adjustable, so that one could use a slider to move between a round fovea to a horizontal or vertical arrangement of cones depending on the task one was doing.

Of course, many people might object to the idea of having their eyeballs surgically altered in order to improve reading speed, even more than they object to having their writing systems cut up and rearranged. But if we do eventually reach the stage of being able to cure blindness with artificial eyes, i.e. by plugging into the visual cortex and feeding it signals, one could conceivably have a set of reading eyes, just as one now has reading glasses, which would not replace your normal eyes but be carried about and plugged in when one wants to read something.

hrant's picture

This reminds me of a speculative article I read in the late 70s in OMNI magazine: countries were genetically modifying their athletes to perform better at the Olympics. I remember the Soviet boxer simply could never be knocked out: his brain was in his butt.

But I don't mean to sound dismissive: even if something
seems crazy-impractical it can still teach us things.

hhp

hrant's picture

But I do think there's a far less invasive approach that would yield equivalent if not better results than restructuring the retina (and I've mentioned it once or twice here and there): dynamic typography that changes depending on where you're looking; text that's further from the fovea is simply set larger, to give more bouma resolution in proportion to acuity loss along the parafovea. Some simple eye-tracking hardware/software should do it.
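The scaling rule described above can be sketched in a few lines. This is a hypothetical illustration, not a real eye-tracking system: it assumes required letter size grows roughly linearly with angular distance from the gaze point, and the half-falloff constant `e2` is an assumed value, not a measured one.

```python
def scaled_point_size(base_size_pt, eccentricity_deg, e2=2.0):
    """Return the point size needed at a given eccentricity.

    base_size_pt: size that is comfortably readable at fixation.
    eccentricity_deg: angular distance from the gaze point, in degrees.
    e2: eccentricity (degrees) at which the required size doubles
        (an assumed constant for illustration).
    """
    return base_size_pt * (1.0 + eccentricity_deg / e2)

# Example: 10pt text at fixation would be set at ~30pt at 4 degrees out.
for ecc in (0, 2, 4, 8):
    print(ecc, scaled_point_size(10, ecc))
```

In a real system the eye tracker would feed the current gaze position into a layout engine that re-renders each word at the size this function returns for its eccentricity.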

> sheep dogs lower their heads to observe a field of sheep

Actually that works a little bit with humans too.
As does looking away from something to see movement better.
Try it!

hhp

hrant's picture

And I've reminded myself of a very good reason not to dump your rods: movement. Rods are much better at detecting it than cones. Plus for humans, movement tends to come from the sides... Like if you ever have to drive through a blind intersection, look straight ahead.

Yes, I'm a big fan of speeding up reading, but
I'm a bigger fan of not having car accidents! :-)

hhp

John Hudson's picture

Ah, but I don't drive :)

Actually, when you suggested that I should try looking away from something to better see movement, the first thing I noticed is just how still everything around me is. The movement of the fronds on the tips of the cedar branches is so subtle and gentle, you actually have to be staring straight at it to see it at all.

Shifting or dumping the rods to the left and right of the fovea would certainly be bad for many human activities. This is why the new eyes should be adjustable, or else there should be separate eyes for different tasks. People with vision problems have different kinds of glasses for different purposes, so it isn't difficult to imagine different kinds of artificial eyes, or eyes that adapt to particular tasks in the way that polarising glasses adapt to different light levels.* We would expect such a technology to correct vision impairment -- and won't that be a welcome end to the large-print edition! -- but why not also provide task-specific vision enhancement. You could also have driving eyes to go with your natty driving gloves, with extra rods to detect movement even faster.

* Speaking of polarising, here's another thing I learned today: 'Dung beetles can perceive the polarisation of moonlight'. That's incredibly cool; even cooler than redesigning eyeballs.

hrant's picture

> I don’t drive :)

But do you cross roads? :-)

> you actually have to be staring straight at it to see it at all.

Maybe it depends on the scale and frequency, because rods' greater general capacity for detecting movement is well-established. Another anecdotal test: you can see the interlacing on certain computer monitors only via the parafovea.

Speaking of animal vision, one of the most impressive is that of the eagle: where the human lens can only be made globally thinner or thicker to bring objects at various distances into focus on the fovea, the eagle's lens can change thickness all across its surface; the entire field of vision, not just the few degrees right in the middle, is kept in focus. Wait, maybe this sort of thing could be almost as good for reading as rod/cone stuff? Actually I guess you'd need to do both...

"Yes doctor, I've decided to go for the full RetRes with
LensMorph option! I have to read War & Peace tomorrow."

hhp

bieler's picture

John and Hrant

You might want to read this

http://jeb.biologists.org/cgi/content/abstract/116/1/385

The complete article with bibliographic references is also available in PDF form.

You might want to stick to type design.

Gerald

The Bieler Press
http://BielerPress.blogspot.com

YvesPeeters's picture

Another interesting read about online text can be found at:
http://venturebeat.com/2007/05/10/live-ink-offers-better-way-to-read-tex...

dberlow's picture

"Actually, the ideal situation would be to have the arrangement user-adjustable, so that one could use a slider to move between a round fovea to a horizontal or vertical arrangement of cones depending on the task one was doing."

Most SciFi seems to render this as an enclosed lens attachment with multiple modes affecting the particles of a liquid or gas solution filling the space between eye and lens. I would like to see a representation of the finished surgical solution before committing either way, though.

Kevin Larson's picture

It's just a matter of time before this comes to pass. For deafness caused by damage to the cochlea, it's possible to place a microphone near the ear and feed the results directly into the auditory nerve.
http://www.cochlearamericas.com/Products/11.asp

There's no reason that we won't be able to do the same thing to bypass retinal problems, such as retinal tears. It would just take a camera and an understanding of what information needs to be fed to each neuron. This is more complicated only because there are many more neurons involved than with hearing.

If we're able to use a microphone to replace the initial step of hearing and a camera to replace the initial step of vision, then there are all kinds of possibilities for modification. For example, it's possible today with cochlear implants to change the range of hearing. The normal range of perfect hearing is 20 Hz to 20,000 Hz. But what would happen if the inputs were changed to 0 to 50 Hz? Would the wearer be able to listen to seismic activity?
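The range change described above could, in principle, be a simple linear transform from an input band onto the band the listener perceives. A toy sketch (the function is my invention for illustration; real implant processors do far more than this):

```python
def remap_frequency(f, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a frequency f from the band [in_lo, in_hi]
    to the band [out_lo, out_hi]."""
    t = (f - in_lo) / (in_hi - in_lo)   # position within the input band
    return out_lo + t * (out_hi - out_lo)

# Map an infrasonic 0-50 Hz input band onto the normal 20-20,000 Hz range:
print(remap_frequency(0, 0, 50, 20, 20000))    # lowest rumble maps to 20 Hz
print(remap_frequency(50, 0, 50, 20, 20000))   # 50 Hz maps to 20,000 Hz
```

A perceptually better mapping would likely be logarithmic rather than linear, since pitch perception is roughly logarithmic; the linear version is just the simplest possible statement of the idea.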

Kevin Larson's picture

I believe the ganglion cells in the retina are the first cells that detect motion. In the fovea the ratio between cones and ganglion cells can be as low as 1:1, but convergence increases with distance from the fovea: in the periphery the ratio between rods and ganglion cells can be as high as 1000:1.

Under normal daylight conditions we only use the cones for vision, as there is too much light for the rods to be effective. In low light conditions we use our rods for vision. It is difficult to read in low light because there are no rods in the fovea. The ganglion cells detect motion with data from the rods or cones.

John Hudson's picture

The normal range of perfect hearing is 20 Hz to 20,000 Hz. But what would happen if the inputs were changed to 0 to 50 Hz? Would the wearer be able to listen to seismic activity?

Yes, and also the conversation of elephants bellowing at infrasonic frequencies.

hrant's picture

Actually hearing is much more complex than a microphone can pick up.
There's a phenomenon called binaural hearing where the outer ear -which
is of a telling complexity- plays a very large role.

Now try to gauge the true complexity of perception...
It's E B Huey all over again!

> bypass retinal problems

What about just bypass the retina and feed text directly into cognition?

hhp

Kevin Larson's picture

> What about just bypass the retina and feed text directly into cognition?

I certainly was thinking about that. There is a realistic possibility of understanding, in the near term, the neurons that exit the eye. The entire brain consists of 100 billion neurons and 100 trillion synapses, and will remain a mystery for my lifetime. But imagine eventually being able to send waves of full sentence units directly into the cortex… there could be some serious speed benefits.

dberlow's picture

"What about just bypass the retina and feed text directly into cognition?

No more of those pesky scripts or messy differences in type styles, ueowkkk. Just call us the cognophiles. Hi mom, I know I just finished explaining this to you over the last 30 years, but I'm now a cognographer. I make cogneatos for people who have cognogs installed in their cognoggens by cogknowitalls.

"imagine eventually being able to send waves of full sentence units directly into the cortex"
You want me to read this to you, I can do so?

John Hudson's picture

Actually hearing is much more complex than a microphone can pick up.
There’s a phenomenon called binaural hearing where the outer ear -which
is of a telling complexity- plays a very large role.

Binaural hearing is the auditory processing that compares what each ear hears; it helps us to work out direction and distance of auditory signals. Binaural sound recording, using two or more microphones, has been around since the late 19th century, when it was first used in the telephone broadcast of live opera performances. Later there were experiments with binaural radio broadcasts on dual frequencies, which didn't catch on because they required listeners to have two radios, each connected to a separate earphone. Since the development of stereo, however, binaural recording techniques have advanced significantly. Binaural reproduction requires the listener to use headphones, because it utilises the natural crossfeed of the head: the recording technique mimics binaural hearing, so can only be properly appreciated when played back directly to the individual ears.

Sonic shaping of the ear, i.e. the reaction and adaptation of the muscles of the ear to sound -- presumably the aspect of binaural hearing to which Hrant refers -- is important when we're picking up sound signals from our environment, but when we're listening to binaural recordings, through headphones, we're being directly fed the information that we normally have to use the muscles in the ear to help us direct. Indeed, the headphones probably actively inhibit sonic shaping, which is applied in the recording process rather than in the listening process.

The difficulty of on-the-fly, direct to brain signals from microphones, if one were to use a binaural recording model, would be getting the mix right, i.e. accurately matching the microphone-to-brain signal to the direction and distance of the external sound signal.

david h's picture

> But if we do eventually reach the stage of being able to cure blindness with artificial eyes

Bringing Sight to the Blind:
http://www.doheny.org/PDF/Bringing_Sight_to_the_Blind.pdf

Artificial Eyes:
http://www.businessweek.com/2000/00_12/b3673025.htm

hrant's picture

> No more of those pesky scripts or messy differences in type styles

No, you'd just have a separate set of wires for that. :-)

> cognographer

:->

> it utilises the natural crossfeed of the head

And mimics the multi-dimensional filtering of the outer ear's cartilage.

My point was that Kevin's "microphone near the ear" is over-simple.

Some people look at the outer ear and think: "Whatever, there's no Formal Proof that it's shaped funny for a reason, so I'll pretend it doesn't matter." Other people think: "Hmmm, I really need to worry about why the outer ear is shaped so funny!" The best scientist is also an exceptional anecdotalist.

> The difficulty of on-the-fly, direct to brain signals from microphones ...

No, you just place the microphone inside the ear, as you mentioned people have been doing already. Now, if the outer ear is missing, then you need a complex algorithm* to filter the incoming sounds as the natural ear would. In fact a big issue is that processing binaural sound is computationally intensive; at least when I was reading about it (some years ago, in a book about cutting-edge multimedia design) it wasn't possible to make the hardware both affordable and comfortable.

* Although not nearly as complex as its counterpart in perception...
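The filtering in question is, at heart, convolution of the incoming signal with a pair of head-related impulse responses, one per ear. A minimal sketch with toy impulse responses (real HRIRs are measured per listener and per sound direction; the values below are placeholders):

```python
def convolve(signal, impulse):
    """Direct-form convolution. O(n*m); real-time binaural systems use
    FFT-based overlap-add, which is where the computational cost bites."""
    out = [0.0] * (len(signal) + len(impulse) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse):
            out[i + j] += s * h
    return out

mono = [1.0, 0.5, 0.25]        # incoming mono signal (toy samples)
hrir_left = [0.9, 0.1]         # toy impulse response: near ear, crisp
hrir_right = [0.4, 0.3]        # toy impulse response: far ear, attenuated

left_channel = convolve(mono, hrir_left)
right_channel = convolve(mono, hrir_right)
```

Feeding `left_channel` and `right_channel` to the two ears (or the two sides of an implant) would recreate the direction cue that the missing outer ear would otherwise have imposed.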

hhp

gthompson's picture

There is an article in the April issue of Scientific American by neuroscientist Frank Werblin about recent discoveries concerning how the retina operates. It's a much more complex process than we have thought. The retina actually begins processing information rather than merely passing it on, breaking visual information down into several different kinds of "tracks" that are sent to the brain separately by different sets of neurons. The process sends a series of abstracted images to the brain, which get assembled in different ways depending upon the part of the brain involved.

George
I felt bad because I had no shoes, until I met a man who had no Bodoni

hrant's picture

> The retina actually begins processing information
> rather than merely passing it on

Because it's basically an integral part of the brain, physically extended. Probably due to performance issues. So the question of eye versus brain fatigue is also more complex than we have thought...

hhp
