We can gather a lot of information about the wearer's surroundings. The question is how much of that we should pass on, and how.
The information we collect is presented on high-resolution screens positioned close to the eyes. We have made a set of glasses that use transparent OLED displays, so the wearer can see the enhanced images while still using their remaining sight in much the same way as they are used to.
Transparent displays have the added advantage of making a pair of glasses look relatively normal in public.
These photos show views of a prototype pair of glasses. A depth camera detected a nearby person, and a simplified, easy-to-see image of them was displayed on the inside of the lenses.
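The idea of turning depth data into a simplified image can be sketched in a few lines. This is a minimal illustration, not the actual processing used in the prototype: it assumes a depth map in metres and the `near`/`far` band values are made up for the example. Anything within that band is drawn as a bright silhouette, with closer objects brighter.

```python
import numpy as np

def silhouette_from_depth(depth_m, near=0.5, far=3.0):
    """Convert a depth map (in metres) into a high-contrast silhouette.

    Pixels inside the near/far band are drawn bright (closer = brighter);
    everything else stays dark. The band limits here are illustrative.
    """
    depth = np.asarray(depth_m, dtype=float)
    in_range = (depth >= near) & (depth <= far)
    brightness = np.zeros_like(depth)
    # Linearly map distance to brightness: near -> 255, far -> 0.
    brightness[in_range] = 255 * (far - depth[in_range]) / (far - near)
    return brightness.astype(np.uint8)

# Toy 1x4 "depth map": a person at 1 m against a wall at 4 m,
# plus one too-close reading at 0.2 m.
frame = [[1.0, 1.0, 4.0, 0.2]]
print(silhouette_from_depth(frame))  # [[204 204   0   0]]
```

A real pipeline would run this per frame and draw the result on the displays; the point is only that a single threshold-and-scale step already yields an image far easier to see than the raw camera view.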
We can also feed people information through headphones, using text-to-speech software. This could give directions, or read out bus numbers or signs.
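Before anything is spoken, detected items need to be turned into short, natural phrases. The sketch below shows one way this might look; the categories and phrasing templates are assumptions for illustration, not the system's actual vocabulary.

```python
def announcement(kind, text):
    """Format a detected item as a short phrase for text-to-speech.

    The 'kind' categories and templates are illustrative assumptions,
    not the actual system's output.
    """
    templates = {
        "bus": "Bus number {} approaching.",
        "sign": "Sign reads: {}.",
        "direction": "{}",
    }
    return templates.get(kind, "{}").format(text)

# These strings would then be handed to a text-to-speech engine and
# played through the headphones.
print(announcement("bus", "42"))        # Bus number 42 approaching.
print(announcement("sign", "Way Out"))  # Sign reads: Way Out.
```

Keeping the spoken phrases this terse matters: audio competes with the wearer's hearing of the real environment, so each announcement should carry one fact and then stop.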