Do you have any thoughts on computer interfaces for the blind? What would make them more usable? (self.Blind)
submitted 5 years ago by VerbosePineMarten
I've spent a couple of years working on novel UI designs as a hobby, especially around wearable and accessible computing. I got to thinking about interfaces tailored specifically to blind users, and I really wanted to hear your end of things. I've used screen readers before to see what it's like, and the problem is that they're inherently low-bandwidth interfaces (i.e., not much information per unit of interaction). It also doesn't help that they're abstracting an inherently visual interface, which adds even more overhead.
My current designs are based on stereo soundscapes that represent the computer/program state directly, with spoken word reserved mostly for things that are already inherently textual (error messages, webpage content, text editing, etc.), plus lots of haptic feedback. I'm used to a command-line interface myself, so I've been looking at ways to "annotate" text output with other sounds to increase the information density.
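To make concrete what I mean by "annotating" output with sound, here's a toy Python sketch of the kind of mapping I've been playing with. Everything in it is illustrative, not any real toolkit: severity picks the pitch of a short cue, and the position of the event in the output picks its stereo pan.

```python
import math

# Hypothetical mapping: event severity -> cue pitch (Hz).
SEVERITY_HZ = {"info": 440.0, "warning": 660.0, "error": 880.0}

def pan_gains(pan):
    """Constant-power pan law: pan=-1.0 hard left, 0.0 center, +1.0 hard right.

    Returns (left_gain, right_gain); left^2 + right^2 == 1, so perceived
    loudness stays roughly constant as the cue moves across the stereo field.
    """
    angle = (pan + 1.0) * math.pi / 4.0   # maps [-1, 1] onto [0, pi/2]
    return math.cos(angle), math.sin(angle)

def render_cue(severity, pan, duration=0.08, rate=8000):
    """Return a list of (left, right) float samples for a short sine-tone cue."""
    hz = SEVERITY_HZ[severity]
    left, right = pan_gains(pan)
    samples = []
    for n in range(int(duration * rate)):
        s = math.sin(2.0 * math.pi * hz * n / rate)
        samples.append((s * left, s * right))
    return samples

# An "error" on the right-hand side of the output: high tone, panned right.
cue = render_cue("error", pan=1.0)
```

The idea is that cues like these play alongside (or underneath) the spoken text, so position and urgency come through without costing any extra speech time.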
A lot of what I've tried is based purely on sound, or on cheap haptic feedback devices, to minimize the need for specialized equipment.
I guess what it really boils down to is this: we have the "desktop metaphor" for abstracting files and navigation on computers. We also have the WIMP (windows, icons, menus, pointer) metaphor, and more recently the "everything is a tile" paradigm on mobile devices. What metaphor would work best for blind users, especially in audio-only interfaces? Are there any interface elements that work well for you with screen readers? Which ones don't? Can you think of anything that would make computers faster and easier for you to use, especially if we're talking about a ground-up design instead of a mapping over an existing visual interface?