I've spent the last few years working on a concept for a human-computer interface that would operate primarily by non-visual means -- sound, mostly, and some tactile feedback. I'm sighted, but my visual processing ability is poor and my hearing is much better. A vision-based system also has severe limitations for sighted people in terms of truly mobile or wearable computing, since it monopolizes our most cognitively demanding sense and its mental circuitry.
Currently, the design revolves around abstracting computation as point-source objects placed in a tableau. The tableau reflects the underlying state of the machine and its processes (battery life and network connectivity, for instance), and the objects serve as the main interaction points. How an object moves relative to the user, its pitch, "texture," volume, a variety of cue sounds, and spoken words communicate information and state back to the user. The overall interface treats tasks, rather than files, as the main unit of computation, to minimize overload. A major principle of the design is that it leans on the spatial acuity of our hearing rather than treating sound as a one-dimensional channel.
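To make the idea concrete, here's a minimal sketch of what the object/tableau model might look like in code. Everything here is hypothetical -- the class names, the attributes, and the particular battery-to-pitch mapping are all illustrative assumptions, not a spec:

```python
from dataclasses import dataclass, field

@dataclass
class SoundObject:
    """A point-source object in the auditory tableau (illustrative fields)."""
    name: str
    azimuth: float    # degrees relative to the user; 0 = straight ahead
    distance: float   # arbitrary units from the listener
    pitch_hz: float   # base pitch of the object's sound
    volume: float     # 0.0 (silent) to 1.0 (full)

@dataclass
class Tableau:
    """Holds the objects and mirrors machine state onto them."""
    objects: dict = field(default_factory=dict)

    def reflect_battery(self, level: float) -> SoundObject:
        # Hypothetical mapping: battery sits behind the left shoulder;
        # a fuller battery sounds higher-pitched, a draining one louder
        # (so it demands attention as it becomes urgent).
        obj = SoundObject(
            name="battery",
            azimuth=-135.0,
            distance=2.0,
            pitch_hz=220.0 + 220.0 * level,
            volume=0.2 + 0.3 * (1.0 - level),
        )
        self.objects["battery"] = obj
        return obj
```

The point of the sketch is just the shape of the abstraction: state changes in the machine become continuous parameter changes on spatially placed sound sources, rather than discrete visual notifications.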
I'm really interested in what blind users think of that idea, as people who use computers via screen-readers and other clunky abstraction layers. Alternatively, I'm interested to know how you would imagine an interface perfectly suited to the blind would behave, if it weren't sitting on top of a sighted system.
Any ideas, thoughts, or perspectives would be awesome.