MRXGray [OP] 1 point 3y ago
Yes, I tried haptic feedback.
This was for the open-source, 3D-printed DIY eyeglasses that I built last year.
Here's a longer video that documents my R&D progress: https://www.youtube.com/watch?v=PB9R9DMvgug
Anyway, what I did was place tiny ERM vibrating motors behind miniature ultrasonic rangers.
These rangers are sonar sensors with a min-to-max ranging capability of 2 cm to 5 m ...
I then connected the ERM vibrating motors to a tiny haptic driver MCU.
This microcontroller unit lets me control the motors programmatically ...
So nearer objects trigger faster vibrations.
And farther objects, slower vibrations ...
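If it helps, here's the gist of that mapping as a minimal Python sketch. The sensor read and the motor pulse are hypothetical stubs (the real build ran this on the MCU); the actual idea is just the near-fast / far-slow mapping over the sensor's 2 cm to 5 m range:

```python
import time

MIN_DIST_CM = 2.0     # sensor's minimum range
MAX_DIST_CM = 500.0   # sensor's maximum range (5 m)
FAST_PERIOD_S = 0.05  # shortest pause between pulses: nearest objects
SLOW_PERIOD_S = 1.0   # longest pause between pulses: farthest objects

def distance_to_period(dist_cm: float) -> float:
    """Near objects -> short period (fast pulses); far -> long period."""
    d = min(max(dist_cm, MIN_DIST_CM), MAX_DIST_CM)  # clamp to sensor range
    frac = (d - MIN_DIST_CM) / (MAX_DIST_CM - MIN_DIST_CM)
    return FAST_PERIOD_S + frac * (SLOW_PERIOD_S - FAST_PERIOD_S)

def read_distance_cm() -> float:
    """Hypothetical stub; swap in the actual ultrasonic ranger read."""
    return 120.0  # pretend an object sits 1.2 m ahead

def pulse_motor() -> None:
    """Hypothetical stub; swap in the haptic driver call for one ERM motor."""
    print("bzzt")

while True:
    pulse_motor()
    time.sleep(distance_to_period(read_distance_cm()))
```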
I then tested it along with my blind peers.
Quite nauseating after half an hour of using it, frankly.
This was our unanimous feedback ...
But I believe that when it's built like a refreshable braille pad and placed flat against the back, this could very well simulate 3D perception.
Without nauseating the hell out of us. Ha. Ha. :D
So instead of sonar sensors, the OAK-D looks like a better alternative to test out.
I'm saying this because contour detection and depth sensing are built right into the unit.
Now it'll be straightforward to programmatically translate those contour and depth details into haptic feedback that we blind users can understand when the vibrations start happening on our backs ...
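To make that concrete, here's a rough sketch using the depthai Python API (v2-style stereo pipeline, as I understand it). The 4x4 motor grid on the back, the depth band, and the intensity mapping are all my assumptions for the braille-pad layout, not anything the OAK-D provides out of the box:

```python
import depthai as dai
import numpy as np

GRID_ROWS, GRID_COLS = 4, 4   # assumed 4x4 ERM motor array on the back
NEAR_MM, FAR_MM = 300, 5000   # assumed useful depth band: 0.3 m to 5 m

# Standard depthai v2 pattern: two mono cameras feeding a StereoDepth node.
pipeline = dai.Pipeline()
left = pipeline.create(dai.node.MonoCamera)
right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
xout = pipeline.create(dai.node.XLinkOut)
left.setBoardSocket(dai.CameraBoardSocket.LEFT)
right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
left.out.link(stereo.left)
right.out.link(stereo.right)
stereo.depth.link(xout.input)
xout.setStreamName("depth")

def frame_to_intensities(depth_mm: np.ndarray) -> np.ndarray:
    """Downsample the depth map to the motor grid: the nearest obstacle in
    each cell sets that motor's intensity (near -> strong, far -> weak)."""
    h, w = depth_mm.shape
    cells = depth_mm[: h - h % GRID_ROWS, : w - w % GRID_COLS]
    cells = cells.reshape(GRID_ROWS, h // GRID_ROWS, GRID_COLS, w // GRID_COLS)
    valid = np.where(cells == 0, FAR_MM, cells)  # 0 means no stereo match
    nearest = valid.min(axis=(1, 3)).astype(np.float32)
    return 1.0 - np.clip((nearest - NEAR_MM) / (FAR_MM - NEAR_MM), 0.0, 1.0)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("depth", maxSize=4, blocking=False)
    while True:
        depth = q.get().getFrame()  # uint16 depth frame, in millimeters
        intensities = frame_to_intensities(depth)
        # Hypothetical: push each cell's intensity to its motor driver here.
        print(intensities.round(2))
```

Each cell takes the nearest obstacle in its patch of the depth map, so a doorway edge or an approaching person should show up as a strong buzz in just the cells covering it.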
Thoughts?