
Blind and Visually Impaired Community

Full History - 2020-07-21 - ID#hvclcm
3
" Announcing the OpenCV Spatial AI Competition Sponsored By Intel Phase 1 Winners! " >> And My Project Won! >> " Artificial 3D Perception for the Blind by Marx Melencio: A chest-strapped mobile 3D perception device developed and created by a completely blind assistive tech user. " :D (opencv.org)
submitted by MRXGray
PerryThePlatypusBear 2 points 3y ago
Congratulations! It's a very impressive project.
MRXGray [OP] 1 points 3y ago
Thanks a lot!! :D
PerryThePlatypusBear 2 points 3y ago
Have you given any thought to other ways, besides audio, of conveying the information to the user? I've been thinking of doing a similar project, and I thought something like haptic feedback could be interesting for telling the user what direction the obstacle is in and how far away it is. What do you think?
MRXGray [OP] 1 points 3y ago
Yes, I tried haptics.
This was for the open-source, 3D-printed DIY eyeglasses that I built last year.
Here's a longer video that documents my R&D progress: https://www.youtube.com/watch?v=PB9R9DMvgug

Anyway, what I did was place tiny ERM vibrating motors behind miniature ultrasonic rangers.
These rangers are sonar sensors with a ranging capability of 2 cm to 5 m ...

I then connected the ERM vibrating motors to a tiny haptic driver MCU.
This microcontroller unit handles the programmatic control ...

So fast vibrations for nearer objects, and slow vibrations for farther objects ...
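
If it helps, here's a rough sketch of that distance-to-vibration mapping in Python. The 2 cm to 5 m limits match my rangers, but the drive() function is just a placeholder for whatever your haptic driver MCU actually expects, not my firmware:

    # Rough sketch of the distance-to-vibration mapping, not actual firmware.
    # MIN_CM / MAX_CM match the sonar's 2 cm to 5 m range; drive() is a
    # placeholder for whatever the haptic driver MCU expects.

    MIN_CM, MAX_CM = 2.0, 500.0

    def vibration_level(distance_cm):
        """Closer object -> value nearer 1.0 (stronger / faster vibration)."""
        d = max(MIN_CM, min(MAX_CM, distance_cm))
        return 1.0 - (d - MIN_CM) / (MAX_CM - MIN_CM)

    def drive(motor_id, level):
        # Placeholder: send the level to the haptic driver for this motor.
        print(f"motor {motor_id}: intensity {level:.2f}")

    if __name__ == "__main__":
        for reading_cm in (10, 50, 150, 400):   # example sonar readings
            drive(0, vibration_level(reading_cm))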

I then tested it along with my blind peers.
Quite nauseating after half an hour of using it, frankly.
This was our unanimous feedback ...

But I believe that when it's built like a refreshable braille pad and placed flat against the back, this could very well simulate 3D perception.
Without nauseating the hell out of us. Ha. Ha. :D
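
To give a concrete idea of that back-pad concept, here's a quick sketch that downsamples a depth frame into a small grid, one cell per vibration motor. The 8x8 grid size and the depth range are just assumptions for illustration:

    # Sketch of the "refreshable braille pad" idea: reduce a depth frame
    # to a small grid, one cell per vibration motor on the back.
    import numpy as np

    ROWS, COLS = 8, 8                 # assumed actuator grid on the back
    NEAR_MM, FAR_MM = 200.0, 5000.0   # assumed usable depth range

    def depth_to_grid(depth_mm):
        """Reduce a (H, W) depth frame to a ROWS x COLS array of
        vibration intensities in 0.0-1.0 (near = strong)."""
        h, w = depth_mm.shape
        cells = depth_mm[:h - h % ROWS, :w - w % COLS]
        cells = cells.reshape(ROWS, h // ROWS, COLS, w // COLS)
        nearest = cells.min(axis=(1, 3))          # nearest point per cell
        nearest = np.clip(nearest, NEAR_MM, FAR_MM)
        return 1.0 - (nearest - NEAR_MM) / (FAR_MM - NEAR_MM)

    if __name__ == "__main__":
        fake_depth = np.random.uniform(NEAR_MM, FAR_MM, size=(400, 640))
        print(np.round(depth_to_grid(fake_depth), 2))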

So instead of sonar sensors, the OAK-D looks like a better alternative to test out.
I'm saying this because contour detection and depth sensing are built right into the unit.
Now it'll be straightforward to programmatically translate those contour and depth details into haptic feedback that we blind users can understand when vibrations start happening on our backs ...
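
For example, something roughly like this could turn a depth frame into per-motor cues. I'm leaving out the DepthAI pipeline setup and assuming you already have the OAK-D's depth frame; the 1.5 m obstacle threshold and the 8x8 motor grid are made up for the sketch:

    # Rough idea: obstacle contours in a depth frame -> which back motors to buzz.
    # Assumes `depth_mm` is the (H, W) depth frame the OAK-D already provides.
    import cv2
    import numpy as np

    NEAR_MM = 1500          # assumed: closer than 1.5 m counts as an obstacle
    ROWS, COLS = 8, 8       # assumed actuator grid, same as the back pad idea

    def obstacles_to_motors(depth_mm):
        """Return a set of (row, col) motor indices to activate."""
        mask = ((depth_mm > 0) & (depth_mm < NEAR_MM)).astype(np.uint8) * 255
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        h, w = depth_mm.shape
        motors = set()
        for c in contours:
            if cv2.contourArea(c) < 200:        # ignore specks of noise
                continue
            x, y, bw, bh = cv2.boundingRect(c)
            cx, cy = x + bw // 2, y + bh // 2   # center of the obstacle
            motors.add((cy * ROWS // h, cx * COLS // w))
        return motors

    if __name__ == "__main__":
        fake = np.full((400, 640), 4000, dtype=np.uint16)
        fake[150:250, 300:400] = 800            # a close obstacle in the middle
        print(obstacles_to_motors(fake))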

Thoughts?
PerryThePlatypusBear 1 points 3y ago
That's interesting! I wouldn't have guessed it would be nauseating. Were the ERM motors placed on your back or somewhere in the glasses? Was it nauseating because the vibration was so strong that you could feel it in your head?