I believe this could be something useful. I'm completely blind myself.
​
Also, last year I built a 3D-printed pair of eyeglasses with a sonar sensor and an ERM vibration motor.
Both are integrated into the front panel of the eyeglasses frame, with the sonar sensor obviously pointing outwards.
I placed the vibration motor back-to-back against it, so its flat surface rests on the user's nose bridge.
Plus, I integrated bone-conduction speakers near the ends of the frame's arms, so the device can read its results out loud.
That's because I also integrated a tiny camera into the frame's front panel,
and I implemented machine learning and deep learning models for object detection, classification, OCR in the wild, and OCR for physical documents ...
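
In case anyone's curious how the sonar-to-haptics side can work, here's a minimal Python sketch of the idea: read the distance, then drive the motor harder the closer the obstacle is. It assumes a Raspberry Pi with the gpiozero library, an HC-SR04-style ultrasonic sensor, and an ERM motor behind a PWM-capable driver; the pin numbers and the 3 m range are placeholders, not my exact build.

```python
from time import sleep
from gpiozero import DistanceSensor, PWMOutputDevice

# Hypothetical wiring: ultrasonic trigger on GPIO23, echo on GPIO24,
# motor driver's PWM input on GPIO18.
sensor = DistanceSensor(trigger=23, echo=24, max_distance=3.0)
motor = PWMOutputDevice(18)

while True:
    # sensor.value is the distance as a fraction of max_distance
    # (0.0 = right at the sensor, 1.0 = 3 m or farther),
    # so (1 - value) gives "proximity": closer obstacle -> stronger vibration.
    proximity = 1.0 - sensor.value
    motor.value = proximity if proximity > 0.05 else 0.0  # small dead zone so it goes quiet when the path is clear
    sleep(0.05)  # ~20 updates per second feels continuous on the nose bridge
```

A linear mapping like this is the simplest option; a squared or stepped curve can make near obstacles stand out more against far ones.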
​
That said, I've recently moved on to upgrading my project into a wearable 3D perception device for the blind.
It works through binocular depth sensing and distance estimation.
The goal is a wearable device that provides real-time, simultaneous multiple-object detection, classification, clock-face position localization (e.g. "cup at 2 o'clock"), depth sensing, distance estimation, OCR in the wild, and OCR for physical documents ...
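
The binocular part boils down to stereo disparity: the same object appears shifted between the left and right camera images, and the shift grows the closer the object is, so depth = focal length x baseline / disparity. Here's a rough OpenCV sketch of that step, not my actual pipeline; the focal length, baseline, matcher settings, and file names are placeholder values, and it assumes the two images are already rectified.

```python
import cv2
import numpy as np

# Placeholder calibration for a small stereo rig (real values come from calibrating the cameras).
FOCAL_PX = 700.0    # focal length in pixels
BASELINE_M = 0.06   # spacing between the two camera centres, in metres

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching: finds, for each pixel, how far it shifted between the two views.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=96, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM returns fixed-point x16

# depth (m) = focal_px * baseline_m / disparity_px, wherever the disparity is valid
depth_m = np.zeros_like(disparity)
valid = disparity > 0
depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]

# e.g. report the distance at the centre of the frame (or of a detected object's bounding box)
cy, cx = depth_m.shape[0] // 2, depth_m.shape[1] // 2
print(f"Estimated distance straight ahead: {depth_m[cy, cx]:.2f} m")
```

Combining a depth map like this with the detector's bounding boxes is what lets a device say both what something is and roughly how far away it sits.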
​
Here's a video demo I recorded recently for my project. The first part shows the current progress of the upgrade, and the second part shows my 3D-printed AI-powered eyeglasses-for-the-blind prototype:
$1​
Best of luck with your project! :)