Hey r/blind, I saw an idea and was wondering if it would actually be useful to someone visually impaired?
The idea is a smartphone app that uses Google's Cloud Vision API to analyse frames from a live camera feed, then describes key objects, reads out a menu, or uses facial recognition to describe expressions. I imagine the phone would need to be chest-mounted or something to keep the hands free, plus some earbuds so you could hear the audio.
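For anyone curious how simple the core loop could be, here's a rough Python sketch of just the "turn recognised labels into speech text" step. The field names (`description`, `score`) match the shape of Cloud Vision's `labelAnnotations` JSON response; everything else (the function name, thresholds) is made up, and the actual API call and text-to-speech are left out:

```python
def describe_labels(labels, min_score=0.7, max_items=3):
    """Turn Cloud Vision label annotations into one short sentence
    suitable for reading aloud via text-to-speech."""
    # Keep only confident labels, in the order the API returned them.
    picked = [l["description"] for l in labels if l["score"] >= min_score]
    if not picked:
        return "Nothing recognised."
    return "I can see: " + ", ".join(picked[:max_items]) + "."

# Example response fragment, shaped like Cloud Vision's labelAnnotations:
labels = [
    {"description": "menu", "score": 0.94},
    {"description": "table", "score": 0.81},
    {"description": "plant", "score": 0.42},  # dropped: below threshold
]
print(describe_labels(labels))  # → "I can see: menu, table."
```

The real app would feed camera frames to the API every second or two and speak the result, but the point is that the glue code on the phone side is pretty small.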
Just thought it was interesting and wanted to hear /r/blind's thoughts on it!