I wrote a post here a couple of months ago and received valuable feedback from the community. I am grateful to everyone who spent their time completing the survey.
Now I am attaching a short demonstration $1 of the app (this is not the final version of the product yet).
Let me describe what's happening there. The user holds the phone in one hand (a phone holder can be used as well) and aims its camera at a tactile graphic (TG). A map of the UK is printed on the image together with markers located in the corners. These markers are needed to calibrate the image position at every frame and to compensate for camera movement. At least three markers have to be visible for the app to operate properly (this might be the main challenge for the user, but anyone with some experience using a phone camera should be fine).

Meanwhile, the user explores the TG with the other hand. The app detects the position of the index finger and triggers the corresponding audio feedback, telling the user what they are pointing at. It is important to note that the app detects each finger independently, so the user can explore the image with the whole hand (or both hands).

The user then moves on to a new TG showing a human skeleton. First, a QR code printed on the back of the image is scanned and the corresponding content is downloaded from the server; an internet connection is required only for this step. After that, the image is explored in the same manner.
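For those curious about how the marker calibration can work under the hood: once the corner markers are located in the camera frame, a homography can map any camera pixel (such as the detected fingertip) into the TG's own coordinate system, where a fixed lookup table of regions lives. Below is a minimal pure-NumPy sketch of that idea. All coordinates, the `REGIONS` table, and the function names are made up for illustration; the actual app presumably uses a library such as OpenCV for marker detection and a neural network for fingertip detection.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography mapping src points onto dst points
    using the direct linear transform (DLT). src, dst: (N, 2) arrays
    of corresponding points, N >= 4 (here: the marker corners)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the right singular vector with the smallest
    # singular value of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def to_tg_coords(H, point):
    """Map a camera-frame pixel into the TG's reference frame."""
    p = H @ np.array([point[0], point[1], 1.0])
    return p[:2] / p[2]

# Hypothetical region table for the UK map, in TG coordinates
# (x0, y0, x1, y1). The real app would carry much finer regions.
REGIONS = {
    "Scotland": (0, 0, 100, 120),
    "England": (0, 120, 100, 250),
}

def region_at(pt):
    """Return the region label under the fingertip, if any.
    The app would speak this label as audio feedback."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= pt[0] <= x1 and y0 <= pt[1] <= y1:
            return name
    return None
```

A typical frame would then be: detect the marker corners in the camera image, call `estimate_homography` against the known printed layout, push the detected fingertip pixel through `to_tg_coords`, and announce `region_at` of the result. With only three visible markers, the app would have to fall back to an affine fit instead of a full homography, which may be one reason the three-marker minimum exists.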
With the help of deep learning algorithms, the app works in real time, even under poor lighting conditions (as in the video). The app might also be especially useful in the current situation, when most students are studying from home, since no guidance from an instructor is required.
If you would like this app to be polished and released, or if you have ideas for additional features, please fill out this survey
$1 $1 Do not hesitate to PM me if you have any questions.
Thanks!