2021-02-19 - ID#lnm796
App to Scan Digital Timers (self.Blind)
submitted by I_LOVE_HAMSTER - 9 points
I made an iOS app to read digital timers. It works on seven-segment displays like the ones on my microwave and washing machine. It's not on the App Store yet, but here is a link to the website:

https://vdwees.github.io/7seg/

Would this app be useful to anyone besides me? If it gets enough interest I’ll pay for the Apple developer account and publish it on the App Store.

Also, I am looking for feedback on the idea if you have any.

EDIT:

Check it out on TestFlight: https://testflight.apple.com/join/DdPnqnlA
It's very new, so please be gentle. Any feedback is welcome!
Rethunker 3 points 2y ago
Does it work with VoiceOver?


Can a user point a camera and detect whether there is a 7-segment display in view?
I_LOVE_HAMSTER [OP] 3 points 2y ago
Yes, it works with VoiceOver. The app is structured so that VoiceOver selects the digits automatically when the app loads and reads them out loud as they change. There is also slight haptic feedback when a new display comes into view. It's pretty easy to aim.
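
The wiring is roughly like this (a simplified sketch, not the actual app code):

```swift
import UIKit

final class TimerReadoutView: UIView {
    private let haptic = UIImpactFeedbackGenerator(style: .light)
    private var lastReading: String?

    // Called by the recognition pipeline whenever a frame is decoded.
    func update(reading: String) {
        guard reading != lastReading else { return }
        let displayJustAppeared = (lastReading == nil)
        lastReading = reading

        // Slight tap when a display first comes into view, to help aiming.
        if displayJustAppeared { haptic.impactOccurred() }

        // Have VoiceOver speak the digits as they change.
        UIAccessibility.post(notification: .announcement, argument: reading)
    }
}
```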
Rethunker 2 points 2y ago
Without a developer's license, how did you run the app on your iPhone?

It looks to me like you have something useful here. The only way to be certain is to make it available to people. I'll have similar functionality in an app I'm working on, though my app has a different focus: the countdown timer will be an optional feature.


If you want others to test your app, I would suggest going through TestFlight first to make your app available as a beta. Even then, the TestFlight team at Apple is likely to ask you to create a demo video. That was my experience.

Having an accessible video is useful both for your TestFlight submission and for your testers. Getting into TestFlight is less demanding than submitting directly to the App Store, so it's a good way to get used to the process of submission. TestFlight is set up for you to communicate with testers, work through iterations of your app, etc.


Here are just a few of the steps involved in TestFlight submission. You may be familiar with some or all of this stuff, but I'll jot it down for whoever may pop by this post.

* Get your developer license. If you've spent this much time on an app, I'd suggest just paying for it at this point.
* Log into App Store Connect.
* Prepare your App Store Connect account for app upload.
* Archive your app.
* Run validation on the app.
* Upload your app via App Store Connect. There are some videos explaining this process.
* Provide a privacy policy. This will be a link to a page on your website.
* Provide a marketing web page.
* Describe your app to the TestFlight team.
* Provide names and email addresses for internal testers. If you're a one-person development team, then you likely won't have internal testers.
* Provide names and email addresses for one or more groups of external testers.
* Assign uploaded app builds to testers.


That aside, a few comments on your work so far.

In Create a Reminder, don't use italics. Italics are harder to read. Also, consider using title text other than "Create a Reminder" for an error condition. That is, your error message should be completely distinct from the Create a Reminder screen used to actually select a reminder time.


For the time display, include the colon character ":" separating hours and minutes. If your image processing reads the digits but not the colon, then to set the timer I'm assuming you have logic to parse the digits and determine hours and minutes.


Try to read the hours/minutes display from the LCD. I noticed that the Create a Reminder screen offers 9 minutes 59 seconds and 9 hours 59 minutes as two options for a time read as 9:59. Leaving the choice in the hands of the user is a good workaround, but the user may wonder whether they set the time correctly. Maybe they really did set a time of 10 hours when they meant to set 10 minutes. I've done that, and I'm sighted.
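
To illustrate the ambiguity: without the colon, a reading like "959" has at least two plausible interpretations. A sketch of the kind of parsing I mean (hypothetical helper, not your actual code):

```swift
import Foundation

/// Possible interpretations of a colon-less reading such as "959".
func candidateDurations(for digits: String) -> [(label: String, seconds: Int)] {
    guard (3...4).contains(digits.count), let value = Int(digits) else { return [] }

    let high = value / 100   // digits left of the implied separator
    let low = value % 100    // digits right of it
    guard low < 60 else { return [] }

    return [
        ("\(high) min \(low) sec", high * 60 + low),         // 9:59 as minutes:seconds
        ("\(high) hr \(low) min", (high * 60 + low) * 60),   // 9:59 as hours:minutes
    ]
}

// candidateDurations(for: "959")
// -> [("9 min 59 sec", 599), ("9 hr 59 min", 35940)]
```

If the app could also read the colon (or detect a blinking separator), it could often rule out one interpretation automatically.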


To record a video of the app, a good first attempt can simply be a live screen recording. In Settings, under Control Center, you can add Screen Recording to the list of Included Controls. Then you can record the screen with audio, including VoiceOver.


It's good that the web page you provided mentions that the app may not work in all cases. Eventually you'll want to codify this in a legal agreement, though for a while you can use the default beta agreement that Apple provides for TestFlight if you don't provide your own.

As a machine vision engineer I'd quibble with the phrase "advanced computer vision techniques." Digital readouts could be read by vision systems made decades ago. But more importantly, from a marketing perspective "advanced" can sound hollow. Better to drop any implication that your app is "advanced" or "new" or whatever, make the marketing page shorter, and focus on conveying just what the audience needs to know.


Nice start!
I_LOVE_HAMSTER [OP] 2 points 2y ago
Just realised that I didn't answer your first question: I found out when I started this app that you can sign and install an app directly on personal devices with just a personal Apple ID.

Your advice was pretty helpful. I followed up on most of your suggestions, and SevenSeg is on TestFlight now: https://testflight.apple.com/join/DdPnqnlA

With regard to the separator: I experimented with adding the ':' in, and in many cases it works well. My rationale for taking it out again is that SevenSeg already works on things like the wattage reading on the microwave and the digital readout of the thermostat. I wanted to focus first on getting the OCR components working; maybe later I can add different modes (e.g. a timer mode) if I can figure out how to do it without complicating the UI.
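
Roughly what I have in mind for modes, as a sketch (none of this is the real app code):

```swift
/// Sketch: a mode changes only how decoded digits are announced,
/// so the capture UI itself stays untouched.
enum ReadingMode {
    case raw      // e.g. "9 5 9" for a wattage or thermostat readout
    case timer    // e.g. "9 minutes 59 seconds" for a countdown

    func describe(_ digits: String) -> String {
        switch self {
        case .raw:
            return digits.map { String($0) }.joined(separator: " ")
        case .timer:
            guard (3...4).contains(digits.count), let value = Int(digits),
                  value % 100 < 60 else { return digits }
            return "\(value / 100) minutes \(value % 100) seconds"
        }
    }
}
```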

That said, the OCR layer can be improved a lot. I'd like to make the app much more tolerant of glare. Many seven-segment displays have glossy coatings; as a sighted person, I can see the glare and adjust the angle I hold the phone at relative to the display, but of course many people who might want to use the app cannot compensate like that. From a computer vision perspective, do you have any suggestions? I am thinking I could probably combine the output of the now-common dual cameras to avoid some of it.
Rethunker 1 points 2y ago
Glad my suggestions were of some use. And congrats on getting your app into TestFlight! I hope to check it out within the next week.

Glare is a big problem, especially if we only use the existing hardware. A traditional way to reduce glare is to use a polarizing filter, which helps somewhat, but having someone add a filter to their phone is probably too much to ask.

The solutions to glare that I plan to pursue are rather involved, given the needs of my app. For now I have just two suggestions that may be relevant to your app:

1. Identify whether there is glare, and if so where it may be.
2. If there is glare, prompt the user to try a different phone angle, and then read the LCD again.
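
For step 1, a crude but serviceable first pass (illustrative thresholds, not code from either of our apps) is to measure what fraction of the frame is near-saturated:

```swift
import CoreVideo

/// Rough glare check: fraction of near-white pixels in a BGRA frame.
/// Assumes kCVPixelFormatType_32BGRA input; the threshold is illustrative.
func glareFraction(in pixelBuffer: CVPixelBuffer, threshold: UInt8 = 245) -> Double {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return 0 }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let pixels = base.assumingMemoryBound(to: UInt8.self)

    var saturated = 0
    for y in 0..<height {
        let row = pixels + y * bytesPerRow
        for x in 0..<width {
            let p = row + x * 4                    // B, G, R, A
            if p[0] > threshold, p[1] > threshold, p[2] > threshold {
                saturated += 1
            }
        }
    }
    return Double(saturated) / Double(width * height)
}

// If glareFraction(in: frame) stays above a few percent across frames,
// announce a prompt to tilt the phone, then read the display again.
```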
Rethunker 1 points 2y ago
A thought: if you still want to keep such a phrase, I'd suggest "state of the art" instead of "advanced." If you're relying on the Vision and/or Core ML frameworks, or algorithms from OpenCV, Tesseract, etc., then "state of the art" would be a more accurate description. "Advanced" implies something novel, such as the first real implementation of an algorithm previously described only in academic papers, or a new algorithm that outperforms the existing state of the art. That's exceedingly hard to pull off unless you have PhD-level training in image processing and/or years of image processing experience.


Pedantic, I'm sure, but since I've worked with commercial vision libraries, novel algorithms, etc., I have a somewhat persnickety view of such things.
I_LOVE_HAMSTER [OP] 2 points 2y ago
You’re right, the algorithm is probably not ‘advanced’ in the academic sense, and the phrase is vague from a marketing perspective. I’ll change it. I do leverage some of the Vision APIs where possible to ensure real-time performance, but mostly the image processing pipeline is written in Swift.
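
For example, the hand-off looks something like this (a simplified sketch, not the real pipeline): Vision finds a display-like rectangle, and the custom Swift code decodes the segments inside it.

```swift
import CoreVideo
import Vision

/// Simplified sketch: use Vision's rectangle detector to locate a
/// display-like region, then hand the crop to the custom decoder.
func findDisplayRegion(in pixelBuffer: CVPixelBuffer,
                       completion: @escaping (CGRect?) -> Void) {
    let request = VNDetectRectanglesRequest { request, _ in
        // Normalized bounding box of the best candidate, or nil.
        let box = (request.results?.first as? VNRectangleObservation)?.boundingBox
        completion(box)
    }
    request.minimumAspectRatio = 0.2   // digital readouts are wide and short
    request.maximumObservations = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```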
siriuslylupin6 3 points 2y ago
Hmm... interesting..
Laser_Lens_4 1 points 2y ago
Shut up and take my money. Is there a way I can load this onto my iPhone myself to try it out?
I_LOVE_HAMSTER [OP] 2 points 2y ago
I'm working on getting the app into TestFlight. I'll update you when it's accepted... it's still really new though, so I certainly won't be charging money haha.
I_LOVE_HAMSTER [OP] 1 points 2y ago
SevenSeg just got accepted to TestFlight: https://testflight.apple.com/join/DdPnqnlA
Any feedback is appreciated and will help me get the app fully working.
ukifrit 1 points 2y ago
This sounds really useful.
drF1234 1 points 2y ago
Very cool!!
neroute2 1 points 2y ago
Nice. I know this would be useful for my girlfriend.