HUMBLECAT.ORG

Blind and Visually Impaired Community

Full History - 2020 - 08 - 30 - ID#ijcids
24
Idea for an app (self.Blind)
submitted by bscross32
I often see well-meaning sighted people who come in here and want to build this device that will revolutionize the way we interact with our environments. I read these posts, and the first thought that goes through my mind is, why? Why do we need high tech options for simple aspects of life? Well, I got to thinking about things, and I've come up with one area that might put a few programmers to work if they'd be willing to try something, though it's not exactly to do with navigation.
We do have accessible ways to get weather information, from digital assistants to the stock weather apps and even some third-party apps like Weather Gods. That's great, and I do make use of them. But I'd like to have more. I would like a way to accessibly read the weather radar, and I've got some preliminary thoughts as to how it might be done.
Let's start by putting my assumptions on the table, because if they turn out to be wrong, that will change things. Having never had sight good enough to read a weather radar map, my assumptions are these. First, that the map includes city and state names as well as geographical border lines. Second, that temperature and other information is shown where it differs significantly from surrounding regions. Next, that there is a way to mark storms of varying natures, from developing to full-blown storm cell. Finally, that there is a projection of the course meteorologists think the storm will take.
Assuming that information is correct, here are my ideas. On iOS, there is a feature in VoiceOver that lets you walk through the streets on a map: you find the street name and pause your finger over it until you hear a sound, then move your finger along the screen to get an idea of the path the street runs. Taking inspiration from that, what if we were able to swipe around on the screen, or explore it, to find hot spots? If we found the location of a storm, we could pause over it, which would trigger a mode in which we could follow the projected course.
We could have two sorts of looping drone or pad sounds, similar to what Seeing AI does with its explore-photo mode. One could be a lower sound, indicating you're not moving in the direction of the projected path, and the other would mean you're within the projected path. While dragging, it would also be good to hear the cities and states or provinces your finger crosses over.
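A minimal sketch of how an app might pick between those two loops, assuming the projected path arrives as a polygon in screen coordinates (all names here are hypothetical, not from Seeing AI or any real app):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the touch point inside the projected-path polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray cast from (x, y) would cross.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def drone_for_touch(x, y, path_polygon):
    """Pick which looping pad to play as the finger drags across the map."""
    return "in_path_drone" if point_in_polygon(x, y, path_polygon) else "off_path_drone"
```

On each touch-move event, the app would run this test and crossfade to whichever loop is returned.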
In cases where we're tracking a big storm cell, it might be necessary to have a sort of panning view that starts to move automatically as our finger nears the edge of the screen, though we would need some way of knowing that it's happening. Perhaps a series of clicks could indicate the speed at which we're panning over the map: the closer the finger is to the edge of the screen, the faster it moves.
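The auto-panning behavior could be sketched like this (hypothetical names and tuning values; a real app would wire this into its touch handler):

```python
def axis_pan(pos, size, edge_zone, max_speed):
    """Pan velocity along one axis: negative toward the low edge, positive toward the high."""
    if pos < edge_zone:                        # finger near the left/top edge
        return -max_speed * (1 - pos / edge_zone)
    if pos > size - edge_zone:                 # finger near the right/bottom edge
        return max_speed * (1 - (size - pos) / edge_zone)
    return 0.0

def pan_velocity(x, y, width, height, edge_zone=80.0, max_speed=400.0):
    """Map pan speed in points/second; ramps up linearly as the finger nears an edge."""
    return (axis_pan(x, width, edge_zone, max_speed),
            axis_pan(y, height, edge_zone, max_speed))

def click_interval(vx, vy, max_speed=400.0, slowest=0.5, fastest=0.05):
    """Seconds between feedback clicks: the faster the pan, the faster the clicking."""
    speed = max(abs(vx), abs(vy))
    if speed == 0:
        return None                            # not panning, so no clicks
    return slowest + (fastest - slowest) * (speed / max_speed)
```

The click interval shrinking with speed gives the nonvisual cue the post asks for: you hear both that panning is happening and roughly how fast.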
I started thinking about this when the hurricanes slammed into Louisiana. It would be good to have some real idea of the path a storm will take, not just a general sense of its direction. But I think this could work for things as mundane as a summer thunder banger as well as tornadoes and so forth.
Epileptic-dan 3 points 2y ago
This would be a great tool. I think it would be a bit difficult given the varying strength and speed of land-based storms. But I like this idea. Technology in 2020 has blown my mind a few times already.
bscross32 [OP] 1 points 2y ago
It would not be 100% reliable. It would be based on the data given to it by whichever service it collects from.
burg_to_314 2 points 2y ago
Wow! Really cool idea. I feel like instead of a path there could be different sounds for different intensities, and you could jump through different times and run your finger across to get an idea of where the storm is at each time.

But I'm not sure how you would get location information from that. Maybe you just pick a center and a radius to start?
bscross32 [OP] 1 points 2y ago
I don't know how that would work. I would prefer to see the path of the storm if that is available. But it all depends on what kinds of data can be gathered.
Epileptic-dan 2 points 2y ago
Here where I live, in Iowa, we had that derecho roll through with almost no warning with 120+ mph winds. Thankfully I was inside and heard the sirens, but others were not so lucky.
This would be a great tool.
Altie-McAltface 1 points 2y ago
Sounds like you're mostly thinking about radar, but you've also got some ideas about surface observation maps.

The most common product we get from radar is reflectivity. The colors on the map show how strong a signal was reflected back from the rain or other stuff in the air. Reflectivity is measured in decibels (dBZ), with the reference value (0 dBZ) representing roughly the return of a typical raindrop. Apps like RadarScope send this reflectivity data raw to the phone, which assembles a picture with a map overlay. Projecting a storm's path is also doable. I can see an app similar to RadarScope that uses the same raw data to generate a product consumable by the blind. I have sufficient vision to use the app with a magnifier, and don't really have experience with making visual data audibly accessible, but I thought I'd throw this out there to say that I think it's very doable.
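For reference, the dBZ scale described above is just a log transform of the linear reflectivity factor Z (in mm^6/m^3), so a sonification layer could bin it into a handful of intensity levels. The thresholds below follow rough conventional rain-intensity bands, but the bin names are made up for illustration:

```python
import math

def to_dbz(z_linear):
    """Convert linear reflectivity factor Z (mm^6/m^3) to decibels (dBZ).
    0 dBZ corresponds to Z = 1, roughly the return from a typical small raindrop."""
    return 10.0 * math.log10(z_linear)

def intensity_label(dbz):
    """Coarse bins that an audio layer might map to distinct sounds."""
    if dbz < 20:
        return "light"
    if dbz < 40:
        return "moderate"
    if dbz < 55:
        return "heavy"
    return "extreme"
```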

Lightning detection is another product that I think would be useful. Locating velocity couplets associated with tornadoes can also be more or less automated I believe.
Rethunker 1 points 2y ago
There's been a bit of academic work in this area. I'd recommend that you google to find out what research may already have been conducted. It'd be worth investing a day or two to figure out what's been tried, what evidence shows may or may not work, and so on.

A study by Knudsen and Holone included a test of gestural interaction with a weather map to yield distance and direction of a weather condition.

The paper is not directly downloadable at ResearchGate.

The paper was quoted in another paper I'm reading. Here's their description:


>A research project tested voice input compared to gestures and keyboard with visually impaired users [Knudsen and Holone 2012]. The researchers offered gestural input, on-screen keyboard input and speech input for browsing the web. They tested the system on a weather map. Gestures allowed the users to find weather in a given direction and distance. Voice input allowed the users to search for a specific time and date. Keyboard input allowed them to enter information in text fields to search for the same. Most users found speech input to be fastest and easiest even though it was more error prone. Gestures were not as useful, though the users liked them. The on-screen keyboard was found to be slow and tedious.

Apps are expensive to make because programmers with sufficient skill and experience are expensive to hire and tend to be busy. You might get lucky and find a programmer who has app development experience and months of free time to create the app.

This could be a good project for a group of undergraduate students studying computer science. Their grades would be their compensation. They would need regular check-ins with members of the blind community to ensure their design is on track, and that they are guided by what users actually want rather than what they think users want. We can certainly use more programmers who can start their careers with development experience in accessibility.

Good luck!
©2023 HumbleCat Inc   •   HumbleCat is a 501(c)3 nonprofit based in Michigan, USA.