There's been a bit of academic work in this area. I'd recommend googling to find out what research has already been conducted. It would be worth investing a day or two to learn what's been tried, what the evidence suggests does or doesn't work, and so on.
A study by Knudsen and Holone included a test of gestural interaction with a weather map, which let users query the distance and direction of a weather condition.
The paper is not directly downloadable at ResearchGate. It was quoted in another paper I'm reading; here's their description:
>A research project tested voice input compared to gestures and keyboard with visually impaired users \[Knudsen and Holone 2012\]. The researchers offered gestural input, on-screen keyboard input and speech input for browsing the web. They tested the system on a weather map. Gestures allowed the users to find weather in a given direction and distance. Voice input allowed the users to search for specific time and date. Keyboard input allowed them to enter information in text fields to search for the same. Most users found speech input to be fastest and easiest even though it was more error prone. Gestures were not as useful though the users liked them. On screen keyboard was found to be slow and tedious.
Apps are expensive to make because programmers with sufficient skill and experience are expensive to hire and tend to be busy. You might get lucky and find a programmer who has app development experience and months of free time to build the app.
This could be a good project for a group of undergraduate computer science students, with their grades as compensation. They would need regular check-ins with members of the blind community to ensure their design stays on track, and that they are guided by what users actually want rather than what they think users want. We can certainly use more programmers who start their careers with development experience in accessibility.
Good luck!