Inspiration

A map is, by design, a very visual medium. Paper maps can be augmented for use by vision-impaired individuals, but what about interactive digital maps on a mobile device?

What it does

The app runs on an iPhone or iPad with Apple's VoiceOver technology enabled. VoiceOver speaks the names of user interface elements as the user taps them, providing audible guidance on the app's functionality. The "Accessible Map" app extends this feature to provide more mapping-centric information to the user. This functionality allows the user to hear:

  • the address of their current location
  • the number of points of interest near their location
  • the name of a tapped-on point of interest
  • audible directions to a tapped-on location

How we built it

The app is created in Swift by combining the ArcGIS Runtime SDK for iOS's mapping, location and routing capabilities with Apple's VoiceOver technology. VoiceOver allows you to specify textual labels for user interface elements and to trigger audible cues and information in response to app events.
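
A minimal sketch of that pattern, using the current Swift names for the UIAccessibility API (the class and property names here are illustrative, not our actual code):

```swift
import UIKit

// Sketch only: an accessibility label so VoiceOver speaks "Navigate, button"
// when the control is selected, plus a helper that posts an audible
// announcement in response to an app event.
class CalloutViewController: UIViewController {
    let navigateButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        navigateButton.isAccessibilityElement = true
        navigateButton.accessibilityLabel = "Navigate"  // VoiceOver appends "button"
        view.addSubview(navigateButton)
    }

    /// Speak arbitrary text through VoiceOver in response to an app event.
    func announce(_ text: String) {
        if UIAccessibility.isVoiceOverRunning {
            UIAccessibility.post(notification: .announcement, argument: text)
        }
    }
}
```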

Example: when the user selects the callout's Navigate button, the phrase "Navigate button" is spoken. Activating the button initiates the routing task; when the route task completes, the list of directions to the destination is spoken.
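
The flow looks roughly like the sketch below, based on the ArcGIS Runtime SDK for iOS 100.x routing API; the route service URL and the stop points are placeholders, not our actual values:

```swift
import ArcGIS
import UIKit

// Placeholder route service URL; the real app may point at a different service.
let routeTask = AGSRouteTask(url: URL(string:
    "https://route.arcgis.com/arcgis/rest/services/World/Route/NAServer/Route_World")!)

func navigate(from start: AGSPoint, to destination: AGSPoint) {
    routeTask.defaultRouteParameters { params, error in
        guard let params = params, error == nil else { return }
        params.returnDirections = true
        params.setStops([AGSStop(point: start), AGSStop(point: destination)])

        routeTask.solveRoute(with: params) { result, error in
            guard let route = result?.routes.first, error == nil else { return }
            // Route task complete: speak the list of directions to the destination.
            let directions = route.directionManeuvers
                .map { $0.directionText }
                .joined(separator: ". ")
            UIAccessibility.post(notification: .announcement, argument: directions)
        }
    }
}
```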

We chose a number of actions to provide audible cues for, such as the address of the current location, the number of POIs within a given radius of the user, and walking directions, among others.
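
As an example, the "address of the current location" cue can be sketched like this; the World Geocoding service URL is an assumption and the function name is illustrative:

```swift
import ArcGIS
import UIKit

// Assumed locator: Esri's public World Geocoding service.
let locatorTask = AGSLocatorTask(url: URL(string:
    "https://geocode.arcgis.com/arcgis/rest/services/World/GeocodeServer")!)

func announceCurrentAddress(for mapView: AGSMapView) {
    // The map view's location display provides the device's current position.
    guard let position = mapView.locationDisplay.location?.position else { return }
    locatorTask.reverseGeocode(withLocation: position) { results, error in
        guard let address = results?.first?.label, error == nil else { return }
        UIAccessibility.post(notification: .announcement,
                             argument: "You are near \(address)")
    }
}
```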

Challenges we ran into

While the VoiceOver implementation was fairly straightforward, figuring out when and how VoiceOver triggered certain cues (and disabling the ones we didn't want) was a little tricky.

With VoiceOver enabled, "tapping a button" actually means tapping once to select the button, then double-tapping to activate it. This fundamental change in tap behavior did not play well with the MapView, and we were never sure how to pan or zoom the map using gestures. We may simply not have figured it out, or it may be something the MapView needs to take into account at the SDK level.
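
One possible mitigation, which we have not verified against this issue, is UIKit's direct-interaction trait, which lets touch gestures pass straight through to a view while VoiceOver is running; a minimal, hypothetical sketch:

```swift
import ArcGIS
import UIKit

// Hypothetical: mark the map view as a direct-interaction accessibility element
// so standard pan/zoom gestures reach the AGSMapView even with VoiceOver on.
func enableDirectMapGestures(on mapView: AGSMapView) {
    mapView.isAccessibilityElement = true
    mapView.accessibilityTraits = .allowsDirectInteraction
}
```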

Also, tapping on the map with VoiceOver enabled did not provide the actual tapped location, but rather the center of the screen. The correct AGSGeoViewTouchDelegate callback was received, but the map point it reported was incorrect.
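
For context, the delegate callback in question looks like this (ArcGIS Runtime 100.x; the view controller and outlet names are illustrative):

```swift
import ArcGIS
import UIKit

// With VoiceOver enabled, this callback fired as expected, but the reported
// map point corresponded to the screen center rather than the tapped location.
class MapViewController: UIViewController, AGSGeoViewTouchDelegate {
    @IBOutlet weak var mapView: AGSMapView!   // illustrative outlet name

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.touchDelegate = self
    }

    func geoView(_ geoView: AGSGeoView, didTapAtScreenPoint screenPoint: CGPoint,
                 mapPoint: AGSPoint) {
        // Identify the tapped point of interest here and announce its name.
        print("screen point: \(screenPoint), map point: \(mapPoint)")
    }
}
```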

Accomplishments that we're proud of

We have a functioning app that demonstrates the major features we wanted to highlight. We successfully integrated VoiceOver to provide useful enhancements to the map experience and were able to incorporate a number of ArcGIS technologies, such as routing, feature services, location awareness, geocoding and vector base maps.

What we learned

  • a lot about VoiceOver!
  • possible conflicts between the MapView and VoiceOver
  • some cool Swift stuff (unowned self, DispatchQueue.main.asyncAfter, DispatchGroup.enter(), DispatchGroup.leave()); a small example follows below
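
An illustration of the Grand Central Dispatch pieces listed above, with placeholder work items rather than our actual code:

```swift
import Foundation

let group = DispatchGroup()

group.enter()
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    // e.g. wait briefly before posting a VoiceOver announcement
    group.leave()
}

group.enter()
DispatchQueue.global().async {
    // e.g. background work such as preparing route parameters
    group.leave()
}

group.notify(queue: .main) {
    // Both tasks finished; safe to update the UI or speak a summary.
    print("All work complete")
}
```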

What's next for Accessible Map

  • styling of the vector base map to be more accessible to low-vision users, such as higher-contrast colors and increased label sizes
  • providing actual navigation as the user traverses the supplied route, speaking the next direction as the user nears the end of the current maneuver (a rough sketch of this idea follows)
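
A rough, hypothetical sketch of that turn-by-turn idea, using Core Location distances and a placeholder Step type rather than the route's actual direction maneuvers:

```swift
import CoreLocation
import UIKit

// Hypothetical: when the user comes within a threshold of the end of the
// current maneuver, speak the next direction. Maneuver endpoints are modeled
// as CLLocations for simplicity; the real app would derive them from the route.
class SpokenNavigator: NSObject, CLLocationManagerDelegate {
    struct Step {
        let text: String          // e.g. "Turn left onto Main Street"
        let endpoint: CLLocation  // where the maneuver ends
    }

    private let manager = CLLocationManager()
    private var steps: [Step]
    private let threshold: CLLocationDistance = 20  // meters; an assumption

    init(steps: [Step]) {
        self.steps = steps
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last, let step = steps.first else { return }
        if current.distance(from: step.endpoint) < threshold {
            steps.removeFirst()
            if let next = steps.first {
                UIAccessibility.post(notification: .announcement, argument: next.text)
            }
        }
    }
}
```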

Built With

  • arcgis-runtime-sdk-for-ios
  • ios
  • voiceover