Our first idea for an accessibility-focused app was a touch-based braille reader. Given the current technical limitations of consumer smartphones, there's no way to provide the three concurrent streams of tactile feedback necessary for braille reading. That constraint introduced us to a harder problem than reading itself: as a blind person, how do you navigate a large body of text, with no tactile feedback, without getting trapped in an endless stream of characters, and still maintain your privacy and independence?

The answer is that you don't.

What it does

Straightkey, or ... -.-, is an SMS client that allows navigation and conversation without the need to see what's on the display. Navigation through lists and between views is handled entirely with the phone's volume buttons, and text is entered as Morse code.

Straightkey's primary target demographic is blind users, but it can also benefit people with dexterity impairments and serve as a Morse code learning tool for the sighted.

Straightkey works with 6,814 Android phones.

How we built it


Straightkey's parameters are simple: it must provide the core functionality of the system SMS app, eliminate the cognitive load of memorizing and recalling the locations of onscreen elements, and preserve the user's privacy when texting around other people.

  • In the absence of braille, Morse code stood out as the most widely accepted standard for nonverbal, nonvisual communication that could be efficiently digitized. ITU-R M.1677 is the international standard, published by the ITU (a UN agency), and supports common accented characters, punctuation, and symbols in addition to alphanumeric characters.
  • Navigation was moved to the volume buttons, the only two tactile surfaces available for use on every smartphone. In translating these behaviors we decided on a strict metaphor that would be compatible with the user's understanding of the volume buttons as meaning up/more and down/less. This control scheme frees up the screen for precision-independent input, which is used for entering, manipulating, and sending text, as well as rereading the currently selected item.
  • Since the primary demographic doesn't need to see the display, the UI was designed with long battery life in mind. With a black background and whitespace-driven layout, Straightkey taxes the system minimally and illuminates only a small fraction of the screen, which saves power on AMOLED displays, where black pixels are fully off.
  • For sighted users, Straightkey supports standard taps and swipes for navigation when the display is not needed for text input. All important elements are white with crisp edges for high contrast and glanceability, reducing friction for dexterity-impaired users. A monospace font and generous padding make skimming through items much easier, with the added benefit of complementing Morse code's consistent timing.
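The ITU character set mentioned above boils down to a lookup table. Here's a minimal, illustrative encoder sketch in Java (not Straightkey's actual implementation); it covers letters and digits only, marks word gaps with `/`, and silently skips unsupported characters:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative ITU Morse encoder sketch (letters and digits only).
public class MorseEncoder {
    private static final Map<Character, String> CODE = new HashMap<>();
    static {
        String[] letters = {
            ".-", "-...", "-.-.", "-..", ".", "..-.", "--.", "....", "..",
            ".---", "-.-", ".-..", "--", "-.", "---", ".--.", "--.-", ".-.",
            "...", "-", "..-", "...-", ".--", "-..-", "-.--", "--.."
        };
        for (int i = 0; i < 26; i++) CODE.put((char) ('a' + i), letters[i]);
        String[] digits = {
            "-----", ".----", "..---", "...--", "....-",
            ".....", "-....", "--...", "---..", "----."
        };
        for (int i = 0; i < 10; i++) CODE.put((char) ('0' + i), digits[i]);
    }

    // One space between characters, "/" between words.
    public static String encode(String text) {
        StringBuilder out = new StringBuilder();
        for (char c : text.toLowerCase().toCharArray()) {
            if (c == ' ') {
                out.append("/ ");
            } else if (CODE.containsKey(c)) {
                out.append(CODE.get(c)).append(' ');
            } // characters outside the table are skipped in this sketch
        }
        return out.toString().trim();
    }

    public static void main(String[] args) {
        System.out.println(encode("sk")); // the "... -.-" in the app's name
    }
}
```

Encoding the app's nickname, `sk`, yields `... -.-`, which is where the "... -.-" in the intro comes from.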

We architected Straightkey informally on paper, mocked it up in Sketch, and organized project assets with Zeplin.


Mike spent all night in Android Studio and wants to die now.

Challenges we ran into

  • Parsing Morse code input into a verbal language is a well-known challenge: human timing is inconsistent, which makes it difficult to determine whether a pause separates the elements of a single character, two characters, or two words. Our solution was to provide basic speed presets, each of which accommodates a margin of error appropriate to the corresponding level of proficiency.
  • On screens with text input, we render Morse code in addition to English so sighted users can see the Morse representation of what they're typing. Rendering the code in a manner that preserved spacing and readability across multiple lines required a few hours of tweaking.
  • Only one of us is a developer.
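The timing ambiguity described in the first bullet can be illustrated with a simple threshold classifier. This is a hedged sketch, not Straightkey's parser: it assumes standard Morse timing (intra-character gap = 1 dot unit, inter-character gap = 3 units, word gap = 7 units) plus a per-preset dot length, and places decision boundaries at the midpoints between the ideal gaps to absorb human inconsistency:

```java
// Illustrative pause classifier (not Straightkey's actual parser).
// Standard Morse timing: intra-character gap = 1 dot unit,
// inter-character gap = 3 units, word gap = 7 units.
public class GapClassifier {
    public enum Gap { INTRA_CHAR, INTER_CHAR, WORD }

    private final long dotMs; // dot length for the chosen speed preset

    public GapClassifier(long dotMs) {
        this.dotMs = dotMs;
    }

    // Boundaries sit at 2 units (between 1 and 3) and 5 units
    // (between 3 and 7), so a sloppy pause near either ideal length
    // still lands in the intended bucket.
    public Gap classify(long pauseMs) {
        if (pauseMs < 2 * dotMs) return Gap.INTRA_CHAR;
        if (pauseMs < 5 * dotMs) return Gap.INTER_CHAR;
        return Gap.WORD;
    }

    public static void main(String[] args) {
        GapClassifier beginner = new GapClassifier(300); // slow preset
        System.out.println(beginner.classify(250));  // INTRA_CHAR
        System.out.println(beginner.classify(800));  // INTER_CHAR
        System.out.println(beginner.classify(2500)); // WORD
    }
}
```

A faster preset just means a smaller `dotMs`, which tightens all three windows proportionally; the preset names and values here are made up for the example.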

Accomplishments that we're proud of

  • Ask Mike about the oscillator he made to capture taps and releases for Morse code input; it's pretty rad.

What we learned

Morse code

What's next for Straightkey

  • An easy, conversational onboarding experience so users can quickly learn navigation and gestures
  • Adaptive input rates that allow more variance in typing speed
  • Call initiation by holding the phone up to your ear in a message thread
  • Custom keyboard to allow systemwide Morse code input (and screen reading, potentially)