Submission Disasters

Two things went horribly wrong when finalising the submission: the video uploaded to YouTube had the wrong aspect ratio, which I didn't notice at the time, and worse, almost the entire submission text was accidentally deleted when submitting. So, here's the full text (and I honestly feel stupid for taking nearly a day to realise we could post updates!).

Inspiration

HyperCard.

Early in my career, Apple brought out HyperCard, something that anybody could use to make decks of cards that reacted to buttons and taps by changing content, showing images and playing sound. This was before the web.

The big vision for Touchgram is to give users that same ability to create interactive experiences, for the touch generation.

I wrote a couple of longer-form pieces about the personal background and the technical inspiration, including HyperCard, which led to Touchgram.

More recently, I wrote The Foot Mom about how a conversation with a Mom at a local meetup helped me focus on the simple value that could be delivered.

What it does

Touchgram works inside Apple Messages, mostly as an iMessage App Extension, providing a way to build up multi-page messages that react to touch.

You could think of it like a small version of PowerPoint with added touch recognition, inside a message. Or, like interactive greeting cards on your phone.

Currently you can trigger sounds, or change to a different page with a transition effect. You can see simple examples, including the range of touch types, on the Touchgram home page.
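Since Touchgram's playback runs on SpriteKit, triggering a sound or a page change maps naturally onto that framework's actions and transitions. The sketch below is illustrative, not Touchgram's actual code; the trigger names, file name, and treating each page as an SKScene are my assumptions.

```swift
import SpriteKit

// Hypothetical sketch: firing a trigger either plays a sound on the
// current page or swaps to the next page with a built-in effect.
func fire(trigger: String, in view: SKView) {
    switch trigger {
    case "playSound":
        // Play a bundled audio file without blocking other actions.
        view.scene?.run(SKAction.playSoundFileNamed("chime.caf",
                                                    waitForCompletion: false))
    case "nextPage":
        // Present another page (modelled here as an SKScene) with
        // one of SpriteKit's stock transition effects.
        let nextPage = SKScene(size: view.bounds.size)
        let effect = SKTransition.doorsOpenVertical(withDuration: 0.6)
        view.presentScene(nextPage, transition: effect)
    default:
        break
    }
}
```

SpriteKit ships a range of other stock transitions (cross-fade, push, flip), which is one reason a 2D game engine suits this kind of page-based playback.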

The messages are delivered like any other message in Apple Messages - you see a bubble in the conversation with someone. When you tap the bubble, it launches a nearly full-screen experience. From then on, what happens is up to the person who made the message.
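On the extension side, that tap-to-launch flow corresponds to standard Messages framework callbacks. This is a hedged sketch of how such an extension can request the near-full-screen presentation; everything beyond the `MSMessagesAppViewController` APIs themselves (the comments, what playback does) is assumption, not Touchgram's implementation.

```swift
import Messages

// Minimal sketch of an iMessage app extension: when the recipient taps
// the message bubble, Messages activates the extension and calls
// didSelect. The extension can then ask for the "expanded"
// (nearly full-screen) presentation style.
class MessagesViewController: MSMessagesAppViewController {

    override func didSelect(_ message: MSMessage, conversation: MSConversation) {
        // The tapped bubble's MSMessage identifies the content to play.
        requestPresentationStyle(.expanded)
    }

    override func didTransition(to presentationStyle: MSMessagesAppPresentationStyle) {
        if presentationStyle == .expanded {
            // Begin playing back the interactive message here.
        }
    }
}
```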

The next stage in Touchgram's evolution is to provide a store where people can give away or resell their artworks, sounds and even fully composed messages. Adding Bitmoji as a source of great stickers is a natural fit for people composing experiences.

The first version just allows choosing Bitmoji stickers as art backgrounds for pages. This uses the same mechanism that lets someone send a picture or sound from another app to Touchgram, or transfer them from a Mac.

These sound and picture files are managed by the Touchgram app and made available inside the different editors when you make a message. The video shows a couple of saved stickers being used as page backgrounds.

The next step is to also use those images as stickers on top of a page, building up a collage over the background rather than having one image be the entire background. Once that is possible, the stickers will react to touch.

That could be a simple choose-your-own-adventure where touching different stickers takes you off to different pages. Or, an interactive experience on the page where stickers move, fade, are replaced or go up in flames. Sounds might play depending on how and where you touch.
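In SpriteKit terms, those touch-reactive stickers would just be named nodes responding to touch events. A minimal sketch, assuming stickers are tagged with the node name "sticker" and that fading out is one of the available reactions (both assumptions, not Touchgram's design):

```swift
import SpriteKit

// Illustrative page scene: any sticker node under the touch point
// reacts - here by fading out and removing itself, standing in for
// the move / replace / burn reactions described above.
class PageScene: SKScene {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)
        for node in nodes(at: point) where node.name == "sticker" {
            node.run(SKAction.sequence([
                SKAction.fadeOut(withDuration: 0.4),
                SKAction.removeFromParent()
            ]))
        }
    }
}
```

Routing different stickers to different pages, for the choose-your-own-adventure case, would be a matter of storing a destination page with each node and triggering a scene transition instead of the fade.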

I'm still hoping to work around some iMessage issues and make the original sticker picker available directly inside the iMessage composer.

How I built it

The first version was a cross-platform prototype in a 2D gaming engine (Cocos2D-X), which was good enough to get a very rough demo of the Mom's foot-tickling scenario going. It worked on Android and iOS.

That was in early 2015. Shortly after that prototype was finished, my co-founder quit, unhappy about the potential erotic use of the app (blame the sexy laugh track on the foot-tickling example). Ironically, this echoes the early Snapchat backlash. I had to put the company on hold.

When I restarted in 2017, a mentor advised trying to get it working inside either Apple Messages or Facebook Messenger. Apple provides a great, very private extension capability, so the prototype was rewritten as an iMessage extension: it was ported from C++ to Swift and the game engine changed to SpriteKit. I wanted to use as much native Apple tech as possible.

The first version hit the App Store in Sep 2019 and has been slowly improved since. I've concentrated on iterating the features with user feedback, alternating between building core playback features and improving the user experience.

A rich document editor inside 90% of the iPhone screen is a UX challenge.

Built With

sprite-kit, swift
