Inspiration

At the time the Bolt hackathon kicked off, I started reading "Leonardo da Vinci" by Walter Isaacson.

A sentence that really struck me was a simple note Leonardo left to himself:

"Describe a woodpecker's tongue".

Here was one of our greatest minds wondering at the smallest of life's mysteries. And it all started from wonder paired with intense focus.

And I wondered - what would happen to Leonardo today? His curiosity would most likely be algorithmically gamed by social media platforms. The average teenager now spends 9+ hours a day on their phone; that's 17,000+ hours during their formative teenage years. All hope of mastery is lost.

Woodpecker was inspired by the idea of building a digital environment that protects and cultivates curiosity — rather than eroding it. An interface designed not for engagement metrics, but for depth, wonder, and creative flow.

(And to CREATE BOLT WEBSITES WITH SMART PEN AND PAPER!)

What it does

[STAY TO THE END FOR BOLT INCEPTION] Woodpecker is a paper-like, AI-assisted OS layer that runs across all your devices.

It helps solve two key issues I found as I tried to regain focus: the need to be available in an always-on world, and the need to summon information on demand. These are often the moments we reach for our phones and get sucked down a dopamine rabbit hole.

Woodpecker helps you send and receive messages without constant notifications, sketch ideas and get instant feedback, and explore quick, playful curiosities (like a math puzzle or story fragment) through simple gestures.

Key features include:

Woodpecker Gesture!

Circle & Send: A low-distraction messaging system.

Idea Feedback Loop: Jot down an idea, circle it, and get AI reflections instantly.

All of it designed around preserving focus and encouraging creative expression.

And finally, we developed a custom BOLT INTEGRATION to allow you to CREATE BOLT WEBSITES FROM SMART PAPER!

How we built it

I designed it as a web app that can sit on top of your favourite e-ink display or second brain, like Obsidian.

It has Twilio integration for native SMS messaging, LLM integration for multimodal recognition, smart task delegation, and of course, MULTIMODAL BOLT WEBSITE INTEGRATION!

Challenges we ran into

The main challenge was the "woodpecker" gesture. We tried a double loop instead of a press and hold. As I was vibe coding, I kept finding edge cases that would trigger the double loop (e.g. writing an "a" or a "p").

I got around this by logging all gestures as unit tests; whenever I found a false positive, I would take the unit test from the browser logs and ask the AI to fix it. This ensured I wasn't regressing on past fixes while zeroing in on the right detection thresholds.
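The replay loop can be sketched like this. Below is a hypothetical TypeScript version (not the project's actual code): a recorded pointer trace becomes a fixture, and the classifier flags a double loop when the stroke's total turning angle approaches two full revolutions (≈4π radians) — which is what separates a deliberate double loop from the single loop inside an "a" or "p".

```typescript
type Pt = { x: number; y: number };

// Total signed turning angle along a stroke, in radians.
// A single closed loop accumulates ~2π; a double loop ~4π.
function totalTurning(points: Pt[]): number {
  let total = 0;
  let prevHeading: number | null = null;
  for (let i = 1; i < points.length; i++) {
    const dx = points[i].x - points[i - 1].x;
    const dy = points[i].y - points[i - 1].y;
    if (dx === 0 && dy === 0) continue; // skip duplicate samples
    const heading = Math.atan2(dy, dx);
    if (prevHeading !== null) {
      // Normalize the heading change into (-π, π] so crossing
      // the ±π boundary doesn't produce a spurious jump.
      let delta = heading - prevHeading;
      while (delta > Math.PI) delta -= 2 * Math.PI;
      while (delta <= -Math.PI) delta += 2 * Math.PI;
      total += delta;
    }
    prevHeading = heading;
  }
  return total;
}

// Rough double-loop check: roughly two full revolutions
// in one direction (3.5π leaves slack for sloppy strokes).
function isDoubleLoop(points: Pt[]): boolean {
  return Math.abs(totalTurning(points)) >= 3.5 * Math.PI;
}
```

A false positive found in the wild would be exported from the browser console as a `Pt[]` array and added as another fixture, so later threshold tweaks can't silently regress it.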

But in the end I decided press and hold was safer.
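By contrast, press and hold is simple to detect robustly: the pointer just has to stay down long enough without drifting. A minimal sketch with assumed thresholds (500 ms hold, 10 px drift tolerance — illustrative values, not the actual implementation):

```typescript
type PointerSample = { x: number; y: number; t: number }; // t in ms

// Press-and-hold: the pointer is down for at least holdMs
// and every sample stays within tolerancePx of the start point.
function isPressAndHold(
  samples: PointerSample[],
  holdMs = 500,
  tolerancePx = 10
): boolean {
  if (samples.length < 2) return false;
  const first = samples[0];
  const last = samples[samples.length - 1];
  if (last.t - first.t < holdMs) return false; // released too early
  return samples.every(
    (s) => Math.hypot(s.x - first.x, s.y - first.y) <= tolerancePx
  );
}
```

Because the check is a pure function over the sampled trace, the same browser-log fixtures used for the double loop work unchanged here.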

The Bolt integration was also tricky: I had to build a browser automation. I first tried an LLM-driven tool called Browser Use, but it was slow and expensive, so I used an LLM to write a direct Playwright script with good instructions instead.

Finally, the smart messaging from paper took a lot of design iteration.

Accomplishments that we're proud of

I feel proud that I built something better than Humane did in a few days 🤣. But in all seriousness, I genuinely find this useful, and I think there are some seeds of ideas here for how we may interact with technology moving forwards.

Phones and tablets will likely be banned from a lot of schools, but we still want our kids to get the upsides of AI. This could be the solution.

What we learned

This really advanced my thinking about how we can completely reimagine interfaces for an AI age. With advances in vision models, we can build more organic UX and UI.

What's next for Woodpecker

Next steps are wrapping an Android app around it and running it on a Boox, so I have true smart paper!
