💡 Inspiration The rise of Decentralized Physical Infrastructure Networks (DePIN) is revolutionizing how we collect planetary data. However, the current landscape is fundamentally flawed: participating in weather or environmental DePINs typically requires users to buy proprietary hardware nodes costing upwards of $500. This concentrates coverage in a handful of wealthy regions and prices out everyday people.

We realized that every single operating smartphone in the world is a dormant, pocket-sized supercomputer packed to the brim with environmental sensors. What if we could completely strip away the hardware entry barriers? We set out to democratize infrastructure by turning existing Android devices into a globally distributed sensor grid, allowing anyone to earn Bitcoin for streaming hyper-local street data to institutional buyers.

🌐 What it does PulseMesh operates as a two-sided marketplace designed to aggregate and monetize hyper-local environmental data autonomously:

For the Contributor (Data Providers): PulseMesh acts as a lightweight, "set-it-and-forget-it" Android application. Once installed and granted permissions, a foreground service silently polls the device's hardware sensors every 60 seconds, collecting barometric pressure (valuable for weather modeling), ambient light (lux), and ambient noise (dB). The data is batched into 5-minute spatial windows and shipped to our backend. In exchange for providing validated, high-quality data, the user's built-in Lightning wallet (powered by the Breez SDK) is credited with satoshis, which they can withdraw seamlessly over the Lightning Network.
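The batching step above can be sketched as a pure function: samples are keyed by H3 cell plus the start of their 5-minute window, then averaged before upload. This is a minimal illustration, not the app's actual code; `Reading` and the pressure-only payload are simplified stand-ins for the real sample type.

```kotlin
// Illustrative 5-minute spatial windowing, simplified to a single channel.
data class Reading(val h3Cell: String, val epochSec: Long, val pressureHpa: Double)

// Window key = (cell, window start), where windows are aligned 300 s buckets.
fun batchKey(r: Reading, windowSec: Long = 300): Pair<String, Long> =
    r.h3Cell to (r.epochSec / windowSec) * windowSec

// Average each window's samples before shipping the batch to the backend.
fun batch(readings: List<Reading>): Map<Pair<String, Long>, Double> =
    readings.groupBy { batchKey(it) }
        .mapValues { (_, rs) -> rs.map { it.pressureHpa }.average() }
```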

For the Buyer (Data Consumers): Hedge funds, meteorologists, or urban planners log onto our interactive web dashboard to view global heatmaps of the sensor data. All queries for the raw sensor data are blocked out of the box by an Aperture reverse proxy enforcing the L402 protocol. When a buyer wants data for a specific Uber H3 grid cell, their software receives a 402 Payment Required challenge alongside a Lightning invoice. They pay the micro-transaction instantly, receive a cryptographically signed macaroon, and immediately gain access to the data stream.
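From the buyer's side, the handshake boils down to parsing the challenge and replaying the macaroon with the payment preimage. The helper names below are mine, not PulseMesh's; the header shape (`L402 macaroon="…", invoice="…"`) follows the L402 convention Aperture uses, and actually paying the invoice is out of scope here.

```kotlin
// Hypothetical buyer-side helpers for the L402 handshake.
data class L402Challenge(val macaroon: String, val invoice: String)

// Parse a WWW-Authenticate header of the form:
//   L402 macaroon="<base64>", invoice="<bolt11>"
fun parseL402Challenge(header: String): L402Challenge {
    fun field(name: String): String =
        Regex("$name=\"([^\"]+)\"").find(header)?.groupValues?.get(1)
            ?: error("missing $name in L402 challenge")
    return L402Challenge(field("macaroon"), field("invoice"))
}

// After the invoice settles, the payment preimage proves it; the pair is
// sent as the credential on the retried request.
fun authorizationHeader(macaroon: String, preimageHex: String): String =
    "L402 $macaroon:$preimageHex"
```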

⚙️ How I built it PulseMesh is split into three primary engineering components:

  1. The Android Sensor Node & Integrated Lightning Wallet I built the mobile frontend natively in Kotlin using Jetpack Compose and Clean Architecture. The app uses an Android ForegroundService to silently wake the phone's barometer, light sensor, and microphone every 60 seconds without draining the battery.

To compensate contributors autonomously, I integrated the Breez SDK, giving every user a non-custodial, nodeless Lightning wallet right inside the app. When environmental data passes backend validation, contributors simply click "Claim" to receive instant streaming micropayments in Satoshis.

  2. The Ktor Backend & Geospatial Engine The backend is a Ktor service, built defensively to handle thousands of concurrent geospatial inserts. Rather than saving generic longitude/latitude clusters, I indexed all spatial data using Uber's H3 hierarchical hex grid at resolution 9, whose cells average roughly 0.1 km² (street-block scale).
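For context, mapping a coordinate to a resolution-9 cell is a one-liner with Uber's official `h3-java` bindings (this assumes the `com.uber:h3` dependency is on the classpath; the coordinates are arbitrary):

```kotlin
import com.uber.h3core.H3Core

// Resolution 9 cells average roughly 0.1 km^2: fine enough to distinguish
// city blocks while still pooling several contributors per cell.
val h3 = H3Core.newInstance()
val cell: Long = h3.latLngToCell(40.7484, -73.9857, 9)  // v4 API; v3 used geoToH3
```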

Because sensor data is inherently time-series, a vanilla relational schema doesn't cut it. The data is pushed into TimescaleDB hypertables running atop PostgreSQL and PostGIS, which efficiently roll up hourly sensor averages for buyers.
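A sketch of what such a schema can look like; the table and column names are my illustration, not the project's actual DDL:

```sql
-- Hypertable keyed on time, partitioned automatically by TimescaleDB.
CREATE TABLE sensor_readings (
  time     TIMESTAMPTZ NOT NULL,
  h3_cell  TEXT        NOT NULL,
  pressure DOUBLE PRECISION,
  lux      DOUBLE PRECISION,
  noise_db DOUBLE PRECISION
);
SELECT create_hypertable('sensor_readings', 'time');

-- Continuous aggregate serving hourly per-cell averages to buyers.
CREATE MATERIALIZED VIEW hourly_by_cell
WITH (timescaledb.continuous) AS
SELECT time_bucket('1 hour', time) AS bucket,
       h3_cell,
       avg(pressure) AS avg_pressure,
       avg(noise_db) AS avg_noise_db
FROM sensor_readings
GROUP BY bucket, h3_cell;
```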

  3. The L402 Marketplace & Dashboard UI Finally, data buyers use a custom-built web dashboard to view heatmaps of available sensor data. When they attempt to query the API for raw data, the request is intercepted by an Aperture L402 reverse proxy connected to a custom LND node. The HTTP 402 Payment Required response carries an instant Lightning invoice challenge. Only once the invoice is paid does the reverse proxy accept the macaroon credential and serve the environmental data.

🚧 Challenges I ran into Building a background-first DePIN app runs headlong into the restrictions of modern mobile operating systems.

Android Doze & Battery Optimization: Keeping a WorkManager and foreground service alive for real-time sensor polling required careful threading and sticky notifications to avoid being killed by the Android OS's aggressive battery restrictions.

Privacy by Design: I needed to measure environmental noise, but actively recording audio would be a catastrophic privacy invasion. I had to build a custom SplCalculator that records 1 second of raw PCM data, computes the root mean square (RMS) entirely in memory, returns only the decibel intensity, and immediately discards the buffer, so no human voice is ever stored or transmitted.

L402 Lifecycle: Properly managing L402 macaroon caching on both the buyer interfaces and the backend validation meant intercepting and verifying cryptographic Authorization headers on every request.

🏆 Accomplishments that we're proud of

Frictionless Native Lightning Integration: Successfully embedding the Breez SDK into the Kotlin Jetpack Compose app. Bringing non-custodial, nodeless Lightning infrastructure directly to mobile users—without forcing them to understand routing liquidity or channel management—is a massive leap for mainstream crypto adoption.

The 4-Stage Mathematical Validation Pipeline: Crowdsourced data is famously noisy. We are incredibly proud of the validation logic on our Ktor backend. Before a user is ever rewarded, their data undergoes strict bounds-checking, $\sigma$-deviation neighbor consensus (cross-referencing other active nodes in the same geospatial cell), and Proof-of-Location tracking that instantly flags device teleportation and spoofing.
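To make the consensus and Proof-of-Location stages concrete, here is a minimal sketch of the two checks. The function names, the 3σ threshold, and the speed ceiling are illustrative assumptions, not the production values.

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// A reading passes neighbor consensus when it sits within k standard
// deviations of the other active nodes in the same cell.
fun passesConsensus(value: Double, neighbors: List<Double>, k: Double = 3.0): Boolean {
    if (neighbors.size < 2) return true  // too few peers to form a quorum
    val mean = neighbors.average()
    val sd = sqrt(neighbors.map { (it - mean) * (it - mean) }.average())
    return if (sd == 0.0) value == mean else abs(value - mean) <= k * sd
}

// Proof-of-Location sanity check: flag "teleporting" devices whose implied
// speed between consecutive reports exceeds a plausible ceiling.
fun plausibleMove(distanceKm: Double, elapsedSec: Long, maxKmh: Double = 150.0): Boolean =
    elapsedSec > 0 && distanceKm / (elapsedSec / 3600.0) <= maxKmh
```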

True 'Privacy by Design': Utilizing microphones in the background is a terrifying concept for users. We are intensely proud of the algorithmic approach we took: a system that buffers raw PCM audio for exactly one second, calculates the Sound Pressure Level (dB) locally at the edge, and immediately discards the buffer. No human voice is ever recorded, written to disk, or transmitted to our servers.

End-to-End L402 Implementation: We proved that SaaS subscriptions are a thing of the past. Getting the Aperture gateway to speak perfectly with an LND node and securely wrap our Kotlin REST API behind instantaneous, per-request Lightning paywalls felt incredibly validating as the puzzle pieces snapped into place!

🧠 What I learned From a technical standpoint, I significantly leveled up my architectural design in Kotlin. I learned how to build intricate continuous-aggregate queries inside TimescaleDB and map them onto H3 geospatial buckets.

More importantly, I learned the math behind processing audio signals entirely within in-memory buffers. I implemented the Sound Pressure Level ($L_{SPL}$) calculation dynamically at the edge:

$$ L_{SPL} = 20 \log_{10} \left( \frac{p_{\text{rms}}}{p_{\text{ref}}} \right) + C_{\text{device}} $$

Where $p_{\text{rms}}$ is the root-mean-square of the PCM buffer and $C_{\text{device}}$ is a dynamically injected calibration offset specific to that exact Android hardware model (e.g., Pixel 9 vs Galaxy S24).
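As a concrete sketch of this formula (my code, not the app's exact SplCalculator): with 16-bit PCM it is convenient to take $p_{\text{ref}}$ as the full-scale amplitude, so the log term yields a level relative to full scale and $C_{\text{device}}$ lifts it to an absolute SPL estimate.

```kotlin
import kotlin.math.log10
import kotlin.math.sqrt

// Root mean square of a raw 16-bit PCM buffer; this scalar is the only
// value that survives, the buffer itself is discarded by the caller.
fun rms(pcm: ShortArray): Double {
    val meanSquare = pcm.fold(0.0) { acc, s -> acc + s.toDouble() * s.toDouble() } / pcm.size
    return sqrt(meanSquare)
}

// L_SPL = 20 * log10(p_rms / p_ref) + C_device, with p_ref = full scale,
// so a full-scale signal with zero offset reads 0 dB.
fun splDb(pcm: ShortArray, calibrationOffset: Double = 0.0): Double =
    20.0 * log10(rms(pcm) / Short.MAX_VALUE.toDouble()) + calibrationOffset
```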

Most fundamentally, I learned the mechanics of the L402 protocol, and how frictionless the web becomes when REST APIs are monetized via micropayments verified by the Lightning Network instead of legacy subscription gateways like Stripe.

🚀 What's next for PulseMesh

The zero-hardware barrier changes everything. Next, I plan to launch the application on the Google Play Store, aggregate a wider array of device-specific calibration offsets for audio normalization, and build enterprise SDKs so weather agencies can effortlessly consume PulseMesh data endpoints directly in their forecast models.
