We place a high value on flow state (‘flow’): being in the zone, completely immersed. Functional near-infrared spectroscopy (fNIRS) data enables meaningful insight into one’s flow patterns, and we think fNIRS has reached a point of favourable cost-to-utility. Cost as in: hardware cost & invasiveness, privacy management, etc. Utility as in: sample accuracy, new dapp tech (e.g. TEE compute), capacity for fine-grained ML user-trajectory influence, and mobile stimuli to close the feedback loop.

What it does

uflo steers users toward flow in a private & maximally trustless fashion.


uflo obtains data from: 1) a hardware headset, 2) a user’s phone sensors, and 3) local device & remote API calls.

uflo closes the feedback loop, steering users toward flow with: 1) explicit app-directed stimuli - at ETHWaterloo we chose classes of sound (e.g. binaural beats, classical music, nature sounds), 2) actionable analytics, and 3) prediction-driven suggestions on conditions for flow.

With these dapp flow-analytics, we privately train & apply machine learning in the TEE context of an Enigma smart contract. Our Enigma contract performs private computation in response to queries with encrypted inputs - so both the biometric data and the algorithms/intellectual property applied to it remain private.
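The trust boundary can be illustrated with a toy model of the query lifecycle. This is NOT real Enigma/TEE code: the XOR "cipher" is a stand-in for real encryption, and the function names are made up. The point it sketches is that plaintext biometrics exist only inside the enclave scope, and only an aggregate leaves it.

```python
# Toy model of a private-compute query -- illustrative only, not Enigma code.
KEY = 0x5A

def encrypt(values):
    # client side: encode and "encrypt" biometric inputs before submission
    return [int(v * 100) ^ KEY for v in values]

def secret_contract(ciphertext):
    # inside the enclave: decrypt, run the private computation, and return
    # only a derived value -- raw inputs never leave this scope
    plaintext = [(c ^ KEY) / 100 for c in ciphertext]
    in_flow = sum(1 for v in plaintext if v > 0.5)
    return {"members_in_flow": in_flow}  # aggregate only

result = secret_contract(encrypt([0.8, 0.3, 0.7]))
```

Inspecting `result` reveals how many members are in flow, but not whose ΔHb-O2 values produced that count.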

User Experience Workflow

  1. Training Phase - triggered: a) on first usage of the app, b) whenever the user seeks to ‘tune’ their private ML model. This phase collects training data [timestamp, ΔHb-O2, sound class] in response to app-applied stimuli (we chose classes of sound for this initial hack), and pulls ambient local conditions (e.g. geolocation, temperature, browsing data) and ambient global conditions (e.g. financial market data, weather).
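A training record like the one above might be assembled as follows. This is a hypothetical sketch: the field and function names are illustrative, not the app's actual schema.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TrainingSample:
    # core signal collected in response to an app-applied stimulus
    timestamp: float
    delta_hbo2: float          # ΔHb-O2 concentration from the headset
    sound_class: str           # e.g. "binaural", "classical", "nature"
    # ambient context pulled alongside the signal
    local_conditions: dict = field(default_factory=dict)   # geolocation, temperature, ...
    global_conditions: dict = field(default_factory=dict)  # market data, weather, ...

def collect_sample(delta_hbo2, sound_class, local_conditions, global_conditions):
    """Stamp a reading with the current time and its ambient context."""
    return TrainingSample(time.time(), delta_hbo2, sound_class,
                          local_conditions, global_conditions)

sample = collect_sample(0.42, "binaural",
                        {"temperature_c": 21.5},
                        {"weather": "clear"})
```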

  2. Application Phase - triggered: a) when a sustained exit from flow state occurs [or when one is predicted], the app renders queued notifications (Signal, Slack, email, etc.); b) when ‘flow induction’ is sought, stimuli (e.g. classes of sound), changes to ambient conditions (room temperature/lighting), and recommendations (e.g. reduce CoinMarketCap screen time) are used to induce flow & coordinate group flowhesion.
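The ‘sustained exit of flow’ trigger could be as simple as counting consecutive below-threshold windows. A minimal sketch - in uflo the actual trigger comes from the private ML model, and the threshold and window count here are made-up values:

```python
def sustained_flow_exit(flow_scores, threshold=0.5, windows=3):
    """Return True once `windows` consecutive flow scores fall below
    `threshold` -- the point at which queued notifications are released."""
    below = 0
    for score in flow_scores:
        below = below + 1 if score < threshold else 0
        if below >= windows:
            return True
    return False
```

A brief dip below threshold does not fire the trigger; only a sustained run of low scores does, which keeps notification delivery from flapping.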

  3. Continuously available: 1) a ‘Flow status’ dashboard, which renders analytics & predictions, e.g. ‘Are your teammates currently in flow?’, ‘At what time is peak flowhesion expected?’; 2) privately computed values obtained from an Enigma contract - for example, relative team-flow insights where inspecting the contract data for absolute biometric values is not possible. The inputs, computation, and data related to these computations are private.
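The ‘relative insights without absolute values’ idea can be illustrated outside the contract: expose only who leads and each member's share of team flow, never the underlying scores. An illustrative sketch - in uflo this computation would happen inside the Enigma contract:

```python
def relative_team_flow(scores):
    """Given private per-member flow scores, return only relative
    information: the current leader and each member's share of total
    team flow -- never the absolute biometric-derived values."""
    total = sum(scores.values())
    leader = max(scores, key=scores.get)
    shares = {name: round(s / total, 2) for name, s in scores.items()}
    return {"leader": leader, "shares": shares}
```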

How we built it

We’re using the BlueberryX, a military-developed functional near-infrared spectroscopy (fNIRS) headset. Our main data stream consists of left temporal/prefrontal cortex ΔHb-O2 concentration (an indicator of localized cortical tissue activation) over Bluetooth, plus heart rate & heart-rate variability. The hardware's stream is downsampled/denoised from a rate of 50 Hz to 5 Hz. The pipeline is: hardware -> Android app -> socket -> React front end.
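The 50 Hz → 5 Hz step above can be sketched as a block-average decimator - a minimal sketch, assuming simple window averaging; the app's actual filter may differ:

```python
def downsample(samples, factor=10):
    """Downsample by averaging non-overlapping windows of `factor` samples.

    Averaging acts as a crude low-pass/denoising filter before decimation,
    taking a 50 Hz ΔHb-O2 stream down to 5 Hz when factor=10.
    """
    return [
        sum(samples[i:i + factor]) / factor
        for i in range(0, len(samples) - factor + 1, factor)
    ]

# one second of a 50 Hz stream -> five 5 Hz samples
one_second = [0.1] * 25 + [0.3] * 25
five_hz = downsample(one_second)
```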

We use private machine learning inside an Enigma secret contract to detect which types of ambient conditions & stimuli (audio, in our case) are most effective at inducing a flow state, and to detect when a user has exited flow.

Challenges we ran into

Piping/socketing streams while constrained by hardware reliant on unmaintained Java libraries. Integrating third-party ML libraries within Enigma smart contracts. Getting acquainted with Rust on the fly.

Accomplishments that we're proud of

Our team's cohesion, openness & perseverance.

What's next for uflo:

  • High-efficacy, directed closure of the neurofeedback loop.
  • Applications: identity integration, game integration, cultivating healthier/more productive work environments, cognitive enhancement, etc.
  • In application phase:
    • Browser extension to pause/queue & render notifications (Signal, Slack, email, etc.)
    • Render opt-in ads - private ad matching for ads that minimally disrupt flow.
  • In passive phase:
    • More useful queryable Enigma contract computed fields (e.g. who has max flow, without revealing the full ranking; ‘what stimuli & ambient conditions are predicted to produce max aggregate team flow?’).
  • Integrate subjective cognition labels.
  • Integration with other biometrics (blood sugar, ketones), bio-inputs (e.g. compound dosages), and stimuli.
  • Multi-sensor hardware (fused EEG-fNIRS for higher resolution, or concurrent tDCS-fNIRS for direct brain stimulation).
  • Kiss Java goodbye.
  • Wifi direct.

  • Special thanks to John Chibuk of BlueberryX, the Enigma team for their technical support, and the volunteers of ETHWaterloo.
