Project Team

Evan Oskierko-Jeznacki
Christina Kim
Jiaang Hu
Advised by Dr. Nalaka Gooneratne and Wenjie Wei

Baseline Goals

Our baseline goals were organized by input, output, processing, and interfacing components. We achieved each of our baseline goals, which were as follows:

Input
Signal capture was the first task at hand. The primary concern for the baseline was to capture an ECG (or EKG) signal, hand movement, and breathing rate. Baseline 1 was completely met; however, the ECG signal was noisy, which caused the corresponding reach goal to change. Additionally, parts for this baseline were delayed, forcing us to alter its final implementation after the baseline demo.

Output
For the baseline, the output goal was to trigger a haptic stimulus, using a vibration motor, that corresponded to the device wearer's heart rate. We achieved this on each hand. The vibration cue indicates the current heart rate in bpm, per the client's wishes.

Processing
The baseline processing component focused on accurately discerning at least a 10% differential in heart and respiratory rates, measured in beats per minute (bpm). Baseline 2 was achieved for the most part: the algorithm was capable of interpreting the signals, but could not accurately measure bpm in real time or by stage of meditation.
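As an illustration of the 10% criterion, a simple relative-difference check suffices; the rate values below are made up for the example:

```python
# Illustrative check for the baseline criterion: flag a change when the
# current rate differs from a reference rate by at least 10%.
def rate_changed(current_bpm, reference_bpm, threshold=0.10):
    return abs(current_bpm - reference_bpm) / reference_bpm >= threshold

print(rate_changed(63, 72))  # 12.5% below baseline -> True
print(rate_changed(70, 72))  # ~2.8% below baseline -> False
```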

Interface
An interface to view the data in real time was provided, but a method to see the bpm or meditation stages was completed only as part of the intended reach goals. While the ability to display the signal in real time after processing was useful, the team chose to prioritize other aspects of the project. Given that the intention for the reach goal was to export this data for post-processing and diagnostic purposes anyway, our advisor agreed.

Reach Goals

Input
It was our intention to double the number of input data points, both to increase accuracy and to allow for comparison between signals to identify patterns that might provide additional insight into the physiological behavior of the wearer with respect to mindfulness. We implemented a second respiratory belt, which was critical because, as the wearer takes a breath, the chest and abdomen expand at different rates and to different diameters. This is clear in the raw data capture (see image below: the bottom two graphs correspond to the two respiratory rates, with graph four showing the chest and graph three showing the torso at smaller amplitudes). Furthermore, following our consultations with Dr. Gooneratne, it was decided that, of the possible avenues for discerning agitation, head movement might actually be more significant than hand movement. Thus, we added an additional accelerometer to the headset device to capture head movement. This reach goal was mostly met; however, the ECG signal was noisy and too weak to be read by our algorithm, necessitating a new PCB design.

[Image: raw data capture (20181205-122216)]

Processing
We struggled to accurately process both ECG and respiratory rates because of the significant amount of noise on the channels. Thus, we aimed to implement a preliminary signal pre-processing phase to filter out as much noise as possible, which we did successfully. Furthermore, to achieve true real-time processing we needed to implement a parallel processing methodology on the Raspberry Pi: one process captured the incoming signals and saved the rate values to shared memory, while another process analyzed them, determined the mindfulness state, and provided feedback to the wearer.

Output
In consultation with Dr. Gooneratne and Dr. Wei, we decided it was important to implement a two-fold approach to providing negative-feedback stimuli to the wearer. The haptic feedback implemented for the baseline is thus used to indicate to the wearer their own heart rate, which can be difficult to discern without the aid of instrumentation. Additionally, to provide reinforcement for adjusting one's own mindfulness state, we implemented a headset that provides the wearer visual feedback as well (see image below). We noticed, however, that the PWM signal used to modulate the haptic vibration motor caused significant noise on the highly sensitive, low-voltage (mV) respiratory and ECG signal channels; in the future, this noise could be eliminated by more adequately isolating the transmission lines.

[Image: headset with visual feedback (Group-17-Mindfulness-Monitor-3)]

The headset displays a color field modulated in real time in response to the various rate changes, to either increase or decrease breathing and heart rates. The color field varies in temperature from 2000 to 7000 Kelvin: hotter color temperatures appear more blue, which literature indicates can be used to excite or motivate; conversely, cooler color temperatures (more orange) are used to calm or soothe the wearer if their rates are above where they should be at a given stage in the meditation process. We initially proposed an auditory feedback cue as well; however, in consultation with Dr. Gooneratne, it was decided that this would have the least significant effect on the wearer and might, in fact, be distracting during meditation.
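As a rough sketch of that color mapping, one could linearly interpolate between a warm and a cool RGB endpoint across the 2000-7000 K range. The endpoint colors here are illustrative approximations, not the blackbody-accurate conversion a production implementation might use:

```python
# Map a 2000-7000 K "temperature" to an RGB color by linear interpolation
# between a warm orange and a cool blue endpoint. Endpoint colors are
# illustrative approximations only.
WARM = (255, 140, 40)    # approx. 2000 K: calming orange
COOL = (170, 200, 255)   # approx. 7000 K: activating blue

def kelvin_to_rgb(kelvin):
    # Clamp to the supported range, then normalize to [0, 1].
    t = (min(max(kelvin, 2000), 7000) - 2000) / 5000.0
    return tuple(round(w + t * (c - w)) for w, c in zip(WARM, COOL))

print(kelvin_to_rgb(2000))  # warmest endpoint -> (255, 140, 40)
print(kelvin_to_rgb(7000))  # coolest endpoint -> (170, 200, 255)
```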

While the visual feedback component (implemented in the headset) represents the primary interface for the wearer during meditation, we incorporated the ability to export data from the duration of the session as well. The team decided early on that the ability to use the device as both a meditation tool and a diagnostic tool would help provide more insight into the relationship between the wearer's physiologic signals and mindfulness state. This relationship is only speculated; a formal link is not well understood. It was our intention with this project to help elucidate this aspect of meditation by exporting the calculated rate data, and a plot of that data, per mindfulness state.

Problem Identification and Understanding

How can we identify progressively deeper levels of mindfulness (the relative shift in the mindfulness/relaxation of a patient), as indicated by a series of physiological signals compared to the wearer's baseline; produce a stimulus or stimuli to notify the wearer; and introduce reinforcing feedback to elicit a response?

Mindfulness meditation is a mental technique that helps patients focus on the present and correspondingly reduces stress levels. Primary physiologic characteristics include decreased heart rate and respiratory rate. We have also chosen to include cumulative head and hand movement as indicators of agitation.

Because meditation is an inherently personal experience, the challenges faced by the team, in all aspects, were not insignificant. Fundamental to the project is the ability to relate specific changes in specific physiologic parameters to mindfulness states, which, in and of themselves, are not statically defined. First, there are many types of meditation, and many types of mindfulness meditation as well. Fortunately, Dr. Gooneratne provided us with a template for what he hypothesized should be the relationships among variables in each mindfulness state, which we used as the foundation for our device (below).
[Image: hypothesized mindfulness-state relationships (Group-17-Mindfulness-Monitor-2)]

Overall system architecture

[Image: overall system architecture (Group-17-Mindfulness-Monitor)]

Hardware effort

Design of the device, from the system level down to the part level, was completed. A circuit/PCB was designed for retrieving data, including an alternative to the EKG breakout board, since the original EKG circuit was too sensitive. This allowed one version to be packaged nicely, with modular pins for all signal inputs.

An ADC was used to convert the incoming analog signals for input into the Raspberry Pi for processing. It was mounted on its own breadboard to isolate it from the ECG circuit. Once the heart rate was established, PWM signals were generated to power the haptic vibration motors located in each hand. Care was taken to isolate each buzzer from the accelerometer also located at each hand. To optimize ADC channels and processing, only the z value was used from the accelerometers (indicating movement perpendicular to the sensor, mounted flat against the top of the hand). This was deemed adequate to characterize general agitation. Furthermore, the z values from all of the sensors were read cumulatively as a metric for agitation; that is, we were less concerned with exactly where movement was occurring (left hand, right hand, or head) than with its magnitude. Lastly, instead of using a series of RGB LEDs and an LED driver, we chose an off-the-shelf FPV drone-racing headset with an LCD screen as the hardware for the visual feedback, which worked quite well: it had adequate resolution and its own onboard rechargeable battery.

[Image: hardware assembly (Group-17-Mindfulness-Monitor-1)]
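The cumulative agitation metric can be sketched as a simple sum of absolute z-axis deviations across the three sensors. The resting value and sample readings below are hypothetical:

```python
# Cumulative agitation: sum of absolute z-axis deviations from rest across
# the left-hand, right-hand, and head accelerometers. The location of the
# movement is deliberately ignored; only its total magnitude matters.
def agitation(z_left, z_right, z_head, rest=1.0):
    """Sum |z - rest| across all three sensors; rest is ~1 g when flat."""
    return sum(abs(z - rest) for z in (z_left, z_right, z_head))

print(agitation(1.0, 1.0, 1.0))  # perfectly still -> 0.0
print(agitation(1.3, 0.8, 1.1))  # mild movement across all three sensors
```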

Software effort

Designed and implemented the basic structure for reading and processing the signals
Designed a two-stage filter for signal processing, counting the peaks within each interval
Designed and implemented the visual system, which takes the goal-state value as input and runs in a separate process with shared-memory communication
Implemented the treatment conditions to follow, and designed the feedback system correlated to the visual screen and buzzer
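The two-stage filter-and-count step can be sketched as follows, assuming SciPy is available; the sample rate, band edges, and peak spacing below are illustrative, not our exact parameters:

```python
# Stage 1: band-pass filter the raw channel. Stage 2: count peaks in the
# interval and convert the count to a per-minute rate. All constants here
# are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 250  # assumed ADC sample rate, Hz

def bpm_from_signal(raw, lo=0.7, hi=3.5, fs=FS):
    b, a = butter(2, [lo, hi], btype="band", fs=fs)  # stage 1: band-pass
    clean = filtfilt(b, a, raw)                      # zero-phase filtering
    peaks, _ = find_peaks(clean, distance=fs * 0.4)  # stage 2: peak picking
    return 60.0 * len(peaks) / (len(raw) / fs)       # peaks/s -> bpm

# Synthetic 10 s "heartbeat" at 1.2 Hz (72 bpm) plus noise, for testing:
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
raw = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
rate = bpm_from_signal(raw)
print(round(rate))
```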

Other effort

A significant portion of our time was spent in constant coordination and consultation with our advisor and 'client' for this project, Dr. Nalaka Gooneratne, whose motivation inspired it. Scheduled weekly meetings helped clarify project goals from theoretical, clinical, and mechanical perspectives. While the project's focus and intended outcome were always clear, the path was ever-dynamic, to ensure a sensible and valid approach.

System evaluation

One of the most troublesome challenges the team encountered during development was the inability to accurately test the device on someone. During the early stages, when the hardware component was still in development, a bottleneck formed at the software testing phase due to the lack of a real-time raw signal. Thus, we developed a method using two function generators to simulate separate respiratory and heart-rate signals into the device (see image below). The full testing demo can be viewed here.

[Image: function-generator testing setup (testing-demo)]

Awesomeness

Currently, there is no reliable method for distinguishing, in real time, the various states of mindfulness unique to a particular person. Research has suggested that the real-time variability of certain physiological factors, including respiratory and heart rates, can be an indicator of these states. However, given the inherently black-box system presented by a person's unique physiologic responses, there is a need for a device to test how reliably these physiological factors indicate these states, especially with the addition of a stimulus.

Possible Improvements

Higher quality components, specifically the operational amplifiers used, would resolve much of the headache on the hardware side with respect to capturing, filtering, and transmitting the weak and erratic ECG signal.

In the future, the agitation metric we calculated (cumulative movement of the hands and head) could be used to help process the ECG data. ECG data collected during periods of increased agitation could be dismissed in a preliminary stage, removing much of the work the software filtering stages have to do.
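That gating step could be sketched as below; the threshold, window contents, and agitation scores are all hypothetical:

```python
# Discard ECG windows recorded during high agitation before any further
# filtering. The threshold and sample data are hypothetical.
AGITATION_THRESHOLD = 0.5  # cumulative |z| deviation, in g

def usable_windows(ecg_windows, agitation_scores, thresh=AGITATION_THRESHOLD):
    """Keep only the ECG windows whose agitation score is below thresh."""
    return [w for w, a in zip(ecg_windows, agitation_scores) if a < thresh]

windows = [[0.1, 0.2], [0.9, 1.1], [0.1, 0.3]]
scores = [0.1, 0.8, 0.2]                # middle window recorded during movement
print(usable_windows(windows, scores))  # drops the second window
```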

While using an off-the-shelf FPV headset helps minimize cost and complexity, it presents the following challenges: it is uncomfortable (heavy, cumbersome) and can interfere with the meditation process; furthermore, device-users who wear glasses will have to be specially accommodated.

Implementing this device using a Raspberry Pi Zero would drastically reduce size, weight, footprint, and power consumption, allowing it to be belt or chest mounted, or even be embedded within the headset itself.

To ensure accurate ECG signals with reduced noise, the wearer should obviously be seated or lying down, and stationary; however, they also need to be well grounded. We observed that when the wearer was not grounded (the ground electrode for the ECG circuit is located at the ankle), the circuit was significantly more prone to noise.

Progressive blog posts chronicling the early and mid-stage development of this device can also be found at my website.

Video

https://youtu.be/syIf_MEi_hI

Github

https://github.com/EvanOJ/ESE-519/tree/data/Final%20Project
