Inspiration
I've been getting into day-trading cryptocurrencies recently. While this can occasionally be rewarding, one downside is that I have to constantly stare at the computer screen, keeping my eyes on real-time price fluctuations. This is quite inconvenient and takes a lot of unnecessary time out of my day, and I'm sure many other individuals who day-trade stocks and cryptocurrencies face the same issue.
Fortunately, I've developed a pretty neat solution to this problem by combining computer science with some cutting-edge neuroscience. I recently received an interesting device called the Neosensory Buzz. The Buzz, developed by neuroscientists David Eagleman and Scott Novich, helps deaf individuals perceive sound as vibrations through non-invasive haptic feedback on the wrist. The idea is that, over time, the brain adapts to the signals from the haptic feedback device (through a phenomenon called neuroplasticity) and integrates this input as a sixth sense. Luckily, Neosensory also provides a Python SDK for reprogramming the Buzz. This means I can effectively take any data stream out there and map it onto vibrations the wearer can perceive.
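To give a flavor of what that reprogramming looks like, here's a minimal sketch of connecting to the Buzz and firing its four motors. The method names (`NeoDevice`, `request_developer_authorization`, `vibrate_motors`, etc.) are how I recall the experimental SDK's API; since the SDK is unsupported and still changing, treat them as assumptions rather than a definitive reference.

```python
import asyncio

from bleak import BleakClient
from neosensory_python import NeoDevice

BUZZ_ADDRESS = "XX:XX:XX:XX:XX:XX"  # replace with your Buzz's BLE address


async def main():
    async with BleakClient(BUZZ_ADDRESS) as client:
        buzz = NeoDevice(client)
        # Take manual control of the device (developer mode). These
        # calls are from the experimental SDK and may change.
        await buzz.request_developer_authorization()
        await buzz.accept_developer_api_terms()
        await buzz.pause_device_algorithm()
        # A frame is four intensities (0-255), one per motor.
        await buzz.vibrate_motors([255, 0, 0, 0])
        await asyncio.sleep(0.5)
        await buzz.vibrate_motors([0, 0, 0, 0])  # stop all motors


asyncio.run(main())
```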
After a couple of days of hard work, I was able to develop cryptoSense, a Raspberry Pi Zero W-based device that lets its user feel changes in cryptocurrency (Bitcoin) prices in real time through non-invasive haptic feedback via the Neosensory Buzz.
What it does
CryptoSense empowers its user with a sixth sense: the ability to feel changes in Bitcoin prices in real time through non-invasive haptic feedback. The user can feel not only the direction of each price change, through a distinct system of vibration patterns, but also its magnitude (e.g., how many dollars the price moved up or down) through the varying intensity of those patterns; a sketch of this mapping follows below. This eliminates the need for the user (a day-trader) to constantly watch cryptocurrency prices for hours at a time; instead, information about price changes arrives through a different sensory channel. It also lets the day-trader catch key opportunities in the market even when they aren't actively staring at the price charts, which is honestly quite incredible.
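Here's a minimal sketch of how a price change could be turned into direction-plus-magnitude vibrations. The specific thresholds and sweep patterns below are illustrative assumptions, not the exact scheme cryptoSense uses:

```python
def delta_to_frame(delta_usd, max_delta=50.0):
    """Map a BTC price change (in USD) to a sequence of 4-motor frames.

    Direction -> pattern: an upward move sweeps the motors left to
    right, a downward move sweeps right to left.
    Magnitude -> intensity: the absolute change is scaled linearly
    into 0-255 and clipped at max_delta.
    (Illustrative scheme; cryptoSense's actual patterns may differ.)
    """
    intensity = int(min(abs(delta_usd) / max_delta, 1.0) * 255)
    if delta_usd >= 0:
        return [[intensity, 0, 0, 0], [0, intensity, 0, 0],
                [0, 0, intensity, 0], [0, 0, 0, intensity]]
    return [[0, 0, 0, intensity], [0, 0, intensity, 0],
            [0, intensity, 0, 0], [intensity, 0, 0, 0]]
```

For example, `delta_to_frame(12.5)` produces a left-to-right sweep at intensity 63, while `delta_to_frame(-100.0)` produces a right-to-left sweep at the full 255.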
How I built it
The software for cryptoSense was developed in the Thonny Python IDE using Neosensory's experimental Python SDK. The SDK currently has a few bugs and isn't officially supported by Neosensory, but I was able to work around them and build a fully functional project. The software itself is deployable on any Windows, macOS, or Linux computer with Bluetooth capabilities. I deployed it on a super tiny Raspberry Pi Zero W to demonstrate the versatility of the application (and because it's a cute little computer). A sketch of the main loop follows below; more details on how the software works can be found in the demo video.
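Conceptually, the main loop is just "poll price, compute delta, vibrate." Here's a minimal sketch that ties the earlier pieces together, assuming CoinGecko's public price endpoint as the data source (I'm not claiming it's the exact feed cryptoSense uses) and reusing the hypothetical `delta_to_frame` mapping from the sketch above:

```python
import time

import requests

PRICE_URL = ("https://api.coingecko.com/api/v3/simple/price"
             "?ids=bitcoin&vs_currencies=usd")


def get_btc_price():
    """Fetch the current BTC/USD price from CoinGecko's public API."""
    resp = requests.get(PRICE_URL, timeout=10)
    resp.raise_for_status()
    return float(resp.json()["bitcoin"]["usd"])


def run(poll_seconds=5):
    """Poll the price and turn each change into vibration frames."""
    last_price = get_btc_price()
    while True:
        time.sleep(poll_seconds)
        price = get_btc_price()
        delta = price - last_price
        last_price = price
        if delta == 0:
            continue
        # delta_to_frame is the mapping sketch shown earlier.
        for frame in delta_to_frame(delta):
            # In cryptoSense the frame would be sent to the Buzz via
            # the SDK (see the connection sketch); here we just print.
            print(f"delta={delta:+.2f} -> frame={frame}")
            time.sleep(0.1)


if __name__ == "__main__":
    run()
```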
Challenges I ran into
The Neosensory Buzz SDK is still experimental and isn't officially supported by the company yet, so working around its bugs was tricky at times. Additionally, designing a haptic feedback scheme that's easy to interpret was difficult at first, but with some brainstorming, planning, and testing, I think the end product works really well.
Accomplishments that I'm proud of
At a ton of hackathons I've been to recently, everyone (myself included) has been building the same five types of web apps or machine learning projects. I'm really glad I was able to think outside the box this time around and come up with something I think is genuinely creative. I'm also glad I was able to finish this project in time!
What I learned
I learned a lot about processing raw data and presenting it to a user in a way that makes sense. In particular, I was able to explore how to convey this data through haptic feedback, which can be surprisingly tricky at times.
What's next for cryptoSense
I think cryptoSense can be taken to the next level with some machine learning or AI integration. For example, a pattern-recognition model could watch price movements and relay its output to the user through haptic feedback. There are tons of possibilities, and I'm hoping to implement some of them soon.
Important Note
This hackathon project was built for educational purposes only. Any opinions presented here are not in any way meant to guide your financial decisions.
Built With
- neosensory-buzz
- python
- raspberry-pi