Inspiration

As developers interested in pushing how technology can be integrated across platforms, we saw an opportunity to explore how the Wii Remote can control desktop applications. As avid fans of Minecraft, we decided to build around that game for this hackathon.

What it does

This application processes Wii Remote input and motion data and maps it to Minecraft actions using an API written in Java and C. After connecting the Wii Remote to a computer via Bluetooth, users can launch Minecraft with the mod we developed and start playing! At the moment, the Wii Remote can only toggle between standing idle and mining items in the Minecraft world. A separate machine learning model processes the data the Wii Remote provides (pitch, yaw, acceleration, velocity, etc.) and predicts which specific movement (mining vs. running) is being performed.
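The idle/mining toggle described above can be sketched as a tiny state machine. The names below (RemoteMapper, Action, the choice of a single toggle button) are illustrative assumptions, not the project's actual API:

```java
// Hypothetical sketch of the toggle behavior: pressing a designated button
// flips the player between standing idle and mining. Class and enum names
// are invented for illustration.
public class RemoteMapper {
    enum Action { IDLE, MINE }

    // Flip between IDLE and MINE on a button press; otherwise keep the
    // current action, so the player keeps mining until toggled again.
    static Action toggle(Action current, boolean buttonPressed) {
        if (!buttonPressed) return current;
        return current == Action.IDLE ? Action.MINE : Action.IDLE;
    }

    public static void main(String[] args) {
        Action a = Action.IDLE;
        a = toggle(a, true);   // press -> start mining
        System.out.println(a); // MINE
        a = toggle(a, false);  // no press -> keep mining
        System.out.println(a); // MINE
    }
}
```

A toggle (rather than hold-to-mine) keeps the mapping simple: the mod only needs to react to button-press events, not track button state continuously.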

How we built it

We used an API developed in Java and C to extract the information the Wii Remote provides over a Bluetooth connection. We then mapped simple commands to simple actions in Minecraft with a Java program. More complex actions were predicted by feeding the Wii Remote's data into a TensorFlow machine learning model. All of this data is then fed into a Java mod for Minecraft, which translates it into in-game actions.
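To illustrate the prediction step without the actual TensorFlow model, here is a simplified stand-in: a nearest-centroid classifier over a two-number feature vector (say, pitch variance and mean acceleration magnitude). The feature choice and centroid values are invented for illustration; the real project learns this boundary from data:

```java
// Simplified stand-in for the gesture-prediction step: classify a motion
// feature vector as "mining" or "running" by distance to a per-gesture
// centroid. All numbers here are illustrative, not trained values.
public class GestureClassifier {
    static final double[] MINING_CENTROID  = {0.8, 2.5}; // sharp, swinging motion
    static final double[] RUNNING_CENTROID = {0.2, 1.0}; // steadier motion

    // Euclidean distance between two feature vectors.
    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }

    // Label a feature vector by its nearest centroid.
    static String classify(double[] features) {
        return dist(features, MINING_CENTROID) < dist(features, RUNNING_CENTROID)
                ? "mining" : "running";
    }

    public static void main(String[] args) {
        System.out.println(classify(new double[]{0.7, 2.3})); // near mining centroid
        System.out.println(classify(new double[]{0.3, 1.1})); // near running centroid
    }
}
```

The real pipeline replaces this hand-tuned boundary with a TensorFlow model, but the interface is the same: motion features in, a predicted gesture label out, which the Java mod then maps to an in-game action.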

Challenges we ran into

Improving and testing our model proved difficult, and the pre-existing libraries did not meet the requirements we had hoped for.

Accomplishments that we're proud of

We are proud of creating a machine learning model and learning how to process data from the Wii Remote. We also take pride in the fact that we were able to send data from the Wii Remote to our Minecraft mod.

What we learned

We learned how to develop TensorFlow models from scratch and how to use Java to develop a Minecraft mod. We also learned more about extracting information from IoT devices.

What's next for Wii Love Minecraft

The next step for Wii Love Minecraft is to map the remaining movement commands to buttons on the Wii Remote, allowing full gameplay with just the remote. We also plan to improve the ML model to increase the number of gestures it can recognize, so the game can be played with more gestures and fewer button presses.
