Inspiration

I (Jacob) spent two hours in the library training a machine learning model. It was annoying, and my computer felt like it was going to crash. I needed a better way, and I figured that if I could send my code to my friend with a gaming computer, it would run much faster.

What it does

OffloadML lets machine learning researchers easily run their Python code on gamers' computers.

How we built it

We built a Node.js server that exposes a separate API to researchers and to gamers. We also built a TensorFlow model that, given a list of computers and their specs, predicts which computer would yield the best runtime for a given Python script. The client that actually runs the Python scripts was built with Node.js as well.
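
To make the selection step concrete, here is a minimal sketch of that idea: a small fully connected regression model that predicts runtime from machine specs plus a few script features, then picks the machine with the lowest prediction. The feature layout, layer sizes, and function names are illustrative assumptions, not our exact implementation.

```python
# Sketch only: predict runtime for each candidate machine from its specs plus
# simple script features, then pick the machine with the lowest prediction.
# Feature names and shapes are hypothetical.
import numpy as np
import tensorflow as tf

def build_runtime_model(num_features: int) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(num_features,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1),  # predicted runtime in seconds
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def pick_best_machine(model, machines, script_features):
    # machines: list of dicts like {"id": ..., "specs": [cpu_ghz, cores, ram_gb, gpu_tflops]}
    rows = np.array([m["specs"] + script_features for m in machines], dtype=np.float32)
    predicted = model.predict(rows, verbose=0).ravel()
    best = int(np.argmin(predicted))
    return machines[best]["id"], predicted
```

In practice the model would be trained on (specs, script features, observed runtime) triples collected from past runs.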

Challenges we ran into

We found it difficult to connect the three codebases that make up the application: the Node.js server, the Node.js client, and the TensorFlow model service.

Accomplishments that we're proud of

Getting the TensorFlow model hosted on an AWS EC2 instance and served through Flask.
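
A rough sketch of how a Flask service on EC2 might expose that model; the route name, request payload, and saved-model path are assumptions for illustration, not our actual API.

```python
# Sketch only: serve runtime predictions over HTTP with Flask.
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)
model = tf.keras.models.load_model("runtime_model")  # hypothetical saved model

@app.route("/recommend", methods=["POST"])
def recommend():
    payload = request.get_json()
    # One feature row per candidate machine (specs + script features).
    features = np.array(payload["features"], dtype=np.float32)
    runtimes = model.predict(features, verbose=0).ravel()
    best = int(np.argmin(runtimes))
    return jsonify({"best_machine_index": best,
                    "predicted_runtimes": runtimes.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```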

What we learned

Test the connections between the different codebases BEFORE building out each codebase separately.

What's next for OffloadML

Connecting all of the codebases so the full pipeline works end to end.

Built With

amazon-ec2, flask, node.js, python, tensorflow