All three of us are passionate about machine learning and IoT (Internet of Things) devices. At the intersection of the two, we discovered TinyML and its use cases: it can be applied in many situations, from the current pandemic to agriculture and beyond, and its flexibility comes from not depending on an internet connection. Current solutions that try to bridge the gap between TinyML and IoT devices are inefficient, cumbersome, and slow, which poses a strong risk of lower adoption. That's when it hit us: we could build a better pipeline that bridges these two things.
What it does
Project Metalink is a platform for rapidly developing and deploying TinyML models on microcontrollers and mini-computers. It's a simple four-step procedure:
1. The user logs into the client area with a Discord account to obtain an API key.
2. Using that key with our CLI (Command Line Interface), they insert data points; the website's client area shows the data points being recorded in real time, providing a visual view.
3. They generate a model from the recorded data.
4. A single command deploys the model to their IoT device.
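The data-insertion step above can be sketched as a JSON-over-HTTP exchange. Everything here is illustrative, not the real Metalink API: the endpoint path, payload shape, and API key are assumptions, and a stdlib stub server stands in for the actual backend so the sketch runs locally.

```python
# Sketch of the CLI-to-backend flow, assuming a JSON-over-HTTP API.
# All endpoint names, payloads, and keys are hypothetical; a stdlib
# stub server stands in for the real Metalink backend.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

RECORDED = []  # datapoints the stub backend has received

class StubHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Reject requests that lack the API key obtained at login.
        if self.headers.get("Authorization") != "Bearer demo-key":
            self.send_response(401)
            self.end_headers()
            return
        body = self.rfile.read(int(self.headers["Content-Length"]))
        RECORDED.append(json.loads(body))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status": "recorded"}')

    def log_message(self, *args):  # keep the sketch's output quiet
        pass

def send_datapoint(base_url, api_key, features, label):
    """Post one labelled datapoint, as the CLI might after login."""
    req = urllib.request.Request(
        f"{base_url}/api/datapoints",
        data=json.dumps({"features": features, "label": label}).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Run the stub backend on a free local port and send one datapoint.
server = ThreadingHTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"
reply = send_datapoint(base, "demo-key", [21.5, 0.4], "ok")
server.shutdown()
```

The real CLI wraps this kind of exchange in a single command, which is why the whole procedure stays a short four steps for the user.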
How we built it
This project brings together several languages and frameworks. The frontend is built with Node.js using the Express.js and Bootstrap frameworks along with the Jade/Pug view engine. The backend is built with Python and Flask.
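A backend route in this stack might look like the following minimal Flask sketch. The route path and payload shape are assumptions for illustration, not Metalink's actual API, and the in-memory list stands in for real storage.

```python
# Minimal Flask backend sketch; the endpoint and payload are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
DATAPOINTS = []  # in-memory store; a real backend would persist these

@app.route("/api/datapoints", methods=["POST"])
def add_datapoint():
    # Accept one labelled datapoint from the CLI and record it.
    point = request.get_json(force=True)
    DATAPOINTS.append(point)
    return jsonify(status="recorded", count=len(DATAPOINTS))
```

The frontend's real-time view can then poll or stream from a companion endpoint that reads the same store.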
Challenges we ran into
We ran into many challenges and had to improvise rapidly because of time constraints. The first was the login system: we initially planned an email/password signup and login flow, but the feature proved too unstable to be viable, so we switched to Discord's OAuth, which after some tweaking and testing worked perfectly for our system. Building a pipeline that transfers models to edge devices through a cloud storage service, and then runs predictions on the available test data, also took some figuring out; we settled on serializing models to pickle files and uploading them to Cloudinary.
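The pickle-based transfer described above can be sketched as follows. A trivial threshold "model" stands in for whatever Metalink actually trains, and the Cloudinary upload is only indicated in a comment so the sketch stays runnable offline.

```python
# Sketch of the pickle-based model transfer. The model class is a toy
# stand-in; the Cloudinary upload step is indicated only as a comment.
import pickle

class ThresholdModel:
    """Toy model: predicts 1 when the reading exceeds a threshold."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, x):
        return 1 if x > self.threshold else 0

# Server side: train, then serialize to a .pkl file for the edge device.
model = ThresholdModel(threshold=25.0)
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)
# The upload would happen here, e.g. via the Cloudinary SDK's raw upload:
# cloudinary.uploader.upload("model.pkl", resource_type="raw")

# Edge-device side: fetch the .pkl, load it, and run predictions.
with open("model.pkl", "rb") as f:
    deployed = pickle.load(f)
prediction = deployed.predict(30.0)
```

One caveat of this approach: pickle requires the same class definitions on both ends, so the edge device needs the model code (or a shared library) available when it unpickles.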
Accomplishments that we're proud of
We are very proud to have built a self-sufficient system that produces accurate models and is easy to use.
What we learned
It never hurts to scale back parts of a project as long as the intended goal is reached and you ship a complete product. We had to make sacrifices in certain areas, and doing so gave us a deeper understanding of, and appreciation for, the project and its prerequisites.
What's next for Project Metalink
Project Metalink is not a one-and-done project; we have a detailed roadmap for the future. Our guiding tenet is to keep adding features while following the motto of “No buzzwords, only results.” The biggest feature planned for the coming weeks is support for k-classification models, and given how short the development window was, there is plenty more we hope to add.