Inspiration
Each year, wasted resources drain over 8 trillion dollars from the US manufacturing industry alone. That is roughly one-third of the United States' current GDP, and large enough on its own to rank as the third-highest GDP in the world. Minimizing this waste has the potential to benefit not only the broader economy but also countless smaller-scale plants across the country.
Over the past year, one of our team members worked as a data analyst through the Purdue Corporate Partners program, aiming to minimize a manufacturing plant's scrap through data analysis and machine-learning prediction. Building on that project, our team realized we could apply the same idea at a larger scale. We hope to create a customized solution to optimize plants across the nation.
What it does
SweepAI is a data analytics and visualization platform that lets manufacturing companies upload their factories' datasets. Datasets of any shape, size, or form are accepted; we parse them and build a visual representation of the key statistics. We then analyze potential problems in the manufacturing process and provide insightful, actionable solutions for fixing them. Finally, given the configuration of the hardware at the factory, we suggest the optimal settings for the automation to minimize scrap.
How we built it
We built the frontend landing page with Dora AI, which generated a template for the website; we then added captivating visuals, descriptions, and a link to our platform. The platform itself runs on Streamlit, which let us create interactive graphs, seamless file uploads, and clean text outputs, and is hosted on Streamlit Cloud. The backend of our project involves four main steps. First, we parse the uploaded dataset and convert it into a text-string format while preserving its general structure. Second, we send that text to a GPT-4 model, which processes the dataset regardless of its structure, weighs each column by its correlation to scrap, and produces actionable steps for reducing scrap production in the factory. Third, we use statistical models to create interactive graphs on a dashboard that help visualize the data. Finally, we use GPT-4 to find the optimal value for each column so the hardware in the factory can be configured correctly.
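The first backend step, turning an uploaded file into a structure-preserving text block that can be sent to GPT-4, can be sketched roughly as below. This is a minimal illustration, not our exact code; the function name and row limit are hypothetical, and in practice the resulting string is embedded in a larger prompt.

```python
import io
import pandas as pd

def dataset_to_prompt(csv_bytes: bytes, max_rows: int = 50) -> str:
    """Convert an uploaded CSV into a text block that preserves its
    tabular structure, suitable for inclusion in a GPT-4 prompt."""
    df = pd.read_csv(io.BytesIO(csv_bytes))
    sample = df.head(max_rows)          # cap rows to stay within context limits
    header = ", ".join(df.columns)
    body = sample.to_csv(index=False)   # keep the general tabular structure
    return (
        f"The factory dataset has columns: {header}.\n"
        f"Here are the first {len(sample)} rows:\n{body}"
    )

# Tiny synthetic example of an uploaded dataset
raw = b"temperature,pressure,scrap\n210,30,5\n215,31,7\n220,33,12\n"
prompt = dataset_to_prompt(raw)
```

Because the text retains column names and row structure, the model can reason about the dataset without us hard-coding any schema.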
Challenges we ran into
The main problem we encountered was figuring out the correct prompt to give GPT to get the desired results. Since we were dealing with all sorts of datasets, we struggled to find a general prompt for GPT to adhere to. Additionally, prompting GPT so that its results were not only accurate but also consistent proved difficult. Another issue we ran into was deciding how to identify the leading factors behind scrap. After discussion and brainstorming, we settled on analyzing the correlation coefficient between waste and the other variables.
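The correlation-based ranking we settled on can be sketched with pandas as follows. The column names and values here are hypothetical stand-ins for a real factory dataset:

```python
import pandas as pd

# Hypothetical factory data: process parameters plus scrap count per run
df = pd.DataFrame({
    "temperature": [210, 215, 220, 225, 230],
    "pressure":    [30, 31, 33, 32, 34],
    "line_speed":  [5, 5, 6, 6, 7],
    "scrap":       [5, 7, 12, 11, 15],
})

# Pearson correlation of every column with scrap, ranked by strength
corr = df.corr(numeric_only=True)["scrap"].drop("scrap")
ranked = corr.abs().sort_values(ascending=False)
```

Columns at the top of `ranked` are the strongest candidate drivers of scrap and become the focus of the GPT-generated recommendations.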
Accomplishments that we're proud of
We're proud that our platform can take in any type of dataset and perform in-depth analysis. We were also able to identify deficient products, an insight that will be useful on the next manufacturing line.
What we learned
Front-end: Throughout the development of Sweep, we gained valuable insights into user interface design and user experience with a variety of front-end tools. Crafting an intuitive and visually appealing frontend was crucial for attracting manufacturing companies and facilitating their interaction with the platform. We learned how to leverage tools like Dora AI to streamline the frontend development process and create a compelling landing page. Additionally, we honed our skills in incorporating captivating visuals and clear descriptions to communicate the platform's functionalities effectively.
Back-end: On the backend, our journey involved overcoming various challenges associated with data processing, machine learning model integration, and statistical analysis. Parsing diverse datasets and ensuring their compatibility with our algorithms demanded innovative approaches and robust data handling techniques. Integrating GPT-4 models into our backend workflow required meticulous experimentation to optimize prompt generation and result interpretation. Moreover, developing interactive dashboards for data visualization involved mastering statistical modeling techniques and leveraging frameworks to create dynamic and informative graphical representations.
Overall, our experience in developing Sweep's backend provided us with a deeper understanding of data analytics, machine learning, and automation technologies. We learned how to harness the power of AI models to derive actionable insights from complex datasets, empowering manufacturing companies to enhance their operational efficiency and minimize waste.
How to Use
You can learn more about the business idea on the Dora AI website, which links to the Streamlit app (the solution) that hosts the backend logic.
What's next for Sweep
We are actively seeking pre-seed investment to fuel the growth and development of our venture, Sweep. As we embark on this journey, we recognize the vital role that early-stage funding plays in bringing our ideas to fruition.
With a focus on laying the groundwork for success, we are eager to collaborate with investors who share our vision and passion for creating impactful solutions. This pre-seed stage represents a crucial opportunity for us to establish a solid foundation and propel our venture towards achieving its full potential. Please reach out to simrith.ranjan@gmail.com with serious inquiries only!