Inspiration
Our app was born from a desire to empower consumers and shape the future through conscious choices. Individuals increasingly want to align their purchases with their values and contribute to the impact of collective action. CorporaTea is a simple tool for exactly that: it enables users to make informed decisions about what they buy. By bridging personal values and economic power, we aim to foster a more purposeful and responsible community and economy, putting power in purchases and driving positive change.
What it does
CorporaTea enables the everyday consumer to learn the history of the brands they buy from. By simply scanning a product's barcode, CorporaTea returns the product name, the company behind the product, and that company's history of social harms, lawsuits, settlements, and similar events, with sources listed, where such events exist and are known.
How we built it
Front-End Development:
The front end was built with the React Native framework and tested using Expo. We also used Tailwind/NativeWind to style and customize the app design.
The main libraries and key dependencies are react-native, react-native-safe-area-context, expo-router, and expo-barcode-scanner. One known issue is that expo-barcode-scanner is now deprecated; it will be replaced with expo-camera in future versions of CorporaTea.
Current file organization system:
- app (contains the main driver code)
  - tabs (contains five files for the four tabs: _layout, about-us, companycard, home, and scan)
  - search (contains productscanner, which opens the camera, scans, and returns the barcode)
- assets (contains icons, images, and fonts for styling the app)
- components (contains four files supporting the main driver code: CustomButton, PopularCompanies, StoryCard, and index)
- constants (imports the icons, images, and fonts for styling the app)
Back-End Development:
The barcode scanner uses the expo-barcode-scanner library to operate the device camera; the scanned barcode string is then passed to barcodelookup.com, which returns the product's information in JSON format.
The fields we extract from barcodelookup are {name, description, brand, manufacturer}, all of which are strings. The name and description are shown in the app, while the brand (or the manufacturer, if the brand is unavailable) is passed to the Wikipedia API to find the company's wiki page.
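As a sketch, the brand-or-manufacturer fallback can be expressed as a small helper. The function name is hypothetical, and the field names simply follow the list above; the real barcodelookup response schema may differ.

```javascript
// Hypothetical helper: pull the fields CorporaTea displays out of a
// product record returned by barcodelookup. Field names follow the
// write-up ({ name, description, brand, manufacturer }); the real API
// schema may differ.
function extractProductInfo(product) {
  const { name = "", description = "", brand = "", manufacturer = "" } = product;
  // Prefer the brand; fall back to the manufacturer when no brand is
  // listed, since this string seeds the Wikipedia search.
  const companyQuery = brand || manufacturer;
  return { name, description, companyQuery };
}
```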
Using the company information from the wiki article, combined with the cited sources at the end of the article, we feed this information into the uncensored LLM cognitivecomputations/dolphin-2.9.2-qwen2-7b, hosted on a Hugging Face Inference Endpoint, via two sequential prompts:
- First prompt: we input the wiki article and ask the LLM to summarize the controversies of the company listed there.
- Second prompt: we input the summarized list of controversies along with the references list from the wiki page, and ask the LLM to match each controversy to its source article. We then send both the summary list and the sources to the front end.
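The two-stage prompting above can be sketched as plain prompt builders. The wording and function names are illustrative, not the exact prompts used in CorporaTea.

```javascript
// Stage 1 (sketch): ask the model to distill controversies from the
// wiki article text. Prompt wording is an assumption.
function buildControversyPrompt(wikiText) {
  return (
    "Summarize, as a numbered list, the controversies of the company " +
    "described in the following Wikipedia article:\n\n" + wikiText
  );
}

// Stage 2 (sketch): ask the model to pair each controversy with a
// reference from the article's citation list, in a parseable format.
function buildSourceMatchPrompt(controversySummary, referencesList) {
  return (
    "Match each controversy below to the reference that supports it. " +
    "Output one line per controversy as `<controversy> -- <source>`.\n\n" +
    "Controversies:\n" + controversySummary +
    "\n\nReferences:\n" + referencesList
  );
}
```

An instruction-tuned model can follow the fixed output format in the second prompt, which is what makes the result easy to parse on the backend.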
For a more in-depth explanation of the AI component and its challenges, see: Link
Both the barcode API and the LLM endpoints are configured with Express.js and hosted in the cloud as Google Firebase Functions. The app makes standard axios calls to our Firebase project while it runs.
Challenges we ran into
The biggest issue we faced was integrating the Wikipedia-to-LLM pipeline into the app. LLM generation is the largest overhead in our app, taking up to 30 seconds in the worst cases. Due to financial constraints, we could also only run the LLM for brief periods to avoid charges above the free-tier limit, which complicated debugging. Should our idea go into production, we could afford faster LLMs that run 24/7.
Finding the appropriate model was challenging:
- During the research phase, it was hard to find an uncensored model willing to mention companies' controversies. Most LLMs served as APIs by OpenAI or Anthropic are fine-tuned for alignment; to protect those companies from litigation, the models avoid anything contentious.
- The context window posed another challenge. A long prompt context bloats the GPU VRAM footprint, and our Qwen2 model was fine-tuned on a 16k-token context. We used only the last 5,000 words of the wiki text to save time and memory. Moreover, scandals usually appear in the back half of a wiki article, so we wanted to drop the noise from the front half, which is generally the company's introduction.
- We chose an instruction-tuned model so we could prompt it to output a consistent format for easy parsing.
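The 5,000-word truncation described above can be sketched as follows; the helper name is hypothetical, and splitting on whitespace is an assumed word definition.

```javascript
// Keep only the last N words of the article text (the write-up uses
// 5,000) so the prompt fits the model's 16k-token context window and
// skips the introductory front half of the page.
function truncateToLastWords(text, maxWords = 5000) {
  const words = text.trim().split(/\s+/);
  if (words.length <= maxWords) return text.trim();
  return words.slice(-maxWords).join(" ");
}
```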
Accomplishments that we're proud of
We managed to get a working prototype finished and filmed before the hackathon deadline. The app also worked when tested on various household products and on products at local grocery stores. Communication among all members remained effective throughout the project.
What we learned
Our team of three, comprising university students and recent graduates, gained valuable experience in many aspects of app development:
- We mastered the React-Native framework and its ecosystem, including common libraries and components.
- We successfully integrated front-end and back-end elements.
- We developed and integrated an LLM to generate data for front-end display.
- We learned to work effectively as a team, dividing tasks based on our strengths and supporting each other through challenges.
- We faced and resolved numerous technical difficulties, particularly in harmonizing the APIs and LLM, as well as optimizing backend performance.
What's next for CorporaTea
We would like to incorporate more features into the app, including nutritional and eco-friendliness scores, so users can compare products in the supermarket. This would make the app more functional and complete for our purpose of supporting ethical consumption.
Built With
- expo.io
- express.js
- firebase
- huggingfaceinferenceendpoint
- javascript
- react-native
- tailwind/nativewind