Inspiration
Need to give users complete control over the data they generate while working with LLMs.
What it does
Provides a quick and easy way to interact with the LLMs that users run locally using Ollama.
How we built it
Using Ollama and Python Flask.
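The core idea can be sketched as a small helper that forwards a prompt to the local Ollama server's `/api/generate` endpoint (the default port 11434 is Ollama's standard; the model name "llama3" is a placeholder, not necessarily the one we used):

```python
# Minimal sketch: forward a prompt to a locally running Ollama server.
# Assumes Ollama is listening on its default port 11434.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A Flask route would simply call `ask_ollama` on the user's message and return the result, so the conversation never leaves the machine.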
Challenges we ran into
Accomplishments that we're proud of
Completed all the main requirements:
- UI Development: Design and implement a user-friendly interface using Flask/Django, ensuring intuitive navigation and accessibility.
- Authentication Feature: Integrate OAuth or similar authentication mechanisms to secure user access and protect sensitive data.
- Chat Log Feature: Implement local storage for chat logs, ensuring data encryption and compliance with privacy regulations.
- Weekly Email Report: Develop a scheduled task to compile and send weekly chat logs to designated email addresses, enhancing user engagement and transparency.
What we learned
What's next for local llm with authentication
- Security Enhancements: Implement SSL/TLS encryption for secure data transmission and storage, safeguarding user information.
- User Feedback: Incorporate feedback mechanisms to iterate UI/UX design based on user preferences and usability testing.
- Integration Capabilities: Explore API integrations for seamless data exchange with external systems, enhancing functionality and scalability.
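For the planned TLS work, one possible starting point is a server-side SSL context with a modern minimum protocol version; Flask's `app.run(ssl_context=...)` accepts a context like this. The cert/key file names are placeholders:

```python
# Hedged sketch of the planned TLS hardening: a server-side SSL
# context that refuses anything below TLS 1.2.
import ssl

def make_tls_context(certfile: str = "cert.pem", keyfile: str = "key.pem") -> ssl.SSLContext:
    """Create a server-side TLS context for serving the Flask app over HTTPS."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    # ctx.load_cert_chain(certfile, keyfile)  # uncomment once real certs exist
    return ctx
```

In production the app would more likely sit behind a TLS-terminating reverse proxy such as nginx rather than serving certificates itself.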
Built With
- flask
- ollama