Inspiration

Users need complete control over the data they generate while working with LLMs.

What it does

Provides a quick and easy way to interact with the LLMs that users run locally with Ollama.

How we built it

Using Ollama for local model serving and Python Flask for the web app.
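In broad strokes, the Flask app proxies each prompt to Ollama's local HTTP API and returns the model's reply. A minimal sketch, assuming a `/chat` route and the `llama3` model (both illustrative, not the project's actual code):

```python
# Minimal Flask + Ollama sketch. The /chat route name and model name
# are assumptions; the payload shape matches Ollama's /api/generate API.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Assemble the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

@app.route("/chat", methods=["POST"])
def chat():
    # Forward the user's prompt to the locally running Ollama server
    # and return the reply; nothing leaves the machine.
    prompt = request.get_json()["prompt"]
    resp = requests.post(OLLAMA_URL, json=build_payload(prompt), timeout=120)
    return jsonify({"reply": resp.json()["response"]})
```

Calling `app.run()` then serves the chat endpoint on the local dev server.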

Challenges we ran into

Accomplishments that we're proud of

Completed all the main requirements:

  1. UI Development: Design and implement a user-friendly interface using Flask/Django, ensuring intuitive navigation and accessibility.
  2. Authentication Feature: Integrate OAuth or similar authentication mechanisms to secure user access and protect sensitive data.
  3. Chat Log Feature: Implement local storage for chat logs, ensuring data encryption and compliance with privacy regulations.
  4. Weekly Email Report: Develop a scheduled task to compile and send weekly chat logs to designated email addresses, enhancing user engagement and transparency.
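The weekly email report in point 4 can be sketched with the standard library alone (the SMTP host, sender address, and subject line below are placeholder assumptions):

```python
# Sketch of the weekly report task; sender, subject, and SMTP host
# are placeholders, not the project's real configuration.
import smtplib
from email.message import EmailMessage

def build_report(logs: list, recipient: str) -> EmailMessage:
    """Bundle the week's chat log lines into one plain-text email."""
    msg = EmailMessage()
    msg["Subject"] = "Weekly chat log report"
    msg["From"] = "reports@example.com"
    msg["To"] = recipient
    msg.set_content("\n".join(logs))
    return msg

def send_report(msg: EmailMessage, host: str = "localhost") -> None:
    # A cron entry or APScheduler job firing once a week would call this
    # with the logs collected since the last run.
    with smtplib.SMTP(host) as server:
        server.send_message(msg)
```

Scheduling is then a one-line cron entry (or an in-process scheduler) that invokes the script weekly.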

What we learned

What's next for local llm with authentication

  1. Security Enhancements: Implement SSL/TLS encryption for secure data transmission and storage, safeguarding user information.
  2. User Feedback: Incorporate feedback mechanisms to iterate UI/UX design based on user preferences and usability testing.
  3. Integration Capabilities: Explore API integrations for seamless data exchange with external systems, enhancing functionality and scalability.
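The first of these points could be prototyped with Python's standard ssl module; a sketch under the assumption of self-managed certificates (in a real deployment a reverse proxy such as nginx often terminates TLS instead, and the certificate paths below are placeholders):

```python
# Sketch of the planned TLS hardening; certificate paths are placeholders.
import ssl

def make_tls_context(certfile=None, keyfile=None) -> ssl.SSLContext:
    """Build a server-side TLS context that Flask's app.run() accepts."""
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
    if certfile and keyfile:
        context.load_cert_chain(certfile, keyfile)
    return context

# Usage (illustrative):
#   app.run(ssl_context=make_tls_context("cert.pem", "key.pem"))
```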

Built With

flask, ollama, python
