Inspiration

Pedagogue.ai, named for the Greek word for "teacher," began as a simple grading and lesson-planning tool for K-12 teachers. It evolved into the Living Books project, a dynamic platform curating customized learning paths. This transformation was driven by the need to bridge the gap between static reading materials and interactive learning experiences.

What it does

Pedagogue.ai's Living Books project offers a database of AI-enhanced web books that provide an interactive reading experience. These books continuously update their content and engage readers by conducting research, answering questions, and ensuring comprehensive understanding, making each book a living entity that grows and adapts to the reader's needs.

How we built it

We built the platform on two AI models, Snowflake Arctic and GPT-4, behind a user-friendly Streamlit interface. Snowflake Arctic handles code-related and SQL questions, while GPT-4 handles general inquiries and research. Our first book, "Aspiring Developers," was shaped by community feedback gathered through our newsletter, Global Tongues, by Oblack Technologies, exemplifying our commitment to adaptive, supportive learning resources.

Challenges we ran into

One significant challenge was creating two custom memories that inherit from StreamlitChatMessageHistory and making them function as one, with long-term storage in MongoDB Atlas. We extended StreamlitChatMessageHistory into MongoStreamlitChatMessageHistory for persistent storage in MongoDB and implemented CustomSnowflakesMemory to manage chat history; getting these components to work together seamlessly took substantial effort and debugging.
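The combining pattern can be sketched as follows. This is a simplified stand-in, not the project's actual code: the class and method names mirror the ones described above, but a plain dict replaces MongoDB Atlas and a minimal base class replaces StreamlitChatMessageHistory.

```python
from dataclasses import dataclass

@dataclass
class Message:
    role: str      # "user" or "ai"
    content: str

class ChatMessageHistory:
    """Minimal base history, standing in for StreamlitChatMessageHistory."""
    def __init__(self):
        self.messages = []

    def add_message(self, role, content):
        self.messages.append(Message(role, content))

class PersistentChatHistory(ChatMessageHistory):
    """Stand-in for MongoStreamlitChatMessageHistory: mirrors every message
    into an external store (a dict here; MongoDB Atlas in the real app)."""
    def __init__(self, store, session_id):
        super().__init__()
        self.store = store
        self.session_id = session_id
        # Rehydrate any messages persisted for this session.
        self.messages = [Message(*m) for m in store.get(session_id, [])]

    def add_message(self, role, content):
        super().add_message(role, content)
        self.store.setdefault(self.session_id, []).append((role, content))

class CombinedMemory:
    """Stand-in for CustomSnowflakesMemory: fans writes out to several
    histories so they behave as one."""
    def __init__(self, *histories):
        self.histories = histories

    def add_message(self, role, content):
        for history in self.histories:
            history.add_message(role, content)
```

The key design point is that the combined memory writes through to every underlying history, so the in-session view and the persistent store never drift apart.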

Accomplishments that we're proud of

We successfully launched the BETA version of Pedagogue.ai. "Aspiring Developers" has made a notable impact, helping individuals understand complex software development concepts through a dynamic, interactive learning process. Our platform's ability to stay current and tailor content to individual needs showcases our innovative approach to education.

What we learned

We learned how powerful Streamlit is for building web applications, enabling us to launch the beta version quickly. Streamlit's support for connecting to Snowflake through st.connection and its secrets management made it easy to store and query data. However, we found Snowflake Arctic's responses were heavily SQL- and code-oriented, prompting us to use GPT-4 for general questions. Flexibility and user feedback proved essential in developing educational tools, revealing the needs of modern learners and the potential of AI to meet them. We also gained insight into the challenges of integrating AI with traditional educational content, which refined our approach.
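As an illustration of the connection setup, Streamlit reads Snowflake credentials from its secrets file under a `[connections.snowflake]` section; the values below are placeholders, not the project's real configuration:

```toml
# .streamlit/secrets.toml — placeholder credentials only
[connections.snowflake]
account = "your-account-identifier"
user = "your-user"
password = "your-password"
warehouse = "your-warehouse"
database = "your-database"
schema = "your-schema"
```

With this in place, `conn = st.connection("snowflake")` followed by `conn.query("SELECT ...")` returns results as a DataFrame, which is what made storing and querying data straightforward.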

What's next for Pedagogue.ai: The Living Books initiative

We plan to add more tools for the "Aspiring Developers" book to enhance the personalized learning experience. Additionally, Pedagogue.ai aims to expand its library by collaborating with authors from various disciplines, introducing more books into the Living Books project. Each book will provide an immersive learning experience with the latest information and interactive tools. We are committed to redefining educational boundaries and empowering learners worldwide.

Codebase Highlights

Main Application (pedagogue_ai.py)

  • Authentication: Implements user login with MongoDB and bcrypt for password management.
  • Dynamic Content: Uses Streamlit for interactive web app functionality.
  • AI Integration: Utilizes Snowflake Arctic for code-related and SQL questions, and GPT-4 for general inquiries and research.
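The model-routing idea above can be sketched as a small dispatcher. The keyword heuristic here is purely illustrative, standing in for whatever logic the app actually uses to decide between the two models:

```python
# Illustrative keyword set; not the project's actual routing criteria.
CODE_KEYWORDS = {"sql", "select", "python", "function", "query",
                 "code", "table", "join", "class", "def"}

def pick_model(question: str) -> str:
    """Route code/SQL questions to Snowflake Arctic and everything
    else to GPT-4. Keyword matching is a placeholder heuristic."""
    words = set(question.lower().split())
    return "snowflake-arctic" if words & CODE_KEYWORDS else "gpt-4"
```

Keeping the routing decision in one function like this makes it easy to swap the heuristic for a proper classifier later without touching either model's call path.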

Utility Functions (utils.py)

  • Chat History: Extends Streamlit's chat history to use MongoDB for persistent storage.
  • Custom Memory: Implements custom memory management by creating two custom memories that inherit from StreamlitChatMessageHistory, combining them to function as one with long-term storage via MongoDB Atlas.

Preprocessing (preprocessor.py)

  • Text to Vector Conversion: Converts text documents to vectors using OpenAI embeddings.
  • File Conversion: Converts various file types to JSON using the Unstructured API.
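A typical first step in such a pipeline is splitting documents into overlapping chunks before embedding them. The sketch below shows only that chunking step, with illustrative sizes; the actual parameters and embedding calls in preprocessor.py may differ:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list:
    """Split a document into overlapping character chunks.
    Each chunk would then be sent to OpenAI's embeddings endpoint
    and the resulting vectors stored in a vector index such as FAISS."""
    step = chunk_size - overlap
    # The overlap lets context straddle chunk boundaries so no sentence
    # is stranded between two embeddings.
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - overlap, 1), step)]
```

The overlap means the tail of each chunk reappears at the head of the next, which helps retrieval when a relevant passage falls near a boundary.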

Library Configuration (library.py)

  • Books Metadata: Stores metadata for available books including vector paths and prompts for categorization.
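The registry might look something like the following; the field names and paths are hypothetical, shown only to illustrate the kind of metadata described above:

```python
# Hypothetical shape of the books registry; not the project's actual schema.
LIBRARY = {
    "aspiring_developers": {
        "title": "Aspiring Developers",
        "vector_path": "vectors/aspiring_developers.index",
        "category_prompt": "You are a mentor guiding new software developers.",
    },
}

def get_book(slug: str) -> dict:
    """Look up a book's metadata (vector store location, system prompt)."""
    return LIBRARY[slug]
```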

Account Creation (create_account.py)

  • Secure Account Creation: Generates secure passwords and stores hashed passwords in MongoDB.
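The generate-then-hash flow can be sketched as below. The project uses bcrypt; this stand-in substitutes the standard library's PBKDF2 purely to show the same salt, hash, and verify pattern without the dependency:

```python
import hashlib
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password from letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

def hash_password(password: str):
    """Salt and hash a password; only (salt, digest) would be stored in MongoDB."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return secrets.compare_digest(candidate, digest)
```

With bcrypt the salt is embedded in the hash itself (`bcrypt.hashpw` / `bcrypt.checkpw`), but the store-only-the-hash principle is the same.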

AI Interaction (alis.py & alis_chat.py)

  • Agent Initialization: Sets up AI agents with chat history and vector retrieval capabilities.
  • Chat Interface: Provides a user interface for interacting with the AI agent, with feedback options for continuous improvement.
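The vector-retrieval capability amounts to nearest-neighbour search over embeddings. This toy version ranks chunks by cosine similarity in plain Python, standing in for the FAISS index and LangChain retriever the stack implies:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, index, k=2):
    """Return ids of the k stored chunks closest to the query embedding.
    `index` maps chunk id -> embedding vector."""
    ranked = sorted(index, key=lambda cid: cosine(query_vec, index[cid]),
                    reverse=True)
    return ranked[:k]
```

The retrieved chunk ids would then be resolved to book passages and handed to the agent as context for answering the reader's question.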

Built With

  • bcrypt
  • decouple
  • faiss
  • gpt-4
  • langchain
  • mongodb
  • openai
  • python
  • pydantic
  • replicate
  • snowflake
  • snowflake-arctic-instruct
  • streamlit
  • unstructured-api