Contextual

Developed by: Xingye, Chai Yin, Sirui and Izzat

Project Description

Contextual is an AI-powered chat application designed to deliver highly specialised, customised AI bots rather than a general-purpose assistant. The goal is to provide users with AI assistants that are truly helpful and context-aware, leveraging user-curated memories, knowledge-based reasoning, and eventually retrieval-augmented generation to improve the relevance and accuracy of responses. Unlike general AI chatbots, our system maintains specialised memories and knowledge for different topics, enabling more meaningful and tailored interactions.

We foresee a future where AI offers tailored experiences even in professional contexts, such as law, medical consultation, and other specialised domains, providing expert guidance that is precise and personalised.

Features and Functionality

  • Specialised AI Responses: Each bot is trained or configured to handle specific domains or topics, ensuring the answers are highly relevant.
  • Memory Management: Users can store and retrieve specialised memories, allowing the bot to recall prior context and conversations.
  • Conversation History: Chat sessions are persisted and retrievable from a database, enabling continuity across interactions.
  • Reply and Highlight Functionality: Users can directly reference specific messages in their replies, eliminating manual copy-pasting. This feature provides a clear visual reference while allowing for free scrolling of the conversation history.
  • Customisable UI: Interactive components, including input boxes, chat bubbles, and optional reply bars, improve user experience.

The Tech Stack: A Robust Foundation

Our development was conducted in Visual Studio Code, utilising a modern technology stack designed for type safety and an optimal developer experience:

  • Lynx JS: Served as the core framework for constructing native-quality mobile user interface components.
  • TypeScript: Was essential for managing the complex application state involving chats, folders, and memories with full type safety, which prevented a whole class of bugs during development.
  • React-like Patterns: The codebase employs .tsx components and a familiar React component structure to create a maintainable and declarative user interface.
  • Firebase: Used as the backend to handle data storage and real-time database functionalities.
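To illustrate what "managing the complex application state involving chats, folders, and memories with full type safety" can look like, here is a minimal sketch of a strongly-typed data model. The interface and field names are hypothetical, not the project's actual schema:

```typescript
// Illustrative data model (names are hypothetical, not the project's schema).
interface Memory {
  id: string;
  title: string;   // e.g. "My Writing Style"
  content: string; // persistent note supplied to the AI
}

interface ChatMessage {
  id: string;
  role: "user" | "assistant";
  text: string;
  replyToId?: string; // supports the reply/highlight feature
}

interface Chat {
  id: string;
  title: string;
  folderId: string;
  memoryIds: string[]; // memories attached to this chat
  messages: ChatMessage[];
}

interface Folder {
  id: string;
  name: string; // e.g. "Website Redesign"
  chatIds: string[];
}
```

Modelling the state this way lets the compiler reject malformed records (a chat without a folder, a message with the wrong role) before they ever reach the database.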

The Core Problem: Making AI Useful for Real Work

We recognised that real work does not consist of one-off prompts. It involves projects, context, and continuity. Contextual is designed to bring that essential structure to AI interactions.

1. A System Built on Projects, Not Just Chats

We designed a system of Folders that function as dedicated workspaces. This replaces the isolated "Chat with GPT-4" model with a practical structure, such as a "Website Redesign" folder containing all related conversations. The entire state is managed in the Firebase Realtime Database via a clean REST interface, ensuring that the user's organisational structure always persists and synchronises across sessions.
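As a rough sketch of how folder state could be persisted over the Firebase Realtime Database REST API (the database URL and path layout below are placeholders, and a real app would also append an auth token to each request):

```typescript
// Placeholder database URL; a real app would use its own project URL
// and append an auth token to each request.
const DB_URL = "https://example-project.firebaseio.com";

interface Folder {
  id: string;
  name: string;
  chatIds: string[];
}

// Pure helper: build the REST path for a folder record.
function folderPath(id: string): string {
  return `${DB_URL}/folders/${id}.json`;
}

// PUT writes (or overwrites) the folder at a stable path.
async function saveFolder(folder: Folder): Promise<void> {
  await fetch(folderPath(folder.id), {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(folder),
  });
}

// GET returns the stored JSON; Firebase returns null for missing paths.
async function loadFolder(id: string): Promise<Folder | null> {
  const res = await fetch(folderPath(id));
  return res.json();
}
```

Keying each folder by its `id` makes writes idempotent: re-saving the same folder simply overwrites the record, so the client can sync state freely without duplicating workspaces.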

2. Providing AI with Memory to Eliminate Repetition

The need to repeat information is a major inefficiency. Our Memory system is engineered to solve that.

  • A Memory functions as a persistent note provided to the AI. Users can create one for "My Writing Style" or "Project Phoenix API Keys."
  • We built a custom user interface with a dynamic dropdown selector to make the process of assigning these memories to chats seamless and intuitive.
  • When a user initiates a chat within a specific project folder and assigns a relevant memory, the AI operates with full context from the first message. This makes the AI feel significantly more intelligent and useful.
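One way the "full context from the first message" behaviour could work is to fold the chat's folder name and attached memories into the system prompt before the first user message is sent. The function below is an illustrative sketch, not the project's actual implementation:

```typescript
// Hypothetical sketch: fold a chat's folder and attached memories into
// the AI's system prompt so the model has context from message one.
interface Memory {
  title: string;   // e.g. "My Writing Style"
  content: string; // the persistent note the user wrote
}

function buildSystemPrompt(folderName: string, memories: Memory[]): string {
  const memoryBlock = memories
    .map((m) => `### ${m.title}\n${m.content}`)
    .join("\n\n");
  return [
    `You are assisting with the project "${folderName}".`,
    memoryBlock ? `Relevant user memories:\n\n${memoryBlock}` : "",
  ]
    .filter(Boolean) // drop the memories section when none are attached
    .join("\n\n");
}
```

Because memories are assigned per chat, the prompt stays focused: a "Website Redesign" chat sees only the memories its user attached, not every note in the account.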

3. A Seamless and Intentional Cross-Platform Experience

We designed the application with a strong focus on providing a seamless experience across both iOS and Android devices:

  • Custom Navigation: We implemented a fluid application experience using a custom Navigation Context component to handle routing between the home screen, chat views, and modals without requiring page refreshes.
  • Icon-Driven Interface: The interface incorporates a comprehensive suite of custom assets to enable intuitive, tap-friendly actions that feel native on mobile devices.
  • Lynx-Powered Components: Leveraging Lynx's built-in elements, such as <scroll-view>, we ensure smooth scrolling through extensive chat histories, simplified development, and improved performance, all while retaining the familiar JavaScript and React programming model.

4. Designed for Modern AI Interaction Patterns

This project is an exploration of emerging patterns in AI interaction:

  • Human-in-the-Loop Design: Our application is not built on autopilot. It is designed for the user to act as the director, curating Memories and organising Folders. Features that allow users to edit chats to modify their context ensure the user remains firmly in control.
  • A Foundation for the Future: The technical architecture, a strongly-typed data model synchronised to Firebase, is intentionally built to support the integration of future AI-era features, such as citations (storing source URLs within Memories) or agent confirmations (where the AI could suggest an appropriate folder for a new chat).

Our Closing Thoughts

Many general-purpose AI chatbots provide generic answers that are not tailored to a user’s specific context or needs. Our project addresses this gap by creating specialised AI bots that maintain domain-specific memories and knowledge, allowing for personalised and contextually relevant assistance. With planned retrieval-augmented generation, we aim to ensure that the AI is not only responsive but also grounded in relevant and accurate information, paving the way for professional and expert-level AI assistance.
