Inspiration
While studying and working on technical subjects, I noticed that students and researchers often store information across multiple documents but struggle to quickly find relevant context or clear explanations. Traditional keyword search only points to files, and generic AI chatbots often give answers without grounding them in the actual document content.
This motivated me to build NebulaSearch — a system that combines document search with a contextual AI assistant, making documents interactive and easier to understand.
What I Learned
Through this project, I gained hands-on experience in:
- Building a custom search engine using tokenization and ranking
- Designing a backend API using Python’s HTTP server
- Integrating a local LLM using Ollama, avoiding cloud dependency
- Handling frontend–backend communication with JavaScript
- Designing systems with graceful degradation when AI is unavailable

Most importantly, I learned how to think about projects from a product and user-experience perspective, not just code.

How I Built the Project

NebulaSearch is built using a simple yet effective architecture:
- Documents are indexed using a keyword-based inverted index
- Search results are ranked by term frequency (see the sketch below)
- Each document has an embedded AI assistant panel
- The AI assistant runs locally via Ollama
- The backend API invokes the local LLM and restricts responses to the document context
- If the AI is unavailable, the system falls back to keyword search without breaking functionality
This modular approach keeps the system lightweight, fast, and reliable.
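To make the indexing and ranking concrete, here is a minimal sketch of a keyword-based inverted index with term-frequency ranking. The function names and sample documents are illustrative, not the actual NebulaSearch code:

```python
import re
from collections import defaultdict

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters.
    return [t for t in re.split(r"\W+", text.lower()) if t]

def build_index(docs):
    # Inverted index: term -> {doc_id: term frequency}.
    index = defaultdict(lambda: defaultdict(int))
    for doc_id, text in docs.items():
        for term in tokenize(text):
            index[term][doc_id] += 1
    return index

def search(index, query):
    # Score each document by summing the frequencies of matching query terms.
    scores = defaultdict(int)
    for term in tokenize(query):
        for doc_id, freq in index.get(term, {}).items():
            scores[doc_id] += freq
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

docs = {
    "notes.txt": "Inverted indexes map terms to the documents that contain them.",
    "paper.txt": "Term frequency ranks documents by how often a query term appears.",
}
print(search(build_index(docs), "term frequency"))  # [('paper.txt', 3)]
```

Because the index maps each term directly to the documents containing it, queries never scan full documents, which keeps search fast as the collection grows.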
Challenges Faced
One major challenge was integrating a local LLM on Windows, especially ensuring Python could correctly invoke the Ollama CLI. Another challenge was managing frontend JavaScript inside backend-generated HTML without breaking template rendering.
I solved these issues by isolating failures, testing each system layer independently, and implementing fallback mechanisms so the application remains usable even when AI services are unavailable.
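As a rough illustration of that fix, the backend can shell out to the Ollama CLI and treat any failure as a signal to fall back to plain keyword search. The model name and prompt wording below are assumptions, not the exact NebulaSearch implementation:

```python
import subprocess

def ask_ai(question, document_text, model="llama3"):
    # Ground the model by restricting it to the given document's content.
    prompt = (
        "Answer the question using only the document below. "
        "If the answer is not in the document, say you don't know.\n\n"
        f"Document:\n{document_text}\n\nQuestion: {question}"
    )
    try:
        # `ollama run <model> <prompt>` prints the model's response to stdout.
        result = subprocess.run(
            ["ollama", "run", model, prompt],
            capture_output=True, text=True, timeout=60, check=True,
        )
        return result.stdout.strip()
    except (FileNotFoundError, subprocess.SubprocessError):
        # Ollama missing, timed out, or errored: return None so the caller
        # degrades gracefully to keyword search instead of crashing.
        return None
```

When `ask_ai` returns None, the page simply shows the ranked keyword results, so the app stays usable without the AI layer.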
Conclusion
NebulaSearch demonstrates how AI can be used responsibly and effectively by grounding responses in user-provided data. By combining search, local AI, and clean UI design, this project aims to improve how students and researchers interact with their own knowledge base.
What's next for NebulaSearch
For now, NebulaSearch relies on a local LLM. The next step is to take it global by introducing its own hosted agent and allowing registered users to upload more documents through their login credentials.
Built With
- css
- html
- javascript
- ollama
- python