Inspiration
The idea for SenseAI was born out of a moment of extreme frustration in 2023, when two AI engineers at a healthcare startup spent five days searching through Hugging Face, GitHub, Kaggle, and Stack Overflow to piece together a working prototype for detecting pneumonia in chest X-rays. Despite the explosion of open-source AI models and datasets, the journey from intent ("build a model for X") to result ("working code and API") was painfully slow, fragmented, and filled with copy-paste chaos. This wasn't a lack of knowledge; it was a broken discovery process. Inspired by this inefficiency, I asked myself a radical question: "What if AI search gave us working results, not just documents?" Just as Google transformed the web, I imagined a new kind of search engine, built by developers for developers, where the answer isn't a webpage but a deployable AI solution. SenseAI was created to bridge the gap between research and real-world application instantly. It's a tool designed for scientists, engineers, and students who don't have hours to waste. It's the AI-native interface that turns curiosity into execution with a single query. That's the mission: to make building with AI feel as natural as thinking about it.
What it does
SenseAI is the world's first browser + browserless, AI-native search engine that delivers working AI models, datasets, code, and APIs, not just documents or links. Instead of sending you to GitHub or a research paper, SenseAI interprets your query, understands your intent, and returns a fully usable AI stack:

- A pretrained model matched to your use case
- A real-world dataset ready for fine-tuning or analysis
- Executable code in Python, JavaScript, or other languages
- A deployable REST API, ready to integrate

Whether you're using the web interface, the command line, or the VS Code plugin, you get the same powerful output instantly and without friction. With a single prompt like "Build a text classifier for toxic comments", you receive a full package:

- The right model (e.g. BERT or RoBERTa)
- A dataset (e.g. Jigsaw Toxic Comments)
- A code snippet
- An API endpoint (live or local deploy)

SenseAI cuts out the manual searching, testing, and stitching-together of resources, helping developers and researchers go from idea → model → API in seconds. It's like having an AI engineer built into your workflow.
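The "full package" returned for a query can be sketched as a simple data structure. This is a hypothetical illustration only: the field names, model choice, and endpoint URL are assumptions for the toxic-comment example from the text, not SenseAI's actual response schema.

```python
# Hypothetical sketch of the bundle SenseAI might return for one query.
# Field names and values are illustrative, not the real SenseAI schema.

def build_ai_stack(query: str) -> dict:
    """Assemble a ready-to-use AI stack for a natural-language query."""
    # In the real product, intent parsing and matching happen server-side;
    # here the toxic-comment example from the text is hard-coded.
    return {
        "query": query,
        "model": "roberta-base",                            # pretrained model
        "dataset": "jigsaw-toxic-comment-classification",   # fine-tuning data
        "code": (
            "from transformers import pipeline\n"
            "clf = pipeline('text-classification', model='roberta-base')"
        ),
        "api_endpoint": "https://api.example.com/v1/classify",  # placeholder
    }

stack = build_ai_stack("Build a text classifier for toxic comments")
print(sorted(stack.keys()))
```

The point of the sketch is the shape of the result: one query resolves to a model, a dataset, runnable code, and an endpoint in a single response.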
How we built it
We built SenseAI using a streamlined AI-first tech stack designed for speed, automation, and real-time delivery.
At the core, we used Bolt AI to rapidly prototype and generate functional components—everything from the search engine logic to code generation and edge functions. Bolt AI allowed us to define natural language descriptions of what SenseAI should do, and instantly turn them into working backend and frontend blocks. This let us focus on innovation instead of boilerplate.
For the backend, we integrated Supabase, which powers our authentication, real-time database, file storage, and edge functions. Supabase handles user sessions, stores generated queries, links models and datasets, and delivers seamless sync across the browser, CLI, and IDE environments. Supabase Edge Functions fetch and index external APIs like Hugging Face, GitHub, and Kaggle, making real-time search possible.
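The indexing step described above boils down to mapping each platform's metadata onto one unified record shape. Here is a minimal sketch of that normalization; the raw field names follow each platform's public API style, but the unified shape and the `normalize` helper are assumptions for illustration, not SenseAI's actual code.

```python
# Minimal sketch of normalizing heterogeneous platform metadata into one
# unified schema, as described above. Raw field names mimic each platform's
# API style; the unified shape is an illustrative assumption.

def normalize(source: str, raw: dict) -> dict:
    """Map a platform-specific record onto a single unified shape."""
    if source == "huggingface":
        return {"source": source, "name": raw["modelId"],
                "url": f"https://huggingface.co/{raw['modelId']}",
                "kind": "model"}
    if source == "github":
        return {"source": source, "name": raw["full_name"],
                "url": raw["html_url"], "kind": "code"}
    if source == "kaggle":
        return {"source": source, "name": raw["ref"],
                "url": f"https://www.kaggle.com/datasets/{raw['ref']}",
                "kind": "dataset"}
    raise ValueError(f"unknown source: {source}")

records = [
    normalize("huggingface", {"modelId": "bert-base-uncased"}),
    normalize("github", {"full_name": "huggingface/transformers",
                         "html_url": "https://github.com/huggingface/transformers"}),
    normalize("kaggle", {"ref": "jigsaw/toxic-comment-classification"}),
]
print([r["kind"] for r in records])
```

Once every source emits the same record shape, a single Supabase table (and a single search index) can serve results regardless of where they originated.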
We deployed the browser interface and CLI dashboard to Netlify, allowing for fast, global delivery of the frontend with instant updates and easy CI/CD. Our CLI tools and VS Code plugin also connect back to the same Supabase core, making all experiences unified.
By combining these powerful tools, we turned the vision of zero-click AI development into a fully working, multi-surface product.
Challenges we ran into
Building SenseAI wasn't without its challenges, especially because we were designing something that didn't previously exist: a search engine that returns usable AI artifacts, not just documents. One of the biggest hurdles was building a unified backend that could interpret natural-language queries and dynamically match them to real models, datasets, and code. Integrating APIs from platforms like Hugging Face, GitHub, Kaggle, and Papers with Code brought inconsistencies in metadata, response formats, and model availability. Normalizing and syncing that data in Supabase while keeping performance fast was a major technical lift. Another challenge was creating a seamless experience across browser and browserless environments. Making sure results rendered equally well in the browser, command line, and IDE plugin, while syncing user state in real time, pushed our architecture to evolve into a more modular, reactive system. We also faced issues with API rate limits, Edge Function cold starts, and dynamically deploying APIs from scratch in a secure way. Lastly, using Bolt AI to build real-time code-generation workflows took multiple iterations to make outputs safe, readable, and production-ready. These challenges shaped the maturity and reliability of SenseAI and pushed us to think more deeply about developer-first UX across every interface.
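The rate-limit problem mentioned above is commonly handled with retries and exponential backoff. Below is a minimal, self-contained sketch of that pattern; `with_backoff` and the fake endpoint are illustrative stand-ins, not our production code.

```python
import time

# Illustrative sketch of exponential-backoff retries against a rate-limited
# third-party API. fetch() stands in for any call that may return HTTP 429.

def with_backoff(fetch, max_retries=5, base_delay=0.01):
    """Retry fetch() with exponentially growing delays on rate-limit errors."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except RuntimeError:               # stand-in for an HTTP 429 response
            time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError("rate limit: retries exhausted")

# Usage: a fake endpoint that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return "ok"

print(with_backoff(flaky))  # prints "ok" after two retries
```

Doubling the delay after each failure keeps bursty retries from making the rate-limit problem worse, which matters when several Edge Functions poll the same upstream API.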
Accomplishments that we're proud of
One of our proudest accomplishments is that we turned a bold, almost sci-fi vision into a functional, cross-platform tool: search that returns working AI models, code, and deployable APIs. We successfully built the world's first browser + browserless AI-native search engine, capable of interpreting natural-language queries and delivering complete, ready-to-use AI solutions across web, CLI, and IDE environments. That means users can now go from "build a classification API" to having an actual working API in seconds. We also engineered a real-time backend on Supabase, indexing data from Hugging Face, GitHub, Kaggle, and other platforms. We didn't just crawl those platforms; we normalized and unified their outputs into a single schema and search experience, accessible to developers, researchers, and learners alike. Another major milestone was getting Bolt AI to generate deployable backend code with Supabase Edge Functions and pushing seamless updates through Netlify. This drastically reduced our build time and helped us launch fast with enterprise-grade performance. Lastly, we're proud of building a product that resonates deeply with our early users: people who felt the same pain of hunting for AI components and now say SenseAI feels like having a full-stack AI engineer at their fingertips.
What we learned
Building SenseAI taught us that developers and researchers don't want more documentation; they want execution. We realized that the future of AI tooling lies in collapsing the gap between "idea" and "working output." People don't just want to search for models; they want to use them immediately, with the right code, dataset, and API already wired together. We also learned the power of building with tools like Supabase and Bolt AI, which let us iterate faster, focus on high-level logic, and ship production-ready features without reinventing the wheel. Supabase's Edge Functions and real-time sync enabled us to build truly cross-platform workflows that developers love. Lastly, we learned that AI-native UX is about reducing steps, not adding layers. From zero-click API generation to cross-interface sync, every second saved matters. The best developer tools feel invisible, and that's what we'll keep striving for with SenseAI.
What's next for SenseAI
We're just getting started with what SenseAI can become. Next, we're rolling out SenseAI Cloud Workspaces: a persistent environment where users can customize, fine-tune, and deploy AI solutions generated from queries. Think auto-saved projects, Colab-like notebooks, and version control that lives in the cloud, powered entirely by your search intent.

We're also building a plugin ecosystem, allowing users to connect SenseAI to their favorite tools: Postman, Replit, LangChain, Slack, Discord, and more. Imagine typing a query, getting a model, and instantly integrating it into your stack without switching platforms.

On the browserless side, we're launching full support for SenseAI CLI Pro and the VS Code extension, enabling devs to generate APIs, models, and datasets from inside their terminal or editor with zero configuration. We're also expanding our model & dataset indexing network, bringing in more sources like Zenodo, AWS Open Datasets, and even enterprise-internal model registries.

Finally, we're opening up our Edge Function SDK for power users and enterprise teams to customize their own pipelines using Supabase, bringing fully autonomous prompt-to-deploy AI development to the edge. Our vision: make SenseAI the default interface for building AI, no matter where or how you work.
Built With
- bolt
- javascript