Inspiration
With the increasing usage of Large Language Models like ChatGPT, Gemini, and Claude, we observed that users often struggle to frame effective prompts, especially when trying to maintain contextual continuity. We wanted to automate and enhance the prompt-building process without requiring users to switch tools or copy-paste between platforms. This led to the idea of Promper — a lightweight Chrome extension that makes your prompt proper with a single keystroke.
What it does
Promper is a real-time prompt enhancement extension for Chrome. It works seamlessly on popular AI chat interfaces such as ChatGPT, Gemini, and Claude. The extension monitors the user's input; upon pressing Shift + Enter, it captures the current message along with the previous context and sends it to the Perplexity API using the sonar-pro model. The API returns a refined, context-aware, and structured prompt, which is automatically inserted into the chat input field. This enhanced prompt increases the chances of receiving more accurate, relevant, and readable responses from LLMs.
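The capture step described above can be sketched as a couple of small helpers (the event shape and context-joining logic here are a minimal illustration, not the extension's actual code):

```javascript
// Sketch of the capture flow. In the extension, a keydown listener on the
// chat input would call these helpers before dispatching the API request.

// Pure predicate: is this keystroke the Shift + Enter enhancement shortcut?
function isEnhanceShortcut(event) {
  return event.key === 'Enter' && event.shiftKey === true;
}

// Combine prior messages with the current draft into one request text.
function buildEnhancementInput(previousMessages, currentDraft) {
  const context = previousMessages.join('\n');
  return context ? `${context}\n${currentDraft}` : currentDraft;
}
```

Keeping the shortcut check and context assembly as pure functions makes them easy to test outside the browser, with only the event wiring living in the content script.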
How we built it
- Frontend: Developed using standard HTML, CSS, and JavaScript to create a minimal UI for the Chrome extension.
- Event Listener: We used JavaScript to detect user keystrokes and capture messages on platforms like ChatGPT, Gemini, and Claude.
- Backend Integration: Integrated with the Perplexity API via asynchronous fetch calls, sending both the current and previous messages to the chat/completions endpoint using the sonar-pro model.
- DOM Manipulation: Used query selectors to interact with AI chatboxes and replace the user's input with the enhanced prompt.
- Storage: Chrome local storage API is used to maintain basic settings and context cache.
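The backend integration above can be sketched roughly as follows. The endpoint and sonar-pro model come from the write-up; the storage key name (`apiKey`) and the system-prompt wording are illustrative assumptions:

```javascript
// Sketch of the Perplexity call, assuming the API key is kept in
// chrome.storage.local under 'apiKey'. The system prompt is illustrative.
const PERPLEXITY_URL = 'https://api.perplexity.ai/chat/completions';

// Pure builder for the request body sent to the sonar-pro model.
function buildPayload(previousContext, currentMessage) {
  return {
    model: 'sonar-pro',
    messages: [
      {
        role: 'system',
        content: 'Rewrite the user message as a clear, structured prompt. Preserve intent and tone.',
      },
      { role: 'user', content: `${previousContext}\n\n${currentMessage}`.trim() },
    ],
  };
}

// Async wrapper: reads the key from extension storage and calls the API.
// Only runs inside the extension; never executed at load time.
async function enhancePrompt(previousContext, currentMessage) {
  const { apiKey } = await chrome.storage.local.get('apiKey');
  const res = await fetch(PERPLEXITY_URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildPayload(previousContext, currentMessage)),
  });
  if (!res.ok) throw new Error(`Perplexity API error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Separating the payload builder from the fetch wrapper keeps the request shape testable without network access, and the `res.ok` check is where rate-limit and failure handling would hook in.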
Challenges we ran into
- Ensuring compatibility across different LLM chat UIs, as each has a unique DOM structure.
- Managing real-time input capture while maintaining user experience without any lag or disruption.
- Handling rate limits and potential failure scenarios with Perplexity API calls.
- Prompt enhancement logic had to maintain tone and intent without altering the core meaning, which required rigorous testing.
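The cross-UI compatibility challenge is typically handled with a per-site selector map. A minimal sketch — the specific selectors below are placeholders, since each platform's real DOM structure differs and changes over time:

```javascript
// Per-site selector map; the selectors shown are illustrative placeholders,
// not the extension's actual values.
const INPUT_SELECTORS = {
  'chatgpt.com': 'div#prompt-textarea',
  'gemini.google.com': 'div.ql-editor',
  'claude.ai': 'div[contenteditable="true"]',
};

// Resolve the chat-input selector for the current host, or null if the
// site is not supported.
function selectorForHost(hostname) {
  return INPUT_SELECTORS[hostname] ?? null;
}
```

Centralizing the selectors this way means a UI change on one platform requires updating a single entry rather than touching the capture logic.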
Accomplishments that we're proud of
- Successfully built a browser extension that interacts with multiple third-party AI platforms in real-time.
- Achieved instant and seamless prompt enhancement with zero disruption to the user's workflow.
- Built a working pipeline between front-end input capture and backend API integration with low latency and high reliability.
- Users can now improve their prompts with just Shift + Enter, making LLM usage much more accessible and intuitive.
What we learned
- Deep understanding of DOM parsing and manipulation across different dynamic web applications.
- Advanced integration of third-party APIs within browser extension environments.
- Importance of context in prompt generation and how even subtle changes can impact the quality of LLM responses.
- Optimizing asynchronous operations to maintain UI responsiveness.
What's next for Promper – Make your prompt proper
- Add customizable shortcut keys and user preferences for enhanced usability.
- Implement a local caching system for API responses to reduce repeated queries.
- Introduce multi-language prompt support.
- Explore integration with additional LLM platforms like Mistral, Cohere, and open-source local models.
- Launch Promper on the Chrome Web Store for public access.
- Collect user feedback to iterate and refine the enhancement algorithm for more personalization.
Built With
- css
- dom
- html5
- javascript
- perplexity
- sonar