💡 Inspiration

Setting up software projects is repetitive. CodePromptLLM turns plain-text prompts into full, working codebases — fast and clean.

⚙️ What it does

It converts a single prompt into a deploy-ready software starter kit: folders, files, routes, auth, DB setup, and docs — all exportable.

🧱 How we built it

  • Frontend: React + Tailwind
  • AI: GPT-4 via OpenAI API
  • Formatter: Custom file structuring
  • ZIP Export: JSZip
  • Hosting: GitHub Pages / PWA
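The "custom file structuring" step above can be sketched roughly like this — a minimal example, assuming the model is prompted to emit each file behind a `// FILE: <path>` marker (a hypothetical convention for illustration; the real formatter may differ):

```javascript
// Split one LLM response into a { path: content } map.
// Assumes the prompt asked GPT-4 to prefix every file with a
// "// FILE: <path>" marker line (hypothetical convention).
function structureResponse(raw) {
  const files = {};
  let currentPath = null;
  let buffer = [];
  for (const line of raw.split("\n")) {
    const marker = line.match(/^\/\/ FILE: (.+)$/);
    if (marker) {
      // Flush the previous file before starting a new one.
      if (currentPath) files[currentPath] = buffer.join("\n").trim() + "\n";
      currentPath = marker[1].trim();
      buffer = [];
    } else if (currentPath) {
      buffer.push(line);
    }
  }
  if (currentPath) files[currentPath] = buffer.join("\n").trim() + "\n";
  return files;
}
```

Each entry of the resulting map can then be handed to JSZip (`zip.file(path, content)`) before `generateAsync` produces the downloadable archive.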

🧗 Challenges we ran into

  • Cleaning LLM outputs
  • Handling vague/incomplete prompts
  • Keeping folder structure consistent
  • Avoiding bloated or broken code
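For the first challenge — cleaning LLM outputs — one workable approach (a sketch, not necessarily the exact logic we shipped) is to strip the markdown code fences and surrounding chatter GPT-4 tends to wrap around generated files:

```javascript
// Strip markdown code fences and any prose GPT-4 wraps around code.
// Returns the contents of the first fenced block if one exists,
// otherwise the trimmed raw text.
function cleanLLMOutput(raw) {
  // `{3} matches a triple-backtick fence; [\w-]* skips the language tag.
  const fenced = raw.match(/`{3}[\w-]*\n([\s\S]*?)`{3}/);
  if (fenced) return fenced[1].trimEnd() + "\n";
  return raw.trim() + "\n";
}
```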

🏆 Accomplishments we're proud of

  • A fully working codebase from a single prompt
  • Multi-stack support: MERN, Django, Laravel
  • Simple UI, fast ZIP export
  • Lightweight and mobile-friendly

📚 What we learned

  • LLMs need guided prompt tuning
  • Devs prefer clean scaffolds over large templates
  • Speed and clarity matter more than features

🔮 What's next for CodePromptLLM

  • CLI + Telegram bot version
  • Add Swahili/French prompt support
  • GitHub auto-push + live deploy
  • Monetization via MPesa and PayPal
  • Offline/portable LLM version (Ollama)

Built With

  • Languages: JavaScript, HTML, CSS, JSON, Markdown
  • Frontend framework: React.js
  • Styling: Tailwind CSS
  • AI integration: OpenAI GPT-4 API
  • Code structuring & parsing: custom prompt-to-code logic
  • Packaging: JSZip for dynamic ZIP generation
  • Export tools: FileSaver.js, html2pdf
  • Platform & deployment: GitHub Pages, Replit
  • Hosting: Firebase Hosting
  • Offline support: PWA (Progressive Web App)
  • Authentication (optional): Firebase Auth
  • Payment integration (optional): MPesa Daraja API, PayPal REST API
  • Database (optional): Supabase or localStorage for user sessions
  • Version control: GitHub
  • Browser compatibility: fully optimized for Chrome, Firefox, and Android

Updates


Project Update: CodePromptLLM — AI-Powered Codebase Generator

CodePromptLLM continues to grow. Below is the latest update, complete with working links, feature progress, and community access.

What’s New

  • Multi-stack support: Node.js (Express), React, Django, Laravel, Flask
  • Full prompt-to-code ZIP export system
  • GPT-4 powered code generation with clean formatting
  • Responsive UI built with Tailwind CSS
  • Live preview of project file structure
  • PWA-ready interface (offline-friendly)

Try It Out


Screenshots

Preview screenshots will be added soon at:
https://github.com/omosHxr/LLM/assets


Upcoming Features

  • Stack selection interface with AI auto-detection
  • Multilingual prompt support (Swahili, French)
  • MPesa and PayPal integration for premium templates
  • GitHub push & deploy option
  • Telegram/CLI versions
  • Offline generation using Ollama

Community Feedback

I designed and developed the full CodePromptLLM system and shared it with my developer community for testing and improvement feedback. Community insights have played a key role in guiding its roadmap.


Stay Connected

Contributions, forks, and suggestions are welcome.



Project Update: CodePromptLLM — AI-Powered Codebase Generator

CodePromptLLM has evolved significantly over the past few days. Here's what's new:

What’s New

  • Multi-stack support: React, Node.js (Express), Django, Flask, Laravel
  • Prompt-to-code ZIP export: Generate and download fully structured codebases
  • Responsive UI using Tailwind CSS with PWA support
  • GPT-4 LLM integration for accurate and modular code generation
  • Customizable output: Adjust stack config before code generation
  • Live Replit demo for testing without installation

Screenshots

Prompt-to-Code Interface
[Insert image or preview link here]

Generated Project Structure Preview
[Insert image or preview link here]


Try It Out Now


Upcoming Features

  • Stack selector with auto-detection
  • Auto-generated documentation (README.md, API docs)
  • Swahili and French language prompt support
  • MPesa and PayPal payment support for premium templates
  • Offline model support via Ollama

Community Feedback

I built and shared CodePromptLLM with my local and online developer communities. Feedback has been insightful and overwhelmingly positive. Thanks to everyone contributing ideas and testing.

Let me know what features you'd like to see next or what stacks we should add.
