What it does

PPT Maker is a Python application that generates professional PowerPoint presentations from natural-language prompts, using AI models served through the Ollama platform and the python-pptx library. Users provide a topic or prompt, and the tool automatically creates a structured, well-formatted presentation: a title slide, content slides with bullet points, and a conclusion or thank-you slide. It supports customizable templates, multiple slide layouts, and options to adjust the number and structure of slides. Both a command-line interface and a Python API are available, allowing flexible integration into various workflows.

Key features include:

  • AI-powered content generation: Uses Ollama with models like Llama 3 and Mistral to intelligently create and enhance slide content.
  • Automated presentation structure: Automatically organizes content into clear, logical slides for maximum impact.
  • Professional formatting: Ensures consistent, high-quality visual styling and slide layouts.
  • Customization: Users can specify the number of slides, choose different AI models, select output files, and decide on content enhancement options.
  • Multiple deployment modes: Usable as a standalone Windows executable, web app, portable package, or cloud solution, making it accessible for all technical levels.
  • Comprehensive testing: Robust error handling, fallback mode (works even if AI is unavailable), and extensive tests ensure reliability.
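The fallback mode mentioned above can be sketched roughly like this: when the AI backend is unreachable, the tool can still emit a deterministic skeleton outline so generation never fails outright. Function and field names here are illustrative, not the project's actual API.

```python
# Minimal fallback sketch: build a plain skeleton deck when no model
# is available. Names and the outline shape are hypothetical.
def fallback_outline(topic: str, n_slides: int = 3) -> dict:
    """Build a deterministic skeleton outline without any AI call."""
    return {
        "title": topic,
        "slides": [
            {
                "title": f"{topic}: Part {i + 1}",
                "bullets": [f"Key point {j + 1} about {topic}" for j in range(3)],
            }
            for i in range(n_slides)
        ],
    }

outline = fallback_outline("Local AI", n_slides=2)
```

Because the output has the same shape as an AI-generated outline, the rest of the pipeline can render it unchanged.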

Install the required dependencies, launch the tool, and generate presentations instantly, with or without AI content enhancement, making high-quality presentation creation accessible to everyone, regardless of technical background.

How we built it

  • The core pipeline uses a local LLM exposed via an OpenAI-compatible endpoint (e.g., llama.cpp/Ollama) to produce a slide outline and bullet content, which is then rendered into native PPTX via python-pptx for immediate editing in PowerPoint or Google Slides offline.
  • A lightweight UI layer (Streamlit/Flask) orchestrates prompt templates, calls the local model, and maps structured responses into layouts, drawing on patterns from open-source local LLM slide tools and tutorials to stay model-agnostic and vendor-free.

Challenges we ran into

  • Prompting smaller local models to consistently output slide-safe structure within token limits required tight schemas and post-processing to enforce titles, bullet counts, and hierarchy deterministically.
  • Matching the theme variety and visual quality of cloud tools while using only offline assets (fonts, palettes, placeholders) demanded careful template design without relying on hosted libraries.
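The deterministic post-processing mentioned above can be sketched as a clamping pass over raw model output; the limits and field names here are illustrative, not the project's actual values.

```python
# Sketch: coerce a raw model slide spec into a slide-safe structure by
# clamping title length, bullet count, and bullet length. Limits are
# hypothetical examples.
MAX_TITLE = 60       # characters per title
MAX_BULLETS = 5      # bullets per slide
MAX_BULLET_LEN = 90  # characters per bullet

def enforce_slide(spec: dict) -> dict:
    """Deterministically enforce slide-safe titles and bullet counts."""
    title = str(spec.get("title", "Untitled")).strip()[:MAX_TITLE]
    bullets = [str(b).strip() for b in spec.get("bullets", []) if str(b).strip()]
    bullets = [b[:MAX_BULLET_LEN] for b in bullets[:MAX_BULLETS]]
    return {"title": title or "Untitled", "bullets": bullets}

raw = {
    "title": "  A very long rambling title " + "x" * 80,
    "bullets": ["one", "", "two", "three", "four", "five", "six"],
}
safe = enforce_slide(raw)
```

Because the pass is pure string manipulation, it behaves identically regardless of which local model produced the input.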

Accomplishments that we're proud of

  • Achieved end-to-end, fully local generation with editable layouts, image placeholders, and multiple design themes, delivering a flow comparable to online builders but with no external data transfer.
  • Simplified setup to a run-local workflow (serve the model, select a template, generate), eliminating API costs and enabling repeatable, fast deck creation for private or regulated environments.

What we learned

  • Local LLMs reliably draft presentations when prompts constrain slide count and structure, but benefit from JSON-style schemas and validation before conversion to PPTX via python-pptx.
  • Users strongly prefer direct PPTX output over intermediary formats, since native slides integrate cleanly with existing branding workflows and allow immediate manual edits where needed.
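The validation step described above can be sketched as a structural check on the model's JSON before it ever reaches python-pptx; the schema here is an illustrative assumption, not the project's actual contract.

```python
# Sketch: validate a model's JSON outline before conversion to PPTX.
# Returns the parsed outline, or None if it isn't slide-safe (which
# would trigger a retry or the fallback path).
import json

def validate_outline(text: str):
    """Parse and structurally validate a model-produced outline."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        return None
    if not isinstance(data.get("title"), str):
        return None
    slides = data.get("slides")
    if not isinstance(slides, list) or not slides:
        return None
    for slide in slides:
        if not isinstance(slide, dict) or not isinstance(slide.get("title"), str):
            return None
        bullets = slide.get("bullets")
        if not isinstance(bullets, list) or not all(isinstance(b, str) for b in bullets):
            return None
    return data

good = '{"title": "Demo", "slides": [{"title": "S1", "bullets": ["a", "b"]}]}'
bad = '{"title": "Demo", "slides": "not a list"}'
```

Validating up front keeps the PPTX renderer simple: by the time content reaches python-pptx, every field is guaranteed to exist and have the right type.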

What’s next for Local PPT-Maker

  • Add pluggable adapters and caching for broader local endpoints and faster multi-slide generation, plus expand offline theme packs and brand-guided templates to improve visual consistency.
  • Integrate optional offline image generation and chart placeholders, and ship a guided “hackathon deck” mode covering problem, solution, demo, learnings, and next steps to streamline demo-day prep.

TL;DR

  • Built with a local LLM + python-pptx pipeline for editable, on-device PPTX generation and a simple Streamlit/Flask UI.
  • Key wins: fully offline flow, multiple themes, simple setup; next: adapters, caching, brand packs, guided hackathon mode.

Built With

  • agentic-ai
  • llm
  • local-ai
  • ollama
  • python

Updates

PPT Maker now supports LM Studio as an alternative to Ollama for AI-powered presentation generation. LM Studio provides a user-friendly way to run large language models locally with an OpenAI-compatible API.
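Because both backends expose an OpenAI-compatible API, swapping between them amounts to changing the base URL. The defaults below are the ports Ollama (11434) and LM Studio (1234) ship with, but verify them against your local setup.

```python
# Sketch: select a local OpenAI-compatible backend by base URL only.
# Default ports are the stock Ollama / LM Studio values; adjust to
# match your local configuration.
BACKENDS = {
    "ollama": "http://localhost:11434/v1",
    "lmstudio": "http://localhost:1234/v1",
}

def chat_url(backend: str) -> str:
    """Build the chat-completions URL for a named local backend."""
    return BACKENDS[backend].rstrip("/") + "/chat/completions"
```

The rest of the pipeline stays identical regardless of which backend serves the model, which is what keeps the tool vendor-free.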
