Inspiration
Yahoo Pipes democratized data transformation until it died in 2015. We wanted to revive that power with modern security and performance.
What it does
Drag-and-drop visual editor for building data pipelines. Fetch from APIs/RSS/CSV, transform with 20+ operators, and share with the community.
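At its core, a pipe is just a chain of operators applied to rows of data. A minimal sketch of that idea (hypothetical types and operator names, not Pipe Forge's actual API):

```typescript
// A pipeline is an ordered list of operators; each operator maps rows to rows.
type Row = Record<string, unknown>;
type Operator = (rows: Row[]) => Row[];

// Two illustrative operators in the spirit of the 20+ built-ins.
const filter =
  (pred: (r: Row) => boolean): Operator =>
  (rows) => rows.filter(pred);

const pick =
  (...keys: string[]): Operator =>
  (rows) => rows.map((r) => Object.fromEntries(keys.map((k) => [k, r[k]])));

// Running a pipeline is function composition over the operator chain.
function run(rows: Row[], ops: Operator[]): Row[] {
  return ops.reduce((acc, op) => op(acc), rows);
}

// Example: keep items with score >= 3, then project title + score.
const out = run(
  [
    { title: "a", score: 5, url: "x" },
    { title: "b", score: 1, url: "y" },
  ],
  [filter((r) => (r.score as number) >= 3), pick("title", "score")],
);
// out → [{ title: "a", score: 5 }]
```

The visual editor essentially lets users assemble such a chain by dragging nodes and wiring edges instead of writing code.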
How we built it
We built it with Kiro. The stack is a React + TypeScript frontend using ReactFlow, a Node.js + Express backend, and a PostgreSQL database, deployed on Google Cloud Run with Firebase Hosting and automated CI/CD.
Challenges we ran into
Real-time schema propagation between nodes, securing external API calls with encrypted storage, handling diverse data formats while maintaining type safety, and implementing undo/redo for complex graph state.
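The schema-propagation challenge can be sketched as a walk over the node graph: each node declares how it transforms its upstream schema, and visiting nodes in topological order guarantees every node sees its input schema before computing its output. This is an illustrative simplification (single-input nodes, invented names), not the actual implementation:

```typescript
// Field types a schema can carry in this toy model.
type Schema = Record<string, "string" | "number" | "boolean">;

interface PipeNode {
  id: string;
  upstream: string | null;          // single-input chain for brevity
  apply: (input: Schema) => Schema; // e.g. a "pick" node drops fields
}

// Nodes are assumed to arrive topologically sorted, so each lookup succeeds.
function propagate(nodes: PipeNode[], source: Schema): Map<string, Schema> {
  const schemas = new Map<string, Schema>();
  for (const n of nodes) {
    const input = n.upstream ? schemas.get(n.upstream)! : source;
    schemas.set(n.id, n.apply(input));
  }
  return schemas;
}

// Example: a fetch source followed by a projection node.
const schemas = propagate(
  [
    { id: "fetch", upstream: null, apply: (s) => s },
    { id: "pick", upstream: "fetch", apply: (s) => ({ title: s.title }) },
  ],
  { title: "string", score: "number" },
);
// schemas.get("pick") → { title: "string" }
```

In the real editor the same propagation has to rerun incrementally whenever an upstream node changes, which is where the real-time difficulty comes from.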
Accomplishments that we're proud of
Bringing Yahoo Pipes back to the glory it deserves, 20+ operators with a template system, and a fully automated deployment pipeline.
What we learned
Graph-based data flow architecture, schema propagation trade-offs, GCP auto-scaling patterns, and the critical importance of intuitive UX for complex systems.
What's next for Pipe Forge
Webhook triggers for real-time execution, collaborative editing, operator marketplace, more API integrations (Twitter, Slack, Notion), and data visualization outputs.
Built With
- express.js
- firebase-hosting
- github-actions
- google-cloud-run
- node.js
- postgresql
- react
- reactflow
- redis
- typescript