Discover software product ideas by mining Reddit for user pain points. AI-powered analysis with Claude or Google Gemini.
Finding product ideas shouldn't be guesswork. Every day, thousands of people describe their frustrations on Reddit — "I wish there was...", "Why isn't there a tool for...", "I'd pay for something that...". These are validated pain points from real users.
This tool automates the discovery process:
- Scrapes targeted subreddits for posts expressing frustration or unmet needs
- Uses AI to extract structured insights (pain point, audience, market size, solutions)
- Ranks opportunities by a composite score so you can focus on the best ones
- Presents everything in a clean, filterable dashboard
Stop building products nobody wants. Start with real pain.
- Scrapes posts and comments from 12+ target subreddits (r/SaaS, r/startups, r/Entrepreneur, etc.)
- Filters for pain-point language ("I wish", "frustrated", "need a tool", etc.)
- Analyzes each match with Google Gemini to extract structured insights
- Ranks opportunities by a composite score (engagement × severity × market potential)
- Displays everything in a clean, filterable dashboard
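The exact weighting lives in the analyzer, but the composite score can be sketched roughly as follows. The weights, the log scaling, and the market-size multipliers here are illustrative assumptions, not the shipped formula:

```python
import math

def opportunity_score(upvotes: int, comments: int, severity: int, market_size: str) -> int:
    """Illustrative composite score on a 0-100 scale.

    Assumption: weights (0.4 / 0.3 / 0.3) and the normalization caps
    are made up for this sketch, not taken from analyzer.py.
    """
    # Engagement: log-scaled so a single viral post doesn't dominate.
    engagement = min(math.log1p(upvotes + 2 * comments) / math.log1p(1000), 1.0)
    # Severity: LLM-assigned 1-5 rating, normalized to 0-1.
    sev = max(1, min(severity, 5)) / 5
    # Market size: coarse LLM estimate mapped to a multiplier.
    market = {"small": 0.5, "medium": 0.75, "large": 1.0}.get(market_size.lower(), 0.5)
    return round(100 * (0.4 * engagement + 0.3 * sev + 0.3 * market))
```

A high-engagement, high-severity, large-market post scores near 100; a low-signal post lands far below it, which is what makes the ranking useful as a triage order.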
- 🔍 Smart Scraping — Targets 12+ subreddits with pain-point keyword matching
- 🌐 No Auth Scraping — Public API mode, no Reddit credentials needed
- 🤖 AI Analysis — Claude or Gemini extracts pain points, audiences, solutions, market estimates
- 📊 Opportunity Scoring — Composite score based on engagement, severity, and market size
- 🏷️ Auto-Categorization — Productivity, Dev Tools, Business, Marketing, etc.
- 🔥 Trending View — See what's hot right now
- 🔎 Search & Filter — By subreddit, category, score, keywords
- 📥 Export — CSV and JSON export
- 🚀 One-Click Scrape — Trigger scrapes from the dashboard UI
- 🎮 Demo Mode — Try the dashboard with sample data, no API keys needed
- ⏰ Scheduled scraping (cron jobs)
- 📧 Email/Slack alerts for high-score opportunities
- 📈 Trend detection over time
- 🌐 Multi-platform support (Hacker News, Twitter/X, Indie Hackers)
- 🏷️ Custom keyword lists
- 📊 Analytics dashboard with charts
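The no-auth scraping mode works because Reddit exposes public JSON listings that any HTTP client can read. A minimal sketch (the project's actual scraper may request different feeds or set different headers; the user agent string below is a placeholder):

```python
import json
import urllib.request

def listing_url(subreddit: str, limit: int = 50, feed: str = "new") -> str:
    """Public JSON listing for a subreddit -- no credentials required."""
    return f"https://www.reddit.com/r/{subreddit}/{feed}.json?limit={limit}"

def fetch_posts(subreddit: str, limit: int = 50) -> list[dict]:
    """Fetch post data from the public listing.

    Reddit rejects the default urllib user agent, so a custom one is set.
    """
    req = urllib.request.Request(
        listing_url(subreddit, limit),
        headers={"User-Agent": "pain-point-scraper/0.1"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return [child["data"] for child in payload["data"]["children"]]
```

The trade-off versus authenticated PRAW access is rate limits: the public endpoints are throttled more aggressively, which is why Reddit API credentials remain an option for heavier scraping.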
No API keys needed — explore the dashboard with sample data:
git clone https://github.com/lefttree/reddit-pain-points.git
cd reddit-pain-points
# Backend
cd backend
python -m venv venv && source venv/bin/activate
pip install -r requirements.txt
python cli.py demo # Load 10 sample pain points
python cli.py serve & # Start API on :8000
cd ..
# Frontend
cd frontend
npm install
npm run dev  # Start UI on :5173
Open http://localhost:5173 and explore!
Only need an LLM API key — scrape Reddit without credentials:
git clone https://github.com/lefttree/reddit-pain-points.git
cd reddit-pain-points
cp .env.example .env
# Add your LLM key to .env (either one):
# ANTHROPIC_API_KEY=sk-ant-... (Claude, recommended)
# GOOGLE_API_KEY=AIza... (Gemini)
cd backend
python -m venv venv && source venv/bin/activate
pip install -r requirements.txt
python cli.py scrape --public # Scrape Reddit (no auth needed!)
python cli.py analyze # Analyze with AI
python cli.py serve & # Start API
cd ../frontend && npm install && npm run dev  # Start UI
- Python 3.11+
- Node.js 18+
- LLM API key — one of:
  - Anthropic/Claude API key (recommended)
  - Google Gemini API key
- Optional: Reddit API credentials — for higher rate limits (see the Reddit app setup steps below)
git clone https://github.com/lefttree/reddit-pain-points.git
cd reddit-pain-points
# Copy and fill in your credentials
cp .env.example .env
# Edit .env — at minimum set one LLM key
# Install backend
cd backend
python -m venv venv && source venv/bin/activate
pip install -r requirements.txt
cd ..
# Install frontend
cd frontend
npm install
cd ..
cd backend
# Scrape Reddit + analyze with AI (full pipeline)
python cli.py run --public # Public API (no Reddit credentials)
python cli.py run # With Reddit API credentials
# Or run steps separately:
python cli.py scrape --public # Scrape via public API
python cli.py scrape # Scrape via Reddit API (needs credentials)
python cli.py analyze # Analyze unanalyzed posts with LLM
python cli.py scrape -s "SaaS,startups" # Specific subreddits
python cli.py stats  # View database stats
# Option A: Quick start (both services)
./run.sh
# Option B: Manual
cd backend && python cli.py serve & # API on :8000
cd frontend && npm run dev &  # UI on :5173
Open http://localhost:5173 to browse discovered pain points.
Here's what an analyzed pain point looks like:
🎯 Pain Point: "No simple tool to aggregate feature requests from multiple
channels (email, Slack, support, social) for small teams"
📊 Opportunity Score: 82/100
🔥 Severity: 4/5
👥 Audience: Small SaaS founders and product managers
📈 Market Size: Large
💡 Product Ideas:
1. Lightweight feature request aggregator with email/Slack/Twitter integrations
2. AI-powered request deduplication and ranking tool
3. Simple voting board with multi-channel intake automation
🔍 Existing Solutions: Canny, Productboard, Nolt, Fider
📌 Source: r/SaaS · ⬆ 187 · 💬 43 comments
reddit-pain-points/
├── backend/
│ ├── api.py # FastAPI REST API
│ ├── scraper.py # Reddit scraper (PRAW)
│ ├── analyzer.py # Claude/Gemini LLM analyzer
│ ├── database.py # SQLite management
│ ├── config.py # Configuration
│ ├── cli.py # CLI interface
│ ├── demo_data.py # Sample data for demo mode
│ └── requirements.txt
├── frontend/
│ ├── src/
│ │ ├── App.jsx # Main React app
│ │ ├── main.jsx # Entry point
│ │ └── index.css # Tailwind styles
│ ├── package.json
│ └── vite.config.js
├── data/ # SQLite database (auto-created)
├── docs/
│ └── DESIGN.md # Architecture & design doc
├── .env.example # Template for credentials
├── run.sh # Quick start script
└── README.md
| Method | Endpoint | Description |
|---|---|---|
| GET | /api/pain-points | List pain points (filterable, paginated) |
| GET | /api/pain-points/:id | Single pain point detail |
| GET | /api/stats | Dashboard statistics |
| GET | /api/trending | Trending pain points |
| GET | /api/categories | Categories with counts |
| GET | /api/subreddits | Subreddits with counts |
| GET | /api/export?format=csv | Export data |
| POST | /api/scrape | Trigger scrape run |
| GET | /api/scrape/status | Scraper status |
Auto-generated docs at http://localhost:8000/docs (Swagger UI).
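For example, listing filtered pain points from a script might look like this. The query parameter names (`subreddit`, `min_score`, etc.) are assumptions for this sketch — the Swagger UI at /docs is authoritative:

```python
import json
import urllib.parse
import urllib.request

API_BASE = "http://localhost:8000"

def pain_points_url(**filters) -> str:
    """Build a filtered /api/pain-points URL.

    Assumption: filter names like subreddit, category, min_score, page
    are illustrative -- check /docs for the real query schema.
    """
    query = urllib.parse.urlencode({k: v for k, v in filters.items() if v is not None})
    return f"{API_BASE}/api/pain-points" + (f"?{query}" if query else "")

def list_pain_points(**filters) -> dict:
    """GET the pain-points listing and decode the JSON response."""
    with urllib.request.urlopen(pain_points_url(**filters)) as resp:
        return json.load(resp)
```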
All config via .env:
# LLM API Key (at least one required for analysis)
ANTHROPIC_API_KEY=sk-ant-... # Claude (recommended)
GOOGLE_API_KEY=AIza... # Gemini
LLM_PROVIDER=auto # "auto" (default), "claude", or "gemini"
# Reddit API (optional — use --public flag to skip)
REDDIT_CLIENT_ID=your_client_id
REDDIT_CLIENT_SECRET=your_client_secret
# Optional
SUBREDDITS=SaaS,startups,Entrepreneur # Override target subreddits
SCRAPE_LIMIT=50 # Posts per subreddit per feed
DATABASE_PATH=data/painpoints.db # Database location
- Go to https://www.reddit.com/prefs/apps
- Click "create another app..."
- Choose "script" type
- Set redirect URI to http://localhost:8080
- Copy the client ID (under the app name) and the secret
The scraper targets these 12 subreddits by default:
| Subreddit | Focus |
|---|---|
| r/SaaS | SaaS product discussions |
| r/startups | Startup challenges & needs |
| r/Entrepreneur | Business pain points |
| r/smallbusiness | Small business struggles |
| r/webdev | Web development frustrations |
| r/programming | Developer tooling gaps |
| r/productivity | Workflow & productivity issues |
| r/selfhosted | Self-hosting needs & gaps |
| r/sideproject | Side project builders' needs |
| r/indiehackers | Indie maker challenges |
| r/digitalnomad | Remote work pain points |
| r/nocode | No-code/low-code limitations |
Override via .env: SUBREDDITS=SaaS,startups,yourSubreddit
💡 Suggest new subreddits! Open an issue or PR to add subreddits you think are valuable.
The scraper searches for posts containing these 31 keywords/phrases:
| Category | Keywords |
|---|---|
| Wishes | "i wish", "wish there was", "why isn't there", "doesn't exist" |
| Frustration | "frustrated", "annoying", "so annoying", "drives me crazy", "hate when", "sucks", "terrible", "worst part" |
| Seeking | "looking for", "need a tool", "anyone know of", "is there a", "recommend a", "help me find", "what do you use for", "looking for something", "can't find" |
| Alternatives | "alternative to", "tired of" |
| Validation | "would pay for", "shut up and take my money", "deal breaker" |
| Feedback | "pain point", "struggle with", "feature request", "broken", "waste of time" |
💡 Suggest new keywords! If you know domain-specific phrases that signal pain points, open an issue or PR.
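The filter itself can be as simple as case-insensitive substring matching against that list. A sketch (the shipped scraper may instead use regexes or word-boundary matching, and the keyword list below is abridged):

```python
# Abridged sample of the 31-phrase list above -- not the full set.
PAIN_KEYWORDS = [
    "i wish", "wish there was", "why isn't there", "doesn't exist",
    "frustrated", "so annoying", "need a tool", "alternative to",
    "would pay for", "struggle with",
]

def matched_keywords(text: str) -> list[str]:
    """Return every pain-point phrase found in the text (case-insensitive)."""
    lowered = text.lower()
    return [kw for kw in PAIN_KEYWORDS if kw in lowered]

def looks_like_pain_point(text: str) -> bool:
    return bool(matched_keywords(text))
```

Substring matching is deliberately permissive: false positives are cheap here because every match is re-checked by the LLM analysis step, which discards posts with no real pain point.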
Discovered pain points are auto-categorized into:
Productivity · Developer Tools · Business · Communication · Finance · Health · Education · Marketing · Design · Data & Analytics · Automation · Other
- Scheduled Scraping — Cron-based scraping with configurable intervals
- Alerts — Email/Slack notifications when high-scoring opportunities are found
- Multi-Platform — Extend to Hacker News, Twitter/X, Indie Hackers, Product Hunt
- Trend Detection — Track pain points over time, detect emerging trends
- Custom Keywords — User-defined keyword lists for domain-specific discovery
- Team Features — Bookmarks, notes, and collaboration on opportunities
- Analytics Dashboard — Charts and visualizations for scraping history
- Backend: Python, FastAPI, PRAW (optional), SQLite
- Frontend: React 19, Vite, Tailwind CSS
- Analysis: Claude (Anthropic) or Gemini (Google) — auto-detected
Contributions are welcome! See CONTRIBUTING.md for guidelines.
MIT — see LICENSE for details.