An intelligent tool that automatically generates concise summaries of GitHub repository README files using AI
AI Agentic is a Node.js application that automatically fetches README files from multiple GitHub repositories, generates AI-powered summaries, and compiles them into a comprehensive markdown document. The tool supports multiple AI backends including OpenAI's GPT models and local Ollama instances.
- Multi-Repository Processing: Process multiple GitHub repositories in a single run
- AI-Powered Summarization: Generate concise, technical summaries using AI
- Multiple AI Backends: Support for OpenAI GPT models and local Ollama instances
- Flexible Output: Save summaries locally or push directly to GitHub
- Dry Run Mode: Test functionality without making changes
- Comprehensive Error Handling: Detailed logging and error reporting
- Environment-Based Configuration: Secure configuration management
The project consists of three main scripts, each serving different use cases:
| Script | Purpose | AI Backend | Output |
|---|---|---|---|
| `summarizer.js` | Basic OpenAI integration | OpenAI GPT | GitHub repository |
| `summarizer-local.js` | Local file output | OpenAI GPT | Local files |
| `summarizer-ollama.js` | Ollama integration | Local Ollama | GitHub repository or local |
- Node.js 18+
- pnpm (recommended) or npm
- GitHub Personal Access Token
- OpenAI API Key (for OpenAI scripts) or Ollama (for Ollama script)
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd ai-agentic
   ```

2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Configure environment variables:

   ```bash
   cp .env.example .env.local
   ```

4. Edit `.env.local` with your credentials:

   ```bash
   # GitHub Configuration
   GITHUB_TOKEN=your_github_personal_access_token
   GITHUB_TARGET_REPO=your-username/ai-docs

   # OpenAI Configuration (for OpenAI scripts)
   OPENAI_API_KEY=your_openai_api_key

   # Ollama Configuration (for Ollama script)
   OLLAMA_BASE_URL=http://localhost:11434
   OLLAMA_MODEL=llama2
   ```
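Inside the scripts, these variables are read from `process.env` after dotenv loads `.env.local`. The helper below is a hypothetical illustration (not part of the project) of reading a variable with an optional fallback to the documented defaults:

```js
// Hypothetical helper: read an environment variable, falling back to a
// default where one is documented, and fail fast when a required value
// is missing. The scripts load .env.local via dotenv before this runs.
function requireEnv(name, fallback) {
  const value = process.env[name] ?? fallback;
  if (value === undefined) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Optional variables fall back to their documented defaults:
const ollamaBaseUrl = requireEnv("OLLAMA_BASE_URL", "http://localhost:11434");
const ollamaModel = requireEnv("OLLAMA_MODEL", "llama2");
```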
The project provides several npm scripts for different use cases:
```bash
# Development - Local file output with OpenAI
pnpm run dev

# OpenAI - Push to GitHub repository
pnpm run openapi

# Ollama - Dry run (local output)
pnpm run ollama:dry

# Ollama - Push to GitHub repository
pnpm run ollama
```

| Variable | Description | Default | Required |
|---|---|---|---|
| `GITHUB_TOKEN` | GitHub Personal Access Token | - | Yes |
| `GITHUB_TARGET_REPO` | Target repository for summaries | `stephanbit/ai-docs` | No |
| `OPENAI_API_KEY` | OpenAI API key | - | For OpenAI scripts |
| `OLLAMA_BASE_URL` | Ollama server URL | `http://localhost:11434` | No |
| `OLLAMA_MODEL` | Ollama model name | `llama2` | No |
Edit the `SOURCE_REPOS` array in the scripts to specify which repositories to process:

```js
const SOURCE_REPOS = [
  "stephanbit/eze-cli",
  "stephanbit/micro-frontend-pages",
  "stephanbit/vue-cli-plugin-easycloud"
];
```

Generate summaries and save them locally:

```bash
pnpm run dev
```

This will:
- Fetch README files from configured repositories
- Generate AI summaries using OpenAI
- Save results to `summaries/README_SUMMARIES.md`
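The GitHub contents API returns README bodies base64-encoded, so the fetch step has to decode them. A minimal sketch, assuming `@octokit/rest`'s `repos.getReadme` (the decode helper itself is hypothetical):

```js
// The GitHub API delivers file contents as base64; decode to UTF-8 text.
const decodeReadme = (content) =>
  Buffer.from(content, "base64").toString("utf8");

// Hypothetical usage with a configured Octokit client:
// const { data } = await octokit.repos.getReadme({ owner, repo });
// const readme = decodeReadme(data.content);
```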
Generate summaries and push to a GitHub repository:
```bash
pnpm run openapi
```

This will:
- Process all source repositories
- Generate AI summaries
- Create or update the target GitHub repository
- Commit the summaries file
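The commit step maps onto Octokit's `repos.createOrUpdateFileContents`, which expects base64 content and, when updating an existing file, its current `sha`. A sketch of building that payload (function name, path, and commit message are illustrative assumptions):

```js
// Hypothetical payload builder for Octokit's
// repos.createOrUpdateFileContents; path and message are assumptions.
function buildCommitParams(targetRepo, markdown, sha) {
  const [owner, repo] = targetRepo.split("/");
  return {
    owner,
    repo,
    path: "README_SUMMARIES.md",
    message: "chore: update README summaries",
    content: Buffer.from(markdown, "utf8").toString("base64"),
    ...(sha ? { sha } : {}), // required only when updating an existing file
  };
}
```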
Test with local Ollama without pushing to GitHub:
```bash
pnpm run ollama:dry
```

This will:
- Use local Ollama instance for AI processing
- Save results locally
- Not make any GitHub changes
Use Ollama and push to GitHub:
```bash
pnpm run ollama
```

This will:
- Process repositories with local Ollama
- Automatically create target repository if needed
- Push summaries to GitHub
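Under the hood, summarizing with Ollama means POSTing to its `/api/generate` endpoint. A sketch of the request the script might build (the prompt wording is an illustrative assumption; the actual POST would go through axios):

```js
// Build a non-streaming request for Ollama's /api/generate endpoint.
// The prompt text here is illustrative, not the project's actual prompt.
function buildOllamaRequest(baseUrl, model, readme) {
  return {
    url: `${baseUrl}/api/generate`,
    body: {
      model,
      prompt: `Summarize this README concisely:\n\n${readme}`,
      stream: false, // return one JSON object instead of a token stream
    },
  };
}

// Hypothetical usage:
// const req = buildOllamaRequest(process.env.OLLAMA_BASE_URL, "llama2", readme);
// const { data } = await axios.post(req.url, req.body);
```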
```
ai-agentic/
├── summarizer.js           # Basic OpenAI → GitHub script
├── summarizer-local.js     # OpenAI → Local files script
├── summarizer-ollama.js    # Ollama → GitHub/Local script
├── package.json            # Dependencies and scripts
├── summaries/              # Local output directory
│   └── README_SUMMARIES.md # Generated summaries
└── README.md               # This file
```
- `@octokit/rest`: GitHub API client
- `axios`: HTTP client for AI API calls
- `dotenv`: Environment variable management
- `yargs`: Command-line argument parsing (Ollama script)
- New AI Backend: Create a new script following the existing pattern
- Custom Output Formats: Modify the `compileSummaries()` function
- Additional Repository Sources: Extend the repository fetching logic
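To make the output-format extension point concrete, here is a hypothetical shape for `compileSummaries()`: it takes per-repository summaries and joins them into one markdown document. The real function may differ in details:

```js
// Hypothetical sketch of compileSummaries(): turn an array of
// { repo, summary } objects into a single markdown document.
function compileSummaries(summaries) {
  const date = new Date().toISOString().slice(0, 10);
  const header = `# README Summaries\n\nGenerated ${date}\n`;
  const sections = summaries.map(
    ({ repo, summary }) => `\n## ${repo}\n\n${summary}\n`
  );
  return header + sections.join("");
}
```

A custom output format (HTML, JSON, per-repo files) would replace only this function while leaving the fetch and summarize steps untouched.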
**Target repository not found**

- Cause: Repository doesn't exist or insufficient permissions
- Solution: Ensure the target repository exists and your token has the `repo` scope

**OpenAI API errors**

- Cause: Invalid API key or model name
- Solution: Verify your API key and use a valid model (e.g., `gpt-4o-mini`)

**Ollama connection errors**

- Cause: Ollama not running or wrong URL
- Solution: Start Ollama and verify the base URL

**GitHub permission errors**

- Cause: GitHub token lacks required scopes
- Solution: Ensure the token has the `repo` and `public_repo` scopes
All scripts write detailed status updates and error messages to the console; check this output first when debugging.
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
This project is licensed under the ISC License - see the LICENSE file for details.
For issues and questions:
- Check the troubleshooting section above
- Review the error logs for specific details
- Open an issue on GitHub with detailed error information
Made with ❤️ for the open source community