Generate small apps with one prompt. Powered by the Gemini API and Model Context Protocol (MCP).
This project is a fork of InstantCoder with added support for connecting to MCP servers as alternative code generation sources.
- Dual Code Generation Sources: Choose between Gemini API and MCP servers
- MCP Server Management: Add, edit, and connect to any MCP server
- Beginner-Friendly Interface: Simple UI for managing MCP connections
- Real-time Code Generation: Stream code generation results as they're created
- Interactive Sandbox: Test generated code immediately in a Sandpack environment
- Docker Support: Easy deployment to cloud services like Railway
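The real-time streaming behavior can be illustrated with a minimal sketch: the UI consumes an async iterator of text chunks and appends each one as it arrives. The `streamChunks` generator below is a stand-in for the real Gemini/MCP stream, not the project's actual API:

```typescript
// Minimal sketch of consuming a streamed code-generation response.
// `streamChunks` stands in for the real Gemini/MCP stream (hypothetical).
async function* streamChunks(): AsyncGenerator<string> {
  const chunks = ["function hello() {\n", '  console.log("hi");\n', "}\n"];
  for (const chunk of chunks) {
    yield chunk;
  }
}

// Accumulate chunks exactly as a streaming UI would append them.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let code = "";
  for await (const chunk of stream) {
    code += chunk; // in the app, this would update React state per chunk
  }
  return code;
}
```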
The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. It allows for a clean separation between LLM applications and the tools, resources, and prompts they use. Think of MCP like a USB-C port for AI applications - it provides a standardized way for AI applications to connect to various resources and tools.
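Concretely, MCP messages are JSON-RPC 2.0: a client asks a server what tools it offers with a `tools/list` request. The field names below follow the MCP specification; the helper itself is illustrative, not part of this project:

```typescript
// MCP is built on JSON-RPC 2.0. A client discovers a server's tools by
// sending a "tools/list" request; the helper here is illustrative.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function makeListToolsRequest(id: number): JsonRpcRequest {
  return { jsonrpc: "2.0", id, method: "tools/list" };
}

// A conforming server replies with:
// { "jsonrpc": "2.0", "id": 1, "result": { "tools": [...] } }
```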
- Gemini API to use Gemini 1.5 Pro, Gemini 1.5 Flash, and Gemini 2.0 Flash Experimental
- Model Context Protocol (MCP) for connecting to MCP servers
- Zustand for state management
- Sandpack for the code sandbox
- Next.js app router with Tailwind
- Docker for containerization and deployment
- Clone the repo: `git clone https://github.com/vredrick2/InstantCoder-MCP`
- Create a `.env` file and add your Google AI Studio API key: `GOOGLE_AI_API_KEY=`
- Run `npm install` and `npm run dev` to install dependencies and run locally
- Docker and Docker Compose installed on your machine
- Clone the repo: `git clone https://github.com/vredrick2/InstantCoder-MCP`
- Create a `.env` file with your Google AI API key: `GOOGLE_AI_API_KEY=your_api_key_here`
- Build and start the container: `docker-compose up -d`
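For reference, a minimal `docker-compose.yml` for this setup might look like the following sketch; the service name and port mapping are assumptions, so check the repository's own file for the authoritative version:

```yaml
# Minimal docker-compose.yml sketch (illustrative; the repo's own file
# is authoritative).
services:
  app:
    build: .
    ports:
      - "3000:3000"   # Next.js default port, matching the steps above
    env_file:
      - .env          # supplies GOOGLE_AI_API_KEY
```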
- Access the application at http://localhost:3000
- Build the Docker image: `docker build -t instantcoder-mcp .`
- Run the container: `docker run -p 3000:3000 -e GOOGLE_AI_API_KEY=your_api_key_here instantcoder-mcp`
- Access the application at http://localhost:3000
Railway is a platform that makes it easy to deploy applications with minimal configuration.
- Fork this repository to your GitHub account
- Sign up for Railway
- Create a new project and select "Deploy from GitHub repo"
- Connect your GitHub account and select the InstantCoder-MCP repository
- Add the environment variable `GOOGLE_AI_API_KEY` with your Google AI API key
- Railway will automatically detect the Dockerfile and deploy your application
- Once deployed, Railway will provide you with a public URL to access your application
This project includes specific optimizations for Railway deployment:
- Database Configuration: The Prisma setup is configured to work without requiring a database during the build process, which prevents build failures in Railway.
- Build Process: The build script has been modified to separate Prisma client generation from database migrations, allowing successful builds even without a configured database.
- Environment Variables:
  - `GOOGLE_AI_API_KEY` (required): Your Google AI API key for Gemini
  - `DATABASE_URL` (optional): If you add a PostgreSQL service in Railway, this will be automatically set
- Adding a Database (Optional):
  - To enable saving generated apps, add a PostgreSQL service in Railway
  - Railway will automatically connect the database to your application
  - The first time you deploy with a database, you may need to run migrations manually: `railway run npm run migrate`
- Troubleshooting:
  - If you encounter build issues, check that your `GOOGLE_AI_API_KEY` is correctly set
  - For database-related errors, verify that your PostgreSQL service is properly provisioned
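The missing-key failure mode above can be caught at startup with a small guard. This helper is a hypothetical illustration, not code from the repo:

```typescript
// Fail fast when a required environment variable is absent or blank
// (hypothetical helper; the repo may handle this differently).
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value || value.trim() === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Calling `requireEnv("GOOGLE_AI_API_KEY")` early in startup turns a confusing build/runtime failure into an explicit error message.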
- Click the "MCP Server" option in the main interface
- Click the "+" button to add a new MCP server
- Enter the server details:
- Name: A friendly name for the server
- URL: The full URL to the MCP server endpoint
- Auth Type: None, API Token (OAuth coming soon)
- Click "Add Server" to save the connection
- Select the server from the dropdown to connect
- Once connected, enter your prompt and generate code
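The form fields above map naturally onto a small config type. The type and helper names below are illustrative sketches, not taken from the codebase:

```typescript
// Sketch of an MCP server entry matching the form fields above
// (names are illustrative, not from the codebase).
type AuthType = "none" | "api-token"; // OAuth noted as coming soon

interface McpServerConfig {
  name: string;      // friendly display name
  url: string;       // full URL of the MCP server endpoint
  authType: AuthType;
  apiToken?: string; // only meaningful when authType is "api-token"
}

function validateServerConfig(cfg: McpServerConfig): string[] {
  const errors: string[] = [];
  if (!cfg.name.trim()) errors.push("Name is required");
  try {
    new URL(cfg.url); // must be a parseable absolute URL
  } catch {
    errors.push("URL is not valid");
  }
  if (cfg.authType === "api-token" && !cfg.apiToken) {
    errors.push("API token is required for api-token auth");
  }
  return errors;
}
```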
This implementation is compatible with any MCP server that follows the Model Context Protocol specification. The integration will automatically detect and use code generation tools available on the connected server.
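The automatic detection can be sketched as a filter over the server's advertised tool list. The keyword heuristic below is an assumption for illustration, not the project's actual matching logic:

```typescript
// Sketch: pick a code-generation tool from an MCP server's tool list.
// The keyword heuristic is illustrative; the real integration may match
// on other metadata.
interface McpTool {
  name: string;
  description?: string;
}

function findCodeGenTool(tools: McpTool[]): McpTool | undefined {
  const keywords = ["generate", "code", "write"];
  return tools.find((tool) => {
    const text = `${tool.name} ${tool.description ?? ""}`.toLowerCase();
    return keywords.some((kw) => text.includes(kw));
  });
}
```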
The MCP integration consists of several key components:
- MCP Client Service: Core service for connecting to and interacting with MCP servers
- MCP Server Store: State management for server configurations
- MCP API Endpoint: Backend API for handling code generation requests
- UI Components: User interface for managing MCP server connections
Contributions are welcome! Feel free to submit issues or pull requests to improve the MCP integration.
This project is licensed under the same terms as the original InstantCoder project.
This is a personal project and not an official Google project.