FastAPI-LLM is an advanced, Dockerized Python project that integrates Large Language Models (LLMs) with a FastAPI backend. It demonstrates how to leverage AI to perform intelligent automation tasks and exposes APIs that can be consumed by other applications.
This project is production-ready in structure, containerized with Docker, and designed for deployment on cloud platforms like AWS.
- FastAPI Backend: Lightweight, high-performance web API framework.
- LLM Integration: Uses AI models to perform automation and intelligent processing.
- Dockerized: Easily run the app locally or in cloud environments without dependency conflicts.
- Cloud-Ready: Fully compatible with AWS ECS / EC2 deployment.
- Modular Architecture: Clean folder structure for scalability and maintainability.
- Health Endpoint: Check service status with a simple `/health` route.
```
FastAPI-LLM/
│── app/
│   ├── api/          # API route definitions
│   ├── core/         # Core utilities and configurations
│   ├── models/       # Data and ML model classes
│   ├── services/     # Business logic and AI services
│   └── main.py       # FastAPI app entrypoint
│
│── Dockerfile
│── requirements.txt
│── README.md
│── .gitignore
```
| Layer | Technology |
|---|---|
| Backend API | FastAPI |
| Language Model | OpenAI / LLM API |
| Containerization | Docker |
| Cloud Deployment | AWS (ECS/EC2/S3) |
| Configuration | Python-dotenv / YAML |
| CI/CD (future) | GitHub Actions / GHCR |
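As a sketch of the configuration layer, settings can be read from environment variables (python-dotenv would populate these from `.env`; the variable names `OPENAI_API_KEY` and `MODEL_NAME` and the default model are assumptions, not part of this project):

```python
import os
from dataclasses import dataclass


@dataclass
class Settings:
    openai_api_key: str
    model_name: str


def load_settings() -> Settings:
    """Read configuration from environment variables (e.g. loaded from .env)."""
    return Settings(
        openai_api_key=os.getenv("OPENAI_API_KEY", ""),
        # Hypothetical default; replace with whichever model the service targets.
        model_name=os.getenv("MODEL_NAME", "gpt-4o-mini"),
    )
```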
- Python 3.10+
- Docker installed locally
- Optional: OpenAI API key for LLM integration
```bash
# Build the Docker image
docker build -t fastapi-llm:latest .

# Run the container
docker run -p 8000:8000 fastapi-llm:latest
```

The API health check is then available at http://localhost:8000/health.

- Ensure your OpenAI API key or other LLM credentials are configured in `.env`.
- Call the AI-powered endpoints via POST requests with your input payload.
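A POST call to one of the AI-powered endpoints could look like the following stdlib-only sketch. The `/predict` path and the `{"prompt": ...}` payload shape are assumptions; adjust them to the actual routes under `app/api/`:

```python
import json
import urllib.request

# Hypothetical endpoint name; the real route lives under app/api/.
API_URL = "http://localhost:8000/predict"


def build_request(prompt: str) -> urllib.request.Request:
    """Build a JSON POST request carrying the input payload."""
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )


if __name__ == "__main__":
    with urllib.request.urlopen(build_request("Summarize this text")) as resp:
        print(json.load(resp))
```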
- The Dockerized project can be deployed on the AWS Free Tier using ECS or EC2.
- Cloud storage (like S3) can be used for storing outputs or logs.
- CI/CD (GitHub Actions) can automate building and pushing Docker images to GHCR.
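A sketch of such a workflow, assuming images are published to GHCR under this repository's path (workflow name and action versions are illustrative):

```yaml
name: build-and-push

on:
  push:
    branches: [main]

jobs:
  docker:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          push: true
          tags: ghcr.io/${{ github.repository }}:latest
```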
- Add authentication & authorization to API routes
- Implement automatic versioned Docker builds via CI/CD
- Extend AI services with multiple LLM models and pipelines
- Integrate logging & monitoring for production readiness
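The authentication item could start from a plain API-key check that a FastAPI dependency can later wrap (a stdlib-only sketch; `SERVICE_API_KEY` is a hypothetical variable name):

```python
import hmac
import os


def verify_api_key(provided: str) -> bool:
    """Constant-time comparison against the key configured in the environment."""
    expected = os.getenv("SERVICE_API_KEY", "")
    return bool(expected) and hmac.compare_digest(provided, expected)
```

Using `hmac.compare_digest` instead of `==` avoids leaking key length and content through timing differences.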
- Showcases full-stack backend development using Python + FastAPI
- Demonstrates AI/LLM integration in real-world scenarios
- Highlights cloud deployment readiness with Docker & AWS
- Illustrates modern development best practices: modular architecture, containerization, CI/CD
MIT License – free to use and modify