# Fastmail Ollama CLI

This Python project processes emails from the Fastmail API and summarizes them or generates replies using a local Ollama instance.

## Features

- Fetch and list emails from your Fastmail inbox.
- Clean email content by removing HTML and CSS to prepare it for processing.
- Summarize emails using the Ollama API.
- Generate polite and professional replies.
- Fully interactive CLI for selecting and processing emails.
- Automatically save Ollama-generated replies as drafts in your Fastmail account.
- Tag emails with labels based on Ollama's analysis.
- Automatically or manually move emails to folders.
- Delete emails directly from the CLI.
- Process multiple emails at once for summarization, replies, or folder assignment.
- Generate tasks from email content and export to external tools like Todoist or Trello.
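The HTML/CSS cleanup mentioned above could be done with the standard library's `html.parser`. This is an illustrative sketch only, not the repo's actual cleaner, and `clean_email_html` is a hypothetical name:

```python
from html.parser import HTMLParser


class _TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of <style> and <script>."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # >0 while inside a style/script element

    def handle_starttag(self, tag, attrs):
        if tag in ("style", "script"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("style", "script") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)


def clean_email_html(html: str) -> str:
    """Strip tags and embedded CSS, collapsing runs of whitespace."""
    parser = _TextExtractor()
    parser.feed(html)
    return " ".join(" ".join(parser.parts).split())
```

For example, `clean_email_html("<style>p{color:red}</style><p>Hello <b>world</b></p>")` yields the plain text `Hello world`, ready to be sent to the model.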
## Requirements

- Python 3.7 or later
- Fastmail API credentials
- Access to an Ollama instance
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/jakeandco/fastmail-ollama-cli.git
   cd fastmail-ollama-cli
   ```

2. Create a virtual environment in the project directory:

   ```bash
   python -m venv venv
   ```

3. Activate the virtual environment:

   - Windows:

     ```powershell
     .\venv\Scripts\activate
     ```

   - macOS/Linux:

     ```bash
     source venv/bin/activate
     ```

4. Install the required dependencies using pip:

   ```bash
   pip install -r requirements.txt
   ```

5. Create a `.env` file in the project directory to store sensitive credentials:

   ```bash
   touch .env
   ```

6. Add the following variables to `.env`:

   ```
   API_TOKEN=your_fastmail_api_token
   ACCOUNT_ID=your_account_id
   API_URL=https://api.fastmail.com/jmap/api/
   OLLAMA_URL=http://ollama:11434/api/generate
   ```
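To sanity-check your setup, you can verify that all four variables are present before running the script. The repo's actual loading code is not shown here; this is a minimal stdlib sketch that assumes the variables are available in the process environment (e.g. exported, or loaded by a tool such as python-dotenv):

```python
import os

# The four settings the CLI expects, per the .env example above.
REQUIRED_VARS = ("API_TOKEN", "ACCOUNT_ID", "API_URL", "OLLAMA_URL")


def load_config(env=os.environ) -> dict:
    """Return the required settings, failing fast if any are missing."""
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: env[name] for name in REQUIRED_VARS}
```

Failing fast here gives a clearer error than a later authentication failure deep inside an API call.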
## Usage

Run the script to start the interactive email processor:

```bash
python main.py
```

### Interactive CLI Features
- Lists emails from your Fastmail inbox.
- Allows you to select an email to summarize or reply to.
- Generates replies or summaries using the Ollama API.
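The summarization step talks to Ollama's `/api/generate` endpoint, which accepts a JSON body with `model`, `prompt`, and `stream` fields. The sketch below builds such a request with the standard library; the prompt wording and the default model name are assumptions, not taken from this repo:

```python
import json
import urllib.request


def build_summary_request(ollama_url: str, email_text: str,
                          model: str = "llama3") -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    The model name is an assumption; use whichever model your Ollama
    instance has pulled.
    """
    payload = {
        "model": model,
        "prompt": f"Summarize the following email:\n\n{email_text}",
        "stream": False,  # request a single JSON response instead of chunks
    }
    return urllib.request.Request(
        ollama_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# The summary text would then be read from the "response" field:
# with urllib.request.urlopen(build_summary_request(url, text)) as resp:
#     summary = json.load(resp)["response"]
```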
## Updating Dependencies

To update or add dependencies, install the package and update requirements.txt:

```bash
pip install <package_name>
pip freeze > requirements.txt
```

## Troubleshooting

- Missing Module Error: Ensure the virtual environment is activated and dependencies are installed.
- API Errors: Verify the credentials in your .env file and confirm that the Ollama endpoint in OLLAMA_URL is reachable from your machine.
## License

This project is licensed under the MIT License.