Prompt Optimizer

A sophisticated tool that takes vague user requests and synthesizes optimal system/assistant prompts to get better responses from Large Language Models (LLMs).

Features

  • Multi-Provider Support: Works with OpenAI, Anthropic (Claude), and Google Gemini APIs
  • Intelligent Prompt Synthesis: Uses advanced prompt engineering techniques including:
    • System prompt optimization
    • Few-shot prompting
    • Chain-of-thought reasoning
  • Automatic Optimization: Analyzes vague requests and creates structured, effective prompts
  • CLI Interface: Easy-to-use command-line tool

Installation

  1. Clone this repository:

```bash
git clone https://github.com/Claire56/PromptOptimiser.git
cd PromptOptimiser
```

  2. Install dependencies:

```bash
pip install -r requirements.txt
```

  3. Set up environment variables by creating a .env file in the root directory:

```bash
# For OpenAI
OPENAI_API_KEY=your_openai_api_key_here

# For Anthropic
ANTHROPIC_API_KEY=your_anthropic_api_key_here

# For Google Gemini
GOOGLE_API_KEY=your_google_api_key_here

# Default provider (openai, anthropic, or gemini)
DEFAULT_PROVIDER=openai
```
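The configuration above can be resolved at runtime roughly as follows. This is a minimal stdlib-only sketch: the variable names match the .env example, but `resolve_provider` is illustrative, not the repository's actual API; loading the .env file itself is typically done with the python-dotenv package (`load_dotenv()`).

```python
# Sketch: picking a provider from environment variables.
# Names mirror the .env example above; `resolve_provider` is hypothetical.
import os

PROVIDER_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GOOGLE_API_KEY",
}

def resolve_provider(name=None):
    """Return the provider to use, falling back to DEFAULT_PROVIDER."""
    provider = (name or os.getenv("DEFAULT_PROVIDER", "openai")).lower()
    if provider not in PROVIDER_KEYS:
        raise ValueError(f"unknown provider: {provider!r}")
    if not os.getenv(PROVIDER_KEYS[provider]):
        raise RuntimeError(f"{PROVIDER_KEYS[provider]} is not set")
    return provider
```

Validating the API key up front gives a clear error before any network call is made.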

Usage

Basic Usage

```bash
python prompt_optimizer.py "help me write a blog post"
```

Specify Provider

```bash
python prompt_optimizer.py "analyze this data" --provider anthropic
```

Advanced Options

```bash
python prompt_optimizer.py "your vague request" --provider openai --model gpt-4 --temperature 0.7
```
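The flags above can be wired up with argparse along these lines. This is a hypothetical sketch of the CLI layer (cf. prompt_optimizer.py); only the flag names come from the examples above, the defaults and help strings are assumptions.

```python
# Illustrative argparse setup mirroring the CLI flags shown above.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(
        description="Optimize a vague request into an effective prompt.")
    parser.add_argument("request", help="the vague user request to optimize")
    parser.add_argument("--provider", default=None,
                        choices=["openai", "anthropic", "gemini"],
                        help="LLM provider (default: DEFAULT_PROVIDER from .env)")
    parser.add_argument("--model", default=None,
                        help="model name, e.g. gpt-4")
    parser.add_argument("--temperature", type=float, default=0.7,
                        help="sampling temperature")
    return parser
```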

How It Works

  1. Input Analysis: Takes your vague user request
  2. Prompt Synthesis: Uses an LLM to analyze the request and create an optimal system prompt with:
    • Clear role definition
    • Specific instructions
    • Output format guidelines
    • Context and constraints
  3. Response Generation: Uses the optimized prompt to generate a high-quality response
  4. Output: Returns both the optimized prompt and the final response
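The four steps above amount to two LLM calls: one to synthesize the prompt, one to answer under it. A minimal sketch, assuming a generic `llm_call(system, user)` completion function; `optimize` and `META_PROMPT` are illustrative names, not the repository's actual API.

```python
# Illustrative sketch of the four-step pipeline described above.
# `optimize`, `llm_call`, and META_PROMPT are hypothetical names.

META_PROMPT = (
    "You are a prompt engineer. Given a vague user request, write an "
    "optimal system prompt with a clear role definition, specific "
    "instructions, output format guidelines, and relevant context and "
    "constraints. Return only the system prompt."
)

def optimize(request, llm_call):
    """Run the optimize-then-answer pipeline.

    llm_call(system, user) -> str is any LLM completion function.
    Returns (optimized_prompt, final_response).
    """
    # Steps 1-2: analyze the vague request and synthesize a system prompt.
    optimized_prompt = llm_call(META_PROMPT, request)
    # Step 3: answer the original request under the optimized prompt.
    response = llm_call(optimized_prompt, request)
    # Step 4: return both artifacts.
    return optimized_prompt, response
```

Passing `llm_call` in as a parameter keeps the pipeline independent of any one provider SDK.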

Example

Input:

```
"help me write something about AI"
```

Optimized System Prompt:

```
You are an expert technical writer specializing in artificial intelligence.
Your task is to create engaging, informative content about AI topics.

Guidelines:
- Write in a clear, accessible style suitable for a general audience
- Include relevant examples and use cases
- Structure content with clear headings and sections
- Provide actionable insights

Output format: Well-structured article with introduction, main content, and conclusion.
```

Final Response: A well-structured article about AI based on the optimized prompt.

Supported Providers

  • OpenAI: GPT-3.5, GPT-4, GPT-4 Turbo
  • Anthropic: Claude 3 Opus, Claude 3 Sonnet, Claude 3 Haiku
  • Google: Gemini Pro, Gemini Pro Vision
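A provider abstraction over these three SDKs might look like the sketch below (cf. optimizer/providers.py). The SDK calls follow the public openai, anthropic, and google-generativeai client interfaces; the wrapper names and defaults are assumptions, and imports are deferred so only the selected provider's SDK needs to be installed.

```python
# Hypothetical provider dispatch layer; wrapper names are illustrative.

def _call_openai(system, user, model="gpt-4", temperature=0.7):
    from openai import OpenAI
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model=model,
        temperature=temperature,
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

def _call_anthropic(system, user, model="claude-3-sonnet-20240229",
                    temperature=0.7):
    from anthropic import Anthropic
    client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    resp = client.messages.create(
        model=model, max_tokens=1024, temperature=temperature,
        system=system,  # Anthropic takes the system prompt as a parameter
        messages=[{"role": "user", "content": user}],
    )
    return resp.content[0].text

def _call_gemini(system, user, model="gemini-pro", temperature=0.7):
    import google.generativeai as genai
    gm = genai.GenerativeModel(model)
    # Gemini Pro has no separate system role, so prepend the system prompt.
    resp = gm.generate_content(
        f"{system}\n\n{user}",
        generation_config={"temperature": temperature},
    )
    return resp.text

PROVIDERS = {
    "openai": _call_openai,
    "anthropic": _call_anthropic,
    "gemini": _call_gemini,
}

def complete(provider, system, user, **kwargs):
    """Dispatch a completion request to the named provider."""
    if provider not in PROVIDERS:
        raise ValueError(f"unsupported provider: {provider!r}")
    return PROVIDERS[provider](system, user, **kwargs)
```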

Project Structure

```
PromptOptimiser/
├── prompt_optimizer.py      # Main CLI interface
├── optimizer/
│   ├── __init__.py
│   ├── core.py              # Core optimization logic
│   └── providers.py         # LLM provider integrations
├── requirements.txt
├── README.md
└── .env                     # Environment variables (create this)
```

License

MIT License
