Closed
Labels
enhancement (New feature or request)
Description
I would like the whole system to run locally, with no reliance on external APIs, using self-hosted LM Studio and Ollama models.
I am happy to accept that this may already be possible; if so, a simple guide would be fantastic.
I love the goal of the project, Rust is an excellent choice.
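For context on what "run locally" would mean in practice: both Ollama and LM Studio expose OpenAI-compatible HTTP endpoints on localhost, so in principle any provider layer that can take a custom base URL could target them. A minimal sketch, assuming Ollama's documented default port (11434; LM Studio's default is 1234) and an illustrative model name:

```shell
# Sketch only: point an OpenAI-compatible client at a local server.
# BASE_URL and the model name "llama3" are assumptions, not project config.
BASE_URL="http://localhost:11434/v1"   # LM Studio default: http://localhost:1234/v1
PAYLOAD='{"model": "llama3", "messages": [{"role": "user", "content": "Hello"}]}'
echo "POST $BASE_URL/chat/completions"
echo "$PAYLOAD"
# With a local server actually running, the request would be:
# curl "$BASE_URL/chat/completions" -H "Content-Type: application/json" -d "$PAYLOAD"
```

Local servers generally ignore the API key, so a placeholder value is enough on that side; whether this project's driver layer accepts a custom base URL is exactly what the guide would need to cover.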
Alternatives Considered
I have tried removing all references to subscription services, yet I still get "Streaming setup failed: Boot failed: Agent LLM driver init failed: Missing API key: Set GROQ_API_KEY environment variable for provider 'groq'" messages. I have never used or referenced Groq, in this project or anywhere else.
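Since the error names 'groq' as the active provider despite it never being configured here, it likely comes from a compiled-in default rather than anything in this checkout. A quick diagnostic sketch to confirm that (demonstrated on a temp directory; run the same grep from the real project root):

```shell
# Diagnostic sketch: count files mentioning "groq" in a directory tree.
# The temp dir and example.toml below are purely illustrative stand-ins
# for a project checkout; they are not files from this project.
tmp=$(mktemp -d)
printf 'default_provider = "groq"\n' > "$tmp/example.toml"
hits=$(grep -rli "groq" "$tmp" | wc -l)
echo "files mentioning groq: $hits"
rm -rf "$tmp"
```

If the same search over the actual checkout finds nothing, that would support the default living in the binary itself, which is useful detail for whoever picks this issue up.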
Additional Context
No response