From 3eb6f8f4b4d5b7f96053327a03f56c6d3b80df69 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Erik=20Bj=C3=A4reholt?=
Date: Tue, 29 Oct 2024 20:35:09 +0100
Subject: [PATCH] docs: fixed incorrect local/ollama/... provider prefix

---
 docs/providers.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/providers.rst b/docs/providers.rst
index cbf00e87..d67fe919 100644
--- a/docs/providers.rst
+++ b/docs/providers.rst
@@ -10,7 +10,7 @@ To select a provider and model, run ``gptme`` with the ``--model`` flag set to `
 
     gptme --model openai/gpt-4o "hello"
     gptme --model anthropic "hello"  # if model part unspecified, will fall back to the provider default
     gptme --model openrouter/meta-llama/llama-3.1-70b-instruct "hello"
-    gptme --model local/ollama/llama3.2:1b "hello"
+    gptme --model local/llama3.2:1b "hello"
 
 On first startup, if `--model` is not set, and no API keys are set in the config or environment it will be prompted for. It will then auto-detect the provider, and save the key in the configuration file.
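
Note on the corrected format: the ``--model`` value is a provider prefix, optionally followed by a model name, e.g. ``local/llama3.2:1b`` rather than ``local/ollama/llama3.2:1b``. As a rough illustration only, here is a minimal sketch of splitting such a string on its first slash; ``parse_model_flag`` is a hypothetical helper and an assumption about the parsing scheme, not gptme's actual implementation::

    # Hypothetical sketch, not gptme's real parsing code.
    def parse_model_flag(value: str) -> tuple[str, str | None]:
        """Split a --model value like "provider/model" into (provider, model).

        If only the provider is given (e.g. "anthropic"), model is None and
        the caller would fall back to the provider's default model.
        """
        provider, _, model = value.partition("/")
        return provider, model or None

    assert parse_model_flag("local/llama3.2:1b") == ("local", "llama3.2:1b")
    assert parse_model_flag("anthropic") == ("anthropic", None)
    assert parse_model_flag("openrouter/meta-llama/llama-3.1-70b-instruct") == (
        "openrouter",
        "meta-llama/llama-3.1-70b-instruct",
    )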