Replies: 1 comment
-
I believe you are missing something in your configuration. So it should be:

```nix
provider = "ollama";
vendors = {
  deepseek = {
    __inherited_from = "openai";
    api_key_name = "";
    endpoint = "http://localhost:11434/v1";
    model = "deepseek-coder-v2";
  };
};
```
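For context, here is a rough sketch of how that block might sit in a nixvim-based config. This assumes you are using nixvim's `plugins.avante` module and its `settings` option (an assumption on my part); if you configure Avante through raw Lua instead, the same table goes into `require("avante").setup`:

```nix
# Hypothetical nixvim snippet -- the module path and option names are assumptions,
# adjust to match your actual setup.
programs.nixvim = {
  enable = true;
  plugins.avante = {
    enable = true;
    settings = {
      provider = "ollama";
      vendors = {
        deepseek = {
          __inherited_from = "openai";            # reuse the OpenAI-compatible request shape
          api_key_name = "";                      # a local Ollama instance needs no API key
          endpoint = "http://localhost:11434/v1"; # Ollama's OpenAI-compatible route
          model = "deepseek-coder-v2";            # must match a name from `ollama list`
        };
      };
    };
  };
};
```

The detail that usually matters here is the trailing `/v1`: Ollama exposes its OpenAI-compatible API under that prefix, and pointing Avante at the bare `http://localhost:11434` root is a common way to end up with a 404.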
-
This is my configuration for Avante (in Nix, as I am using NixOS), trying to use the local Deepseek model that I have running with Ollama. When I go to http://localhost:11434 in my browser, it shows me that Ollama is running, and I can see the Ollama service when I check btop. But when I open the Avante ask UI panel and enter a question, I get a 404 error.
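(As a point of reference, a minimal sketch of running Ollama declaratively on NixOS, assuming the stock `services.ollama` module from nixpkgs; the actual setup here may differ:)

```nix
# Minimal sketch -- assumes nixpkgs' services.ollama module.
services.ollama = {
  enable = true;  # runs the Ollama server as a systemd service
  # By default the server listens on 127.0.0.1:11434, which is the base
  # address that Avante's endpoint (http://localhost:11434/v1) builds on.
};
```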

And here is the output of `ollama list` showing the model is installed:

Has anyone had success getting a locally running Deepseek coder model via Ollama to work properly in Avante? I feel like I must be close, but am just missing something I don't see. I have been able to get Avante to use online models with an API endpoint, but not a local model yet.