
Ollama compatible? #687

Masber asked this question in Q&A · Oct 7, 2024
Answered by ha100 · Closed

Sure. A fully functional lazy.nvim config looks like this:

return {
    "yetone/avante.nvim",
    event = "VeryLazy",
    lazy = false,
    version = false, -- set this if you want to always pull the latest change
    opts = {
        provider = "ollama",
        use_absolute_path = true,
        vendors = {
            ---@type AvanteProvider
            ollama = {
                ["local"] = true, -- "local" is a Lua keyword, hence the bracket syntax
                endpoint = "http://localhost:11434/v1", -- Ollama's OpenAI-compatible API
                model = "codellama:7b-instruct",
                parse_curl_args = function(opts, code_opts)
                    return {
                        url = opts.endpoint .. "/chat/completions",
                        headers = {
    …

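The preview above cuts off inside parse_curl_args. A plausible completion, following the OpenAI-compatible vendor pattern avante.nvim uses for custom providers (the header names, body fields, and the parse_response_data delegation below are a hedged sketch, not a verbatim copy of the truncated answer):

parse_curl_args = function(opts, code_opts)
    return {
        url = opts.endpoint .. "/chat/completions",
        headers = {
            ["Accept"] = "application/json",
            ["Content-Type"] = "application/json",
        },
        body = {
            model = opts.model,
            -- assumed helper: avante's bundled providers expose message parsing
            messages = require("avante.providers").copilot.parse_messages(code_opts),
            max_tokens = 2048,
            stream = true,
        },
    }
end,
-- assumed: delegate stream-chunk parsing to the built-in OpenAI handler,
-- since Ollama's /v1 endpoint speaks the OpenAI streaming format
parse_response_data = function(data_stream, event_state, opts)
    require("avante.providers").openai.parse_response(data_stream, event_state, opts)
end,

With this in place, Ollama needs to be running locally (ollama serve) with the model pulled (ollama pull codellama:7b-instruct) before avante.nvim can reach the endpoint.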
Answer selected by aarnphm