[BUG] Issue with Ollama and context. #1026
Comments
What embedder do you have configured?

Any update, @mc9625?

Hi all, same problem here! Can we help debug?

Set a proper embedder and test it in the memory page in the admin.
I had the same problem, and the reason was the score being below 0.7—even though, to a "human" eye, the question was perfectly relevant to the document (it got a score of 0.6). By adding more detail to the input/question, the score increased and the system correctly included it in the context. So it was simply an embedding model issue in my case (easily debugged in the memory panel). I tried mxbai-embed-large and nomic-embed-text.

I could work on a plugin that includes some context regardless of the threshold (something like: "always include at least N chunks of declarative memories"). High risk of hallucinations, but at least on-topic, I think. Useful when you have few documents, as I did in my playground and first tests.

In addition: does it make sense to allow modifying the threshold for declarative memories in the embedder settings? (I can work on it if you agree.)
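The fallback proposed in the comment above ("always include at least N chunks of declarative memories") can be sketched in plain Python. This is a hypothetical illustration, not the actual Cheshire Cat plugin API: the function name, the 0.7 default, and the `(text, score)` tuple shape are all assumptions for the example.

```python
def select_chunks(scored_chunks, threshold=0.7, min_chunks=2):
    """Keep chunks whose similarity score is at or above `threshold`,
    but always return at least `min_chunks` of the best-scoring ones,
    even if their scores fall below the threshold.

    `scored_chunks` is a list of (text, score) tuples.
    """
    # Rank all chunks from highest to lowest similarity score.
    ranked = sorted(scored_chunks, key=lambda chunk: chunk[1], reverse=True)
    above = [chunk for chunk in ranked if chunk[1] >= threshold]
    if len(above) >= min_chunks:
        return above
    # Fallback: pad with the next-best chunks, accepting below-threshold
    # scores so the prompt is never left without declarative context.
    return ranked[:min_chunks]


chunks = [("relevant doc", 0.81), ("borderline doc", 0.62), ("noise", 0.35)]
print(select_chunks(chunks))
```

With the default 0.7 threshold, only "relevant doc" passes, so the fallback pads the result with "borderline doc" (score 0.62) to reach the two-chunk minimum—mirroring the reporter's situation where a 0.6-scoring chunk was silently dropped.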
Have you checked the C.A.T. — Cat Advanced Tools plugin? It allows you to configure thresholds and K results for the three main memory collections, plus other useful settings for this type of implementation.

P.S.: are you the guy from Gitbar? Big fan of you guys! Keep it up! 🚀
Yo'h. I was looking for this. Thank you! ❤️
Describe the bug
I have installed the latest version (1.8.1) on two brand-new setups: Ubuntu 24.0.1 and Raspbian OS. On both I have installed Ollama as a local service (not a Docker container). Everything works fine when I chat (e.g. I get a proper reply from the model), but no context information is passed to the prompt: the # Context part of the SystemMessage is missing. I have checked the memory page and I am able to find declarative memories there.
To Reproduce
Steps to reproduce the behavior: