Interactive mode/Session #7
Hey there, sorry if this is obviously possible/easy, but is it possible right now to use llama.cpp's interactive mode, and if so, how?

Comments
Hey, yes that's certainly possible. I don't have an example yet, but this can be done through the …
Yeah, I was confused because generate only took the minimal vars. I thought about hacking an interactive mode into Llama's default call itself, something like this: …
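A minimal sketch of the kind of loop this could look like, built on llama-cpp-python's high-level `Llama` class; the model path, prompt template, and stop string are illustrative assumptions, not anything the library prescribes:

```python
# Sketch: a naive interactive loop on top of the high-level API.
# The model path, prompt format, and stop string are assumptions
# for illustration only.
from llama_cpp import Llama

llm = Llama(model_path="./models/7B/ggml-model.bin")  # hypothetical path

history = "A chat between a curious user and a helpful assistant.\n"
while True:
    user_input = input("> ")
    history += f"User: {user_input}\nAssistant:"
    output = llm(history, max_tokens=256, stop=["User:"])
    reply = output["choices"][0]["text"]
    print(reply.strip())
    history += reply + "\n"
```

The catch, as discussed further down in the thread, is that each call re-submits the whole growing history as the prompt, and continuing past a stop word has subtleties of its own.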
I have tried to port the original main from llama.cpp to Python here: https://gist.github.com/SagsMug/c39d5f337b0e0faf32b3ff6bc5bf4956
Is a session mode wanted in this library? (I'm calling it a "session" here because an interactive mode, where user interaction like Ctrl+C is required, probably isn't wanted in a Python lib.)
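For concreteness, here is one hedged sketch of what such a session mode could look like as a thin wrapper around the high-level API; `LlamaSession` and `say()` are invented names for illustration, not part of llama-cpp-python:

```python
# Hypothetical session wrapper: keeps the conversation transcript in
# Python and replays it on every completion call. LlamaSession and
# say() are invented names for illustration.
from llama_cpp import Llama


class LlamaSession:
    def __init__(self, model_path: str, system_prompt: str = ""):
        self.llm = Llama(model_path=model_path)
        self.transcript = system_prompt  # conversation accumulated so far

    def say(self, user_input: str, max_tokens: int = 256) -> str:
        # Append the user's turn, generate, and keep the reply so the
        # next call continues the same conversation.
        self.transcript += f"\nUser: {user_input}\nAssistant:"
        out = self.llm(self.transcript, max_tokens=max_tokens, stop=["User:"])
        reply = out["choices"][0]["text"]
        self.transcript += reply
        return reply.strip()


# Usage (the model path is a placeholder):
# session = LlamaSession("./models/7B/ggml-model.bin")
# print(session.say("Hello, who are you?"))
```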
It could be an example of using it as a chat. :) Thanks for posting; it's similar to my personal hacked-together version, but I didn't bother to use type hinting. Maybe I will use your "wrapper".
This is cool, I'd like to include Python ports of the …
Ah, gotcha. The reason I haven't implemented continuation / interactive mode in the call function is that there are some bugs with continuing a conversation that ended in a stop word (technically, the model has seen the stop word even if you haven't returned the text). There was a recent update to the …
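To illustrate the pitfall (a hedged sketch, assuming the high-level API): the stop string is excluded from the returned text, so a continuation built naively from prompt plus returned text no longer matches what the model actually emitted. One illustrative workaround is to re-insert the stop string yourself:

```python
# Sketch of the stop-word continuation pitfall. The stop string is
# stripped from the returned text, so re-inserting it keeps the
# continuation prompt consistent with what the model generated.
from llama_cpp import Llama

llm = Llama(model_path="./models/7B/ggml-model.bin")  # hypothetical path

prompt = "Q: Name three colors.\nA:"
out = llm(prompt, max_tokens=64, stop=["\nQ:"])
answer = out["choices"][0]["text"]  # does NOT include the "\nQ:" stop string

# Continue by re-inserting the stop string explicitly, so the text the
# model is conditioned on matches what it actually saw.
prompt = prompt + answer + "\nQ: Name three animals.\nA:"
out = llm(prompt, max_tokens=64, stop=["\nQ:"])
```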
@BlackLotus this is now implemented in the examples: https://github.com/abetlen/llama-cpp-python/blob/main/examples/low_level_api/low_level_api_chat_cpp.py