
Error when trying to convert a HF model which is a LoRA PEFT fine-tuned version of phi-128k #7287

Closed
swarnava112 opened this issue May 14, 2024 · 2 comments

Comments

@swarnava112

[image: screenshot of the error]

I am trying to convert a HF model I fine-tuned; the model is an adapter model for microsoft phi-128k. I do not know what arch I should specify, as I did not see "phi" as an arch option when I looked into the code.

Any help is appreciated.

@arnfaldur

llama.cpp does not currently support phi-3-128k (#6849).
The work on adding it is happening in #7225.

Please try searching a little before opening an issue. Feel free to close this one as well, since it's already being tracked.

@ngxson
Collaborator

ngxson commented May 14, 2024

The convert-lora-to-ggml script has been deprecated and has already been removed: #7204

@ngxson ngxson closed this as not planned May 14, 2024