Update lora.py to fix the AttributeError: can't set attribute issue #1

Open — wants to merge 20 commits into base `support_peft`
Update README.md
SuperBruceJia authored Dec 17, 2023

Verified — this commit was created on GitHub.com and signed with GitHub's verified signature (the key has expired).
commit eec3a92860bcf935cf58ff5ce03e0acaed4872a8
37 changes: 37 additions & 0 deletions README.md
@@ -16,8 +16,45 @@ vLLM with LoRA support

---

## Compile and install from source

Clone the `support_peft` branch and install it as an editable package:

```shell
git clone --branch support_peft https://github.com/SuperBruceJia/vllm.git
cd vllm
pip install -e . --user
```

## Set up an LLM with LoRA
Please note that this is just a demo!
```python
from vllm import LLM, SamplingParams
from vllm.model_executor.adapters import lora
from vllm.model_executor.parallel_utils.parallel_state import destroy_model_parallel  # optional: frees model-parallel GPU state when done

def stop_token_list():
    stop_tokens = ["Question:",
                   "Question",
                   "USER:",
                   "USER",
                   "ASSISTANT:",
                   "ASSISTANT",
                   "Instruction:",
                   "Instruction",
                   "Response:",
                   "Response"]

    return stop_tokens


stop_tokens = stop_token_list()
sampling_params = SamplingParams(temperature=0.0, top_p=1, max_tokens=128, stop=stop_tokens)

llm = LLM(model="meta-llama/Llama-2-7b-hf", load_format="auto", tensor_parallel_size=1, gpu_memory_utilization=0.90)
lora.LoRAModel.from_pretrained(llm.llm_engine.workers[0].model, '/adapter')  # Path where the LoRA adapter was saved

prompts = ["Hello World", "Hello Python"]
completions = llm.generate(prompts, sampling_params)
for output in completions:
    gens = output.outputs[0].text
    print(gens, '\n')
```
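The `stop` list passed to `SamplingParams` makes vLLM cut each completion at the first occurrence of any stop string. As a rough illustration of that truncation behavior (a plain-Python sketch under our own assumptions, not vLLM's actual implementation; the helper name `truncate_at_stop` is hypothetical):

```python
def truncate_at_stop(text, stop_tokens):
    """Cut `text` at the earliest occurrence of any stop token."""
    cut = len(text)  # default: keep the whole text
    for tok in stop_tokens:
        idx = text.find(tok)
        if idx != -1:
            cut = min(cut, idx)  # earliest stop wins
    return text[:cut]

# A raw generation that runs past the answer into a new "Question:" turn
raw = "42.\nQuestion: what is the next prime?"
print(truncate_at_stop(raw, ["Question:", "Response:"]))
```

This is why the demo lists both punctuated and bare variants (`"Question:"` and `"Question"`): the bare form also catches generations that omit the colon.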