Adding support for switch-transformer / NLLB-MoE #1565
@yl3469 do you still intend to add support for this model architecture?
You can follow the steps at https://docs.vllm.ai/en/latest/models/adding_model.html
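For orientation, the guide linked above roughly amounts to: bring the HF module into vLLM, rewrite its forward pass around vLLM's interface, implement weight loading, and register the class so the engine can resolve it by architecture name. A minimal sketch of that shape follows, with the caveat that the class name and argument list are illustrative and exact signatures vary across vLLM releases:

```python
# Illustrative skeleton only -- names and signatures are assumptions, not
# vLLM's actual API; follow the adding_model guide for the release you target.
import torch
from torch import nn


class SwitchTransformersForConditionalGeneration(nn.Module):
    """Hypothetical vLLM-side port of the HF implementation."""

    def __init__(self, config):
        super().__init__()
        self.config = config
        # Mirror the HF architecture here, swapping its attention modules
        # for vLLM's paged-attention layers.

    def forward(self, input_ids: torch.Tensor, positions: torch.Tensor,
                kv_caches, input_metadata):
        # vLLM drives forward() with flattened token ids plus KV-cache and
        # scheduling metadata rather than HF's (input_ids, attention_mask).
        raise NotImplementedError

    def load_weights(self, weights):
        # Map HF checkpoint tensor names onto this module's parameters.
        raise NotImplementedError
```

The final step is registering the class in vLLM's model registry; where that registry lives has moved between releases, and newer versions also expose `ModelRegistry.register_model` for registering models out of tree.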
This is for CausalLM, whereas NLLB is encoder-decoder. I suppose it would be very different from what's written here?
Ah I see, in that case you'll want to head over to #187 to discuss Encoder-Decoder model support.
Hi vLLM Team,
Thanks again for the great work!
Recently my collaborator and I have been working on MoE, and we would like to try integrating vLLM with SwitchTransformer / FairSeq. Would you mind sharing whether any effort is already underway and what the major challenges would be?
https://huggingface.co/docs/transformers/main/model_doc/nllb-moe
https://github.dev/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py
Thanks,
Lisa
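For context on the MoE piece itself, below is a minimal, self-contained sketch of the top-1 ("switch") routing that the Switch Transformers / NLLB-MoE feed-forward blocks are built around. It is illustrative only and not taken from either linked implementation; an actual integration would also need load balancing, capacity limits, and a vLLM-compatible forward pass.

```python
import torch
from torch import nn
import torch.nn.functional as F


class SwitchTop1MoE(nn.Module):
    """Illustrative top-1 ("switch") routing over a set of expert FFNs."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model); each token is routed to exactly one expert.
        probs = F.softmax(self.router(x), dim=-1)   # (num_tokens, num_experts)
        top_prob, top_idx = probs.max(dim=-1)       # top-1 expert per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e
            if mask.any():
                # Scale the expert output by its router probability, as in
                # the Switch Transformer formulation.
                out[mask] = expert(x[mask]) * top_prob[mask].unsqueeze(-1)
        return out


# Example: route 8 tokens of width 16 through 4 experts.
moe = SwitchTop1MoE(d_model=16, d_ff=32, num_experts=4)
print(moe(torch.randn(8, 16)).shape)  # torch.Size([8, 16])
```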