[wip] Adding flash attention for sequence parallel #511
Conversation
Patch Description
Since we're doing manual activation checkpointing, we need a custom backward pass for MHA. This patch leverages the flash-attention implementation in xformers.
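A minimal sketch of the general pattern (not the actual metaseq code): wrap the xformers memory-efficient/flash attention call in a custom `autograd.Function` that saves only the inputs and recomputes the forward during backward, which is what manual activation checkpointing requires. The class name, helper, and tensor layout below are illustrative assumptions.

```python
import torch
import xformers.ops as xops


class CheckpointedFlashMHA(torch.autograd.Function):
    """Hypothetical wrapper: flash attention with recompute-in-backward."""

    @staticmethod
    def forward(ctx, q, k, v):
        # Save only the inputs; run attention without building an autograd
        # graph so no attention activations are kept around.
        ctx.save_for_backward(q, k, v)
        with torch.no_grad():
            out = xops.memory_efficient_attention(q, k, v)
        return out

    @staticmethod
    def backward(ctx, grad_out):
        q, k, v = ctx.saved_tensors
        # Recompute the forward with grad enabled, then backprop through
        # the recomputed graph (the activation-checkpointing trick).
        with torch.enable_grad():
            q_, k_, v_ = (t.detach().requires_grad_(True) for t in (q, k, v))
            out = xops.memory_efficient_attention(q_, k_, v_)
        return torch.autograd.grad(out, (q_, k_, v_), grad_out)


def flash_mha(q, k, v):
    # q, k, v: (batch, seq_len, num_heads, head_dim), the layout xformers expects.
    return CheckpointedFlashMHA.apply(q, k, v)
```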
TODO:
Testing steps
At very large scale, this gave a ~0.5-1% speedup. It's probably not worth it at the largest scales given the risk of numeric changes, but may still be worth it at medium scales.