F/flax split head dim #5181
Merged
Conversation
…f/flax-split-head-dim
patrickvonplaten
approved these changes
Sep 26, 2023
Contributor
patrickvonplaten
left a comment
Very cool! @pcuenca what do you think?
Co-authored-by: Patrick von Platen <[email protected]>
pcuenca
approved these changes
Sep 26, 2023
Member
pcuenca
left a comment
Looks great, thanks a lot!
Running tests to see if anything important comes up :)
Member
Member
@entrpn could you please run
Contributor
Author
Contributor
Author
@pcuenca I ran it already. I just re-ran it after Patrick's changes and there were no changes.
Member
Member
Ok, I'll merge then and we can verify later.
yoonseokjin
pushed a commit
to yoonseokjin/diffusers
that referenced
this pull request
Dec 25, 2023
* split_head_dim flax attn
* Make split_head_dim non default
* make style and make quality
* add description for split_head_dim flag
* Update src/diffusers/models/attention_flax.py

Co-authored-by: Juan Acevedo <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
AmericanPresidentJimmyCarter
pushed a commit
to AmericanPresidentJimmyCarter/diffusers
that referenced
this pull request
Apr 26, 2024
* split_head_dim flax attn
* Make split_head_dim non default
* make style and make quality
* add description for split_head_dim flag
* Update src/diffusers/models/attention_flax.py

Co-authored-by: Juan Acevedo <[email protected]>
Co-authored-by: Patrick von Platen <[email protected]>
What does this PR do?
@patrickvonplaten Adds an optional split_head_dim flag to Flax attention, which speeds up the diffusion process for Stable Diffusion 2.1 and SDXL.
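A minimal sketch of the idea behind a split_head_dim-style attention path (this is an illustration, not the exact diffusers implementation): instead of folding the head dimension into the batch axis and materializing one large attention matrix, keep heads as an explicit axis and compute attention with 4-D einsums, which tends to map better onto TPU memory. The function name and shapes below are assumptions for illustration.

```python
import jax
import jax.numpy as jnp


def attention_split_heads(q, k, v, num_heads):
    """Hypothetical helper: q, k, v have shape (batch, seq, inner_dim);
    returns (batch, seq, inner_dim)."""
    b, s, inner = q.shape
    head_dim = inner // num_heads
    scale = 1.0 / jnp.sqrt(head_dim)

    # Reshape to (batch, seq, heads, head_dim), keeping heads explicit
    # rather than merging them into the batch axis.
    q = q.reshape(b, s, num_heads, head_dim)
    k = k.reshape(b, s, num_heads, head_dim)
    v = v.reshape(b, s, num_heads, head_dim)

    # Per-head attention scores: (batch, heads, q_seq, k_seq).
    scores = jnp.einsum("bqhd,bkhd->bhqk", q, k) * scale
    weights = jax.nn.softmax(scores, axis=-1)

    # Weighted sum of values, then merge heads back into inner_dim.
    out = jnp.einsum("bhqk,bkhd->bqhd", weights, v)
    return out.reshape(b, s, inner)
```

The non-split path would instead reshape to (batch * heads, seq, head_dim) and run a single batched matmul; both compute the same attention, but the split form keeps intermediates smaller per contraction.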
Before submitting
See the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.