
[LoRA] fix vanilla fine-tuned lora loading. #8691

Merged 5 commits on Jun 26, 2024
Conversation

@sayakpaul (Member) commented Jun 25, 2024

What does this PR do?

Fixes vanilla (legacy) fine-tuning LoRA loading. The bug was introduced in #8316.

@sayakpaul sayakpaul marked this pull request as ready for review June 25, 2024 06:38
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan (Member) left a comment

Thanks for fixing this. Could you please give a short explanation why this fails with legacy LoRA adapters? Also, would it be possible to catch this in a test?

@sayakpaul (Member, Author) replied:

Could you please give a short explanation why this fails with legacy LoRA adapters?

It failed because of this check:

if any(key.startswith(cls.unet_name) for key in keys) and not only_text_encoder:

More specifically, `any(key.startswith(cls.unet_name) for key in keys)`: the old format didn't use such prefix identifiers, so the condition evaluates to False for legacy checkpoints and the loading path is skipped.

Unfortunately, it won't be possible to catch this in a fast test because the format is about 1.5 years old, and no one really uses it.
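To illustrate the failure mode described above, here is a minimal sketch. The key names are hypothetical stand-ins (real diffusers state-dict keys are longer), but the prefix check mirrors the one quoted from the loader: new-style checkpoints namespace their keys under `unet`/`text_encoder`, while legacy fine-tuned checkpoints do not.

```python
# Hypothetical key names illustrating the prefix check from the loader.
UNET_NAME = "unet"  # stand-in for cls.unet_name

# New-style checkpoints prefix every key with the component name.
new_style_keys = [
    "unet.down_blocks.0.lora_A.weight",
    "text_encoder.layers.0.lora_A.weight",
]

# Legacy (vanilla fine-tuned) checkpoints have no such prefixes.
legacy_keys = [
    "down_blocks.0.lora_A.weight",
    "mid_block.lora_A.weight",
]

def has_unet_prefix(keys):
    # The check the PR discussion refers to.
    return any(key.startswith(UNET_NAME) for key in keys)

print(has_unet_prefix(new_style_keys))  # True: UNet branch is taken
print(has_unet_prefix(legacy_keys))     # False: legacy checkpoint is skipped
```

Because the legacy keys never start with `unet`, the guard returns False and the weights were silently not routed to the UNet, which is the bug this PR fixes.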

@BenjaminBossan (Member) left a comment

Thanks for explaining. LGTM.

It would probably be a good idea to add the explanation to the commit message, in case there is ever a need to understand the history of this part of the code.

@yiyixuxu yiyixuxu merged commit 5b51ad0 into main Jun 26, 2024
18 checks passed
@yiyixuxu yiyixuxu deleted the fix-vanilla-ft-lora-loading branch June 26, 2024 17:39
sayakpaul added a commit that referenced this pull request Jun 27, 2024
fix vanilla fine-tuned lora loading.
sayakpaul added a commit that referenced this pull request Dec 23, 2024
fix vanilla fine-tuned lora loading.
4 participants