ENH: Better DoRA check in mixed adapter batch inference #2089
Conversation
This is a bit of an edge case, but I noticed it while working on something else. PEFT allows mixed adapter batch inference, i.e. at prediction time, different samples in the same batch can use different adapters by passing the adapter_names argument. However, this is not supported for DoRA (yet), so there is a check that raises an error if DoRA is used. Previously, this check inspected all adapters for DoRA, even adapters not referenced in adapter_names. That was unnecessarily strict; with this PR, only the adapters that are actually being used are checked.
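For illustration, here is a minimal sketch of the kind of mixed-batch call this check guards. The base model, adapter paths, and adapter names are placeholders picked for the example, not taken from this PR:

```python
# Hypothetical setup: the model name, adapter paths, and adapter names
# below are illustrative placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
base = AutoModelForCausalLM.from_pretrained("gpt2")

model = PeftModel.from_pretrained(base, "path/to/adapter_a", adapter_name="adapter_a")
model.load_adapter("path/to/adapter_b", adapter_name="adapter_b")

inputs = tokenizer(["sample 1", "sample 2", "sample 3"], return_tensors="pt", padding=True)
# One adapter name per sample; "__base__" selects the plain base model.
outputs = model(**inputs, adapter_names=["adapter_a", "adapter_b", "__base__"])
# If any requested adapter uses DoRA, the forward check raises a ValueError.
```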
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Thanks! I left comments mostly related to adding comments. Nothing merge-blocking.
src/peft/tuners/lora/layer.py
Outdated
@@ -344,7 +344,8 @@ def _check_forward_args(self, x, *args, **kwargs):
            msg = "Cannot pass `adapter_names` when there are merged adapters, please call `unmerge_adapter` first."
            raise ValueError(msg)

        unique_adapters = set(self.active_adapters)
        # DoRA is not supported (yet), check that it's not being used
Is there a need to log some information here?
Not sure what. If DoRA is used by this layer, an error is raised, otherwise nothing further happens.
Thanks for the feedback on clarifying what's happening. I hope the updated comments do the job.
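To make the resulting behavior concrete, here is a sketch of the narrowed check. The exact merged code is not shown in this thread, so treat this as an assumed form; use_dora is the per-adapter flag dict on LoRA layers, and "__base__" denotes the base model:

```python
# Sketch only (assumed final form of the check): instead of inspecting all
# loaded adapters, only the adapters requested via `adapter_names` are
# checked for DoRA.
unique_adapters = {name for name in adapter_names if name != "__base__"}
for adapter_name in unique_adapters:
    if self.use_dora.get(adapter_name, False):
        msg = "Cannot pass `adapter_names` when DoRA is enabled."
        raise ValueError(msg)
```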