[LLM] Support prefix tuning and lora for qwen2 #8601
Conversation
Thanks for your contribution!
Codecov Report
Attention: Patch coverage is

Additional details and impacted files:

@@            Coverage Diff             @@
##           develop    #8601    +/-   ##
===========================================
+ Coverage    54.18%   54.73%   +0.55%
===========================================
  Files          625      625
  Lines        98942    98985      +43
===========================================
+ Hits         53612    54180     +568
+ Misses       45330    44805     -525

View full report in Codecov by Sentry.
# However, when the state dict only contains one piece of the shared parameters,
# the shared parameters will be different from the original shared parameters.

if isinstance(model, PipelineLayer):
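The comment above concerns weight tying in pipeline-parallel checkpoints. A hypothetical minimal sketch (plain NumPy arrays in a dict, not the actual PaddleNLP checkpoint format) of why a state dict that stores only one copy of a shared parameter must be re-tied on load, or the untied copy will diverge from the original shared parameter:

```python
import numpy as np

# Tied weights: embedding and output projection share one tensor.
# Names are illustrative, not taken from PaddleNLP.
shared = np.full((4, 8), 0.5)
model = {"embed.weight": shared, "lm_head.weight": shared}

# Saving deduplicates the tied pair: only one piece lands in the state dict.
state_dict = {"embed.weight": model["embed.weight"].copy()}

# On load, the sharing must be restored explicitly; otherwise
# "lm_head.weight" would keep its fresh initialization and differ
# from the original shared parameter.
restored = {"embed.weight": state_dict["embed.weight"]}
restored["lm_head.weight"] = restored["embed.weight"]

assert restored["lm_head.weight"] is restored["embed.weight"]
assert np.allclose(restored["lm_head.weight"], model["lm_head.weight"])
```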
Nice 👍🏻
LGTM
PR types
Function optimization
PR changes
Models
Description
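This PR enables prefix tuning and LoRA for qwen2. As a hedged illustration of the LoRA idea itself (a generic NumPy sketch, not the PaddleNLP implementation), the frozen weight W is augmented with a trainable low-rank update scaled by alpha/r, and the up-projection B is zero-initialized so the adapted layer starts out identical to the base layer:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 16          # hidden size, LoRA rank, scaling factor

W = rng.normal(size=(d, d))     # frozen base weight
A = rng.normal(size=(r, d))     # trainable down-projection
B = np.zeros((d, r))            # trainable up-projection, zero-initialized

def lora_forward(x):
    # Base path plus low-rank correction: x W^T + (alpha/r) x A^T B^T
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.normal(size=(1, d))
# With B zero-initialized, the LoRA branch contributes nothing at start,
# so the adapted layer reproduces the frozen layer exactly.
assert np.allclose(lora_forward(x), x @ W.T)
```

Prefix tuning is analogous in spirit: instead of a low-rank weight delta, trainable prefix key/value vectors are prepended to each attention layer while the base model stays frozen.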