[CodeCamp2023-526] Kullback-Leibler divergence Loss implementation #3242
Conversation
Hi @crazysteeaam, here is the link to the failing job in CircleCI: https://app.circleci.com/pipelines/github/open-mmlab/mmsegmentation/1751/workflows/0f284d64-9863-477b-a1c7-569668b4aa31/jobs/5655?invite=true#step-103-46
Hi @zen0no, We'd like to express our appreciation for your valuable contributions to mmsegmentation. Your efforts have significantly aided in enhancing the project's quality. If you're on WeChat, we'd also love for you to join our community there. Just add our assistant using the WeChat ID: openmmlabwx. When sending the friend request, remember to include the remark "mmsig + Github ID". Thanks again for your awesome contribution, and we're excited to have you as part of our community!
mmseg/models/losses/kldiv_loss.py (outdated)

```python
loss = loss * self.temperature * self.temperature
```

Suggested change:

```python
loss = loss * self.temperature**2
```
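For context, here is a minimal sketch of how such a temperature-scaled KL divergence term is typically computed in distillation-style losses. This is an illustration under common conventions, not the exact code from this PR; the function name and argument layout are hypothetical.

```python
import torch.nn.functional as F

def kl_div_with_temperature(pred, target, temperature=1.0, reduction='mean'):
    # Soften both distributions with the temperature T.
    log_p = F.log_softmax(pred / temperature, dim=1)
    q = F.softmax(target / temperature, dim=1)
    # F.kl_div expects log-probabilities as input and probabilities as target.
    loss = F.kl_div(log_p, q, reduction=reduction)
    # Rescaling by T**2 keeps gradient magnitudes comparable across
    # temperatures (Hinton et al., "Distilling the Knowledge in a
    # Neural Network"), which is why the loss is multiplied by T squared.
    return loss * temperature**2
```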
mmseg/models/losses/kldiv_loss.py (outdated)

```python
@MODELS.register_module()
class KLDivLoss(nn.Module):

    def __init__(self, temperature=1.0, reduction='mean'):
```

We might add type hints.
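For illustration, a minimal sketch of what the type-hinted constructor could look like; the attribute assignments and the `-> None` return annotation are assumptions, not taken from the PR diff.

```python
import torch.nn as nn

class KLDivLoss(nn.Module):

    def __init__(self,
                 temperature: float = 1.0,
                 reduction: str = 'mean') -> None:
        super().__init__()
        # Store the configuration for use in forward().
        self.temperature = temperature
        self.reduction = reduction
```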
Thanks for your contribution and we appreciate it a lot. The following instructions will help make your pull request healthier and get feedback more easily. If you do not understand some items, don't worry; just make the pull request and seek help from the maintainers.
Motivation
It's an OpenMMLab Codecamp task.
Modification
Implemented the Kullback-Leibler divergence loss and also added tests for it.
Checklist
1. Pre-commit or other linting tools are used to fix the potential lint issues.
2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
3. If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMDet3D.
4. The documentation has been modified accordingly, like docstring or example tutorials.
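For readers who want to try the new loss, a hedged usage sketch. The registry import path, the config keys, and the forward signature (student logits first, teacher logits second) are assumptions based on common mmsegmentation 1.x conventions, not taken from this PR.

```python
import torch
from mmseg.registry import MODELS  # assumed registry location (mmseg 1.x)

# Losses in mmseg are usually built from a config dict.
loss_fn = MODELS.build(
    dict(type='KLDivLoss', temperature=2.0, reduction='mean'))

student_logits = torch.randn(2, 19, 32, 32)  # (N, C, H, W)
teacher_logits = torch.randn(2, 19, 32, 32)
loss = loss_fn(student_logits, teacher_logits)
print(loss)  # scalar tensor under 'mean' reduction
```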