[Typing][A-48,A-49,A-50][debug] Add type annotations for 3 optimizers (`RAdam`, `RMSProp`, `Rprop`) #65085
Conversation
Your PR has been submitted successfully. Thank you for contributing to this open-source project!
It would be better to move this PR forward after #65076. Also, please refer to https://github.com/cattidea/paddlepaddle-stubs and make the annotations more detailed, e.g. do not omit the parameters of generics, and use a more precise TypedDict for dicts.
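For illustration, a minimal sketch of those two suggestions (the names `grad_norms` and `_LRConfig` are made up for this example, not taken from the stubs repo):

```python
from collections.abc import Sequence

from typing_extensions import NotRequired, TypedDict

# 1. Do not omit the parameters of generics:
#    prefer ``Sequence[float]`` over a bare ``Sequence``.
grad_norms: Sequence[float] = [0.1, 0.2]


# 2. Use a precise TypedDict instead of ``dict[str, Any]`` for structured dicts;
#    ``NotRequired`` marks keys that may be omitted.
class _LRConfig(TypedDict):
    learning_rate: float
    warmup_steps: NotRequired[int]


config: _LRConfig = {"learning_rate": 0.1}
```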
paddle/optimizer/*
#65076 has been merged, so this PR can move forward now.
@DrRyanHuang Regarding these `paddle/optimizer/*` files … (`RAdam`, `RMSProp`, `Rprop`): it would still be better not to use … in the title.
#65273 has been merged; please update this PR accordingly as well.
@megemini With debug enabled, I seem to be seeing some strange errors:
```text
2024-06-21 14:50:55 Traceback (most recent call last):
2024-06-21 14:50:55 File "/paddle/tools/type_checking.py", line 325, in <module>
2024-06-21 14:50:55 run_type_checker(args, mypy_checker)
2024-06-21 14:50:55 File "/paddle/tools/type_checking.py", line 304, in run_type_checker
2024-06-21 14:50:55 test_results = get_test_results(type_checker, docstrings_to_test)
2024-06-21 14:50:55 File "/paddle/tools/type_checking.py", line 272, in get_test_results
2024-06-21 14:50:55 with multiprocessing.Pool(initializer=init_worker) as pool:
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/context.py", line 119, in Pool
2024-06-21 14:50:55 return Pool(processes, initializer, initargs, maxtasksperchild,
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/pool.py", line 215, in __init__
2024-06-21 14:50:55 self._repopulate_pool()
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/pool.py", line 306, in _repopulate_pool
2024-06-21 14:50:55 return self._repopulate_pool_static(self._ctx, self.Process,
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/pool.py", line 329, in _repopulate_pool_static
2024-06-21 14:50:55 w.start()
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/process.py", line 121, in start
2024-06-21 14:50:55 self._popen = self._Popen(self)
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/context.py", line 281, in _Popen
2024-06-21 14:50:55 return Popen(process_obj)
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/popen_fork.py", line 19, in __init__
2024-06-21 14:50:55 self._launch(process_obj)
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/popen_fork.py", line 71, in _launch
2024-06-21 14:50:55 code = process_obj._bootstrap(parent_sentinel=child_r)
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
2024-06-21 14:50:55 self.run()
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
2024-06-21 14:50:55 self._target(*self._args, **self._kwargs)
2024-06-21 14:50:55 File "/usr/lib/python3.10/multiprocessing/pool.py", line 125, in worker
2024-06-21 14:50:55 result = (True, func(*args, **kwds))
2024-06-21 14:50:55 File "mypy/semanal.py", line 6714, in accept
2024-06-21 14:50:55 File "mypy/nodes.py", line 787, in accept
2024-06-21 14:50:55 File "mypy/semanal.py", line 835, in visit_func_def
2024-06-21 14:50:55 File "mypy/semanal.py", line 870, in analyze_func_def
2024-06-21 14:50:55 File "mypy/semanal.py", line 6386, in defer
2024-06-21 14:50:55 AssertionError: Must not defer during final iteration
2024-06-21 14:50:55 --------------------
```
A `defer` error, but not the same as the previous one …
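One way to narrow this down (a sketch; `sample.py` is a hypothetical file holding the docstring example that triggers the crash, not a path from `tools/type_checking.py`) is to run mypy on that single sample through its Python API, outside the multiprocessing pool, with tracebacks enabled:

```python
# Hypothetical minimal reproduction: run mypy directly on one extracted sample
# so the internal AssertionError surfaces with a full traceback.
from mypy import api

stdout, stderr, exit_code = api.run(
    ["--show-traceback", "--no-incremental", "sample.py"]
)
print(stdout)
print(stderr)
print("exit code:", exit_code)
```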
I ran some experiments offline: mypy has the problem, pyright does not. The location I narrowed it down to is:

```python
if TYPE_CHECKING:
    from typing_extensions import NotRequired, TypedDict

    # from paddle import Tensor  # cannot use this
    from paddle.nn.clip import GradientClipBase

    from ..base.framework import Operator, Program
    from ..tensor.tensor import Tensor  # use this instead

    class _ParameterConfig(TypedDict):
        params: Sequence[Tensor]
        weight_decay: NotRequired[float | WeightDecayRegularizer | None]
        learning_rate: NotRequired[float | Tensor | LRScheduler | None]
```

We need to use … here. Possible causes: …
No firm conclusion yet, and I haven't found a related issue. Could the two of you experiment with it? @DrRyanHuang If we modify

```python
>>> rmsprop = paddle.optimizer.RMSProp(
...     learning_rate=0.1,
...     parameters=[{  # type: ignore
...         'params': linear_1.parameters()
...     }, {
...         'params': linear_2.parameters(),
...         'weight_decay': 0.001,
...         'learning_rate': 0.1
...     }],
...     weight_decay=0.01
... )
```

the … here …
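For context, a minimal, self-contained sketch of how `parameters` could be annotated with the `_ParameterConfig` TypedDict shown above; the class name `_RMSPropSketch` and the simplified value types are assumptions, not the PR's actual signature:

```python
from __future__ import annotations

from collections.abc import Sequence
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing_extensions import NotRequired, TypedDict

    from paddle import Tensor

    class _ParameterConfig(TypedDict):
        params: Sequence[Tensor]
        weight_decay: NotRequired[float | None]    # value types simplified for the sketch
        learning_rate: NotRequired[float | None]


class _RMSPropSketch:
    """Hypothetical stand-in showing one possible ``parameters`` annotation."""

    def __init__(
        self,
        learning_rate: float = 0.01,
        parameters: Sequence[Tensor] | Sequence[_ParameterConfig] | None = None,
        weight_decay: float | None = None,
    ) -> None:
        # A list of dict literals such as the one in the docstring example is
        # checked structurally against ``_ParameterConfig`` under this annotation.
        self._learning_rate = learning_rate
        self._parameters = parameters
        self._weight_decay = weight_decay
```

With such an annotation, dict literals passed to `parameters` should be checked against the TypedDict directly, so the `# type: ignore` in the example above would in principle only be needed if the literals do not conform (or if the checker itself crashes, as in the log above).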
… (`RAdam`, `RMSProp`, `Rprop`) (PaddlePaddle#65085) --------- Co-authored-by: SigureMo <[email protected]>
PR Category
User Experience
PR Types
Improvements
Description
Currently blocked on 🚧 #65076. Adds type annotations for `RAdam`, `RMSProp`, and `Rprop`.
Related links
@SigureMo @megemini