
TypeError: optimizer_step() got an unexpected keyword argument 'using_native_amp' #44

Closed
mmagithub opened this issue Jul 4, 2020 · 3 comments


@mmagithub

Hi,
I am trying to fine-tune a GPT-2 model ("124M") on local hardware. I keep getting this error:

ValueError: Keyword arguments {'return_attention_masks': False} not recognized.
0%| | 0/683971 [00:00<?, ?it/s]

I read that this could be due to a breaking change in transformers 3.0. Upgrading to 3.0.1 did not help, but downgrading to 2.11.0 (or 2.9.1) did solve it.

With that change the tokenization finished normally, but when training (fine-tuning) started, I got this error:
TypeError: optimizer_step() got an unexpected keyword argument 'using_native_amp'
0%| | 0/5000 [00:46<?, ?it/s]

Any suggestions?

I am also wondering whether we can directly load a tokenized (.npz) text file that we previously prepared with the original gpt-2-simple repository, i.e. before aitextgen was created.

Thanks

@minimaxir
Owner

minimaxir commented Jul 4, 2020

pytorch-lightning likely made a breaking change as well.

Adding `!pip install pytorch-lightning==0.7.6` at the beginning of the notebook (i.e., pinning the version to 0.7.6) should fix it. I just made that change on my end.

> I am also wondering whether we can directly load a tokenized (.npz) text file that we previously prepared with the original gpt-2-simple repository, i.e. before aitextgen was created.

No, the .npz from gpt-2-simple is not compatible; however, it's much faster to rebuild the dataset using aitextgen.
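
For reference, a minimal sketch of rebuilding the dataset, assuming the TokenDataset API in aitextgen 0.2.x (the file name is a placeholder; `save_cache` is assumed to write a reusable compressed cache — check the README for current usage):

```python
from aitextgen.TokenDataset import TokenDataset

# Re-tokenize the original raw text once; the cached result can be
# reloaded on later runs instead of re-tokenizing from scratch.
data = TokenDataset("input.txt", save_cache=True)
```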

@minimaxir
Owner

The issue was introduced in pytorch-lightning==0.8.4, 4 days ago, in this commit: Lightning-AI/pytorch-lightning@593837e#diff-c45bd21c331565cbe62aaa12fa43aa0a

The fix is to add the new parameters to the optimizer_step function.
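
For anyone patching locally before the release, a minimal sketch of that workaround, assuming the 0.8.4 signature (the `on_tpu` and `using_lbfgs` names are taken from that release; the class name below is illustrative, not aitextgen's actual module):

```python
import pytorch_lightning as pl

class TrainingModule(pl.LightningModule):  # hypothetical name for illustration
    def optimizer_step(
        self,
        epoch,
        batch_idx,
        optimizer,
        optimizer_idx,
        second_order_closure=None,
        on_tpu=False,            # new kwargs passed by pytorch-lightning 0.8.4
        using_native_amp=False,
        using_lbfgs=False,
        **kwargs,                # absorb anything later versions may add
    ):
        # Reproduce the default behavior: apply the update, then reset grads.
        optimizer.step()
        optimizer.zero_grad()
```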

minimaxir added a commit that referenced this issue Jul 5, 2020
@minimaxir
Owner

Fixed for real in aitextgen 0.2.3; turns out it was due to obsolete code.
