Hi,
I am trying to fine-tune a GPT-2 model ("124") on local hardware. I keep getting this error:
ValueError: Keyword arguments {'return_attention_masks': False} not recognized.
0%| | 0/683971 [00:00<?, ?it/s]
I read that it could be due to a broken transformers 3.0 release, so I upgraded to 3.0.1 but still got the error. However, downgrading to 2.11.0 (or 2.9.1) did solve it.
Tokenization then finished normally, but when training (fine-tuning) started, I got this error:
TypeError: optimizer_step() got an unexpected keyword argument 'using_native_amp'
0%| | 0/5000 [00:46<?, ?it/s]
Any suggestions?
I am also wondering whether we can directly upload a tokenized (.npz) text that we had prepared earlier with the original gpt-2-simple repository, i.e. before aitextgen was created?
Thanks
pytorch_lightning likely made a breaking change as well. Adding !pip install pytorch-lightning==0.7.6 at the beginning of the notebook (i.e. pinning the version to 0.7.6) should fix it; I just made that change on my end.

"I am also wondering whether we can directly upload a tokenized (.npz) text that we had prepared earlier with the original gpt-2-simple repository, before aitextgen was created?"

No, the .npz from gpt-2-simple is not compatible; however, it's much faster to rebuild the dataset using aitextgen.
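Since both errors in this thread come from version drift, it can help to check the environment before kicking off a long tokenization or training run. Below is a minimal standard-library sketch; the pinned versions are the ones reported to work in this thread, and the helper name check_pins is illustrative, not part of any library:

```python
# Sketch: verify that the versions reported to work in this thread are
# installed before starting a long tokenization/training run.
# The pins below are assumptions taken from this thread; adjust as needed.
from importlib.metadata import version, PackageNotFoundError

WORKING_PINS = {
    "transformers": ("2.11.0", "2.9.1"),  # 3.x raised the ValueError above
    "pytorch-lightning": ("0.7.6",),      # newer versions broke optimizer_step()
}

def check_pins(pins):
    """Return a dict mapping each package whose installed version is not an
    accepted pin to what is actually installed (None if not installed)."""
    mismatches = {}
    for package, accepted in pins.items():
        try:
            installed = version(package)
        except PackageNotFoundError:
            installed = None
        if installed not in accepted:
            mismatches[package] = installed
    return mismatches
```

Running check_pins(WORKING_PINS) and aborting on a non-empty result gives a clearer failure than the TypeError above, which only appears after tokenization completes.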