Can't load state_dict for GPT2ForSequenceClassification (Unexpected key(s) in state_dict) #1
To get this code running, I decided to remove the bias layer. Replace the load_model() function at line 165 with the following code snippet:
This modification should let you proceed with your testing.
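The snippet referenced in the comment above was not captured in this thread. As a rough illustration only (not the poster's original code), a replacement load_model() along these lines could drop the unexpected bias buffers before loading; the checkpoint name `GPTGC.pt` and `num_labels=2` are assumptions taken from the issue context:

```python
import torch
from transformers import GPT2ForSequenceClassification

def load_model(checkpoint_path="GPTGC.pt", num_labels=2):
    # Build the target architecture first (num_labels is an assumption).
    model = GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=num_labels)

    # Load the saved weights onto CPU so this also works without a GPU.
    state_dict = torch.load(checkpoint_path, map_location="cpu")

    # Drop keys the current architecture doesn't expect, e.g. the causal-mask
    # bias buffers ("...attn.bias" / "...attn.masked_bias") that older
    # checkpoints may carry.
    filtered = {
        k: v
        for k, v in state_dict.items()
        if not (k.endswith("attn.bias") or k.endswith("attn.masked_bias"))
    }

    # strict=False tolerates any remaining mismatches instead of raising.
    model.load_state_dict(filtered, strict=False)
    return model
```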
@xiyiyia Can you tell me the Python, PyTorch, and Hugging Face Transformers versions you're using? I tested on the following configuration and it runs fine: Python = 3.9.13

In principle, we should've used the "

To fix the unexpected keys issue, we'll need to alter the key names of the stored model so they match the key names expected by the GPT-2 architecture.
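As a sketch of that key-renaming approach (the specific mapping below, such as re-adding a missing `transformer.` prefix, is a hypothetical example; adapt it to whatever key names the error message actually reports):

```python
import torch
from transformers import GPT2ForSequenceClassification

model = GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=2)
old_state = torch.load("GPTGC.pt", map_location="cpu")

expected = set(model.state_dict().keys())
renamed = {}
for key, value in old_state.items():
    # Hypothetical rename: try prepending "transformer." when the raw key
    # isn't in the expected set (common when only the base GPT-2 body was
    # saved). Adjust this mapping to the actual mismatched names.
    candidate = key if key in expected else f"transformer.{key}"
    if candidate in expected:
        renamed[candidate] = value
    # Anything that still doesn't match (e.g. stale buffers) is dropped.

# load_state_dict reports what is still missing or unexpected.
result = model.load_state_dict(renamed, strict=False)
print("still missing:", result.missing_keys)
print("still unexpected:", result.unexpected_keys)
```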
Here are the versions of the packages you mentioned:
I will create a new environment for testing. Thanks for your nice work.
Hi! @malik727 @Hunaid2000
I guess the problem is in the GPTGC.pt file.
Could I get a new copy of the pre-trained model?
Thanks a lot!