
[megatron-bert-uncased-345m] fix conversion #16639

Merged: stas00 merged 1 commit into main from meg-bert-conversion-uncased on Apr 7, 2022
Conversation

stas00 (Contributor) commented on Apr 6, 2022

Fixes: #16638

The original conversion script assumed that all released megatron-bert-*-345m checkpoints share the same vocab, but https://huggingface.co/nvidia/megatron-bert-cased-345m/blob/main/vocab.txt and https://huggingface.co/nvidia/megatron-bert-uncased-345m/blob/main/vocab.txt are quite different.

This PR instead sets config.vocab_size to the actual vocab size, read from the vocab dimension of one of the checkpoint's parameters.
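For illustration, a minimal sketch of that logic (the checkpoint path and the state-dict key path below follow the usual Megatron-LM layout and are assumptions; the actual script may address the checkpoint differently):

```python
import torch

from transformers import MegatronBertConfig

# Load the raw Megatron-LM checkpoint (path is illustrative).
state_dict = torch.load("megatron-bert-uncased-345m/model_optim_rng.pt", map_location="cpu")

# The first dimension of the word-embedding weight is the (padded) vocab size;
# reading it from the checkpoint avoids hardcoding one checkpoint's value.
word_embeddings = state_dict["model"]["language_model"]["embedding"]["word_embeddings"]["weight"]

config = MegatronBertConfig()
config.vocab_size = word_embeddings.shape[0]  # e.g. 30592 for the uncased checkpoint
```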

I tested that both checkpoints mentioned above convert and load correctly:

```bash
python src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py megatron-bert-cased-345m/checkpoint.zip
python -c 'from transformers import MegatronBertForMaskedLM; MegatronBertForMaskedLM.from_pretrained("megatron-bert-cased-345m")'

python src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py megatron-bert-uncased-345m/checkpoint.zip
python -c 'from transformers import MegatronBertForMaskedLM; MegatronBertForMaskedLM.from_pretrained("megatron-bert-uncased-345m")'
```

Both succeed.

Before this PR only the former worked; the second failed with:

```
RuntimeError: Error(s) in loading state_dict for MegatronBertForMaskedLM:
    size mismatch for cls.predictions.bias: copying a param with shape torch.Size([30592]) from checkpoint, the shape in current model is torch.Size([29056])
```

(29056 is the vocab size of megatron-bert-cased-345m.)
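As a quick sanity check after this PR (a sketch, assuming both converted checkpoints sit in the working directory), the two converted configs should now report their own vocab sizes:

```python
from transformers import MegatronBertConfig

# Each converted config now carries the vocab size of its own checkpoint.
print(MegatronBertConfig.from_pretrained("megatron-bert-cased-345m").vocab_size)    # expected: 29056
print(MegatronBertConfig.from_pretrained("megatron-bert-uncased-345m").vocab_size)  # expected: 30592
```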

@LysandreJik, @sgugger

stas00 mentioned this pull request on Apr 6, 2022

HuggingFaceDocBuilderDev commented Apr 6, 2022

The documentation is not available anymore as the PR was closed or merged.

sgugger (Collaborator) left a comment

Thanks for fixing!

stas00 merged commit 080e42d into main on Apr 7, 2022
stas00 deleted the meg-bert-conversion-uncased branch on Apr 7, 2022 at 14:56