I'm using the Hugging Face transformers package to load a pretrained GPT-2 model. I want to use GPT-2 for text generation, but the pretrained version isn't enough on its own, so I want to fine-tune it on a bunch of my own text data.
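Roughly, this is the kind of fine-tuning setup I have in mind (a minimal sketch, assuming the training text sits in a plain-text file at `./my_corpus.txt`, which is just a placeholder path, and I'm not sure this is the right approach):

```python
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,  # deprecated in newer transformers versions in favour of the datasets library
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Build a language-modeling dataset from the raw text file (placeholder path).
train_dataset = TextDataset(tokenizer=tokenizer, file_path="./my_corpus.txt", block_size=128)

# Causal LM objective, so no masking.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="./gpt2-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
)
trainer.train()
trainer.save_model("./gpt2-finetuned")
```

After training I would generate text with `model.generate()` using the saved checkpoint.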
I am also thinking of training a MobileBERT model from scratch for the German language.
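Something like the sketch below is what I imagine for the from-scratch part. It assumes a WordPiece tokenizer has already been trained on a German corpus and saved to `./german-tokenizer` (a hypothetical path), and it is plain masked-language-model pre-training, not the full teacher-distillation recipe from the MobileBERT paper:

```python
from transformers import BertTokenizerFast, MobileBertConfig, MobileBertForMaskedLM

# WordPiece tokenizer trained on German text beforehand (hypothetical path).
tokenizer = BertTokenizerFast.from_pretrained("./german-tokenizer")

# Default MobileBERT architecture; only the vocabulary size follows the German tokenizer.
config = MobileBertConfig(vocab_size=tokenizer.vocab_size)

# Randomly initialised weights, i.e. training from scratch rather than fine-tuning.
model = MobileBertForMaskedLM(config)
print(f"Parameters: {model.num_parameters():,}")
```

The pre-training loop itself could then reuse the same `Trainer` setup as above, but with `DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)` and a German corpus as the dataset.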