Using medbert.de out-of-the-box
Hi,
I am getting the following warning:
Some weights of BertModel were not initialized from the model checkpoint at GerMedBERT/medbert-512 and are newly initialized: ['bert.pooler.dense.bias', 'bert.pooler.dense.weight']
AFAIK it is possible to use medbert.de out of the box. I use the lines of code as given in the instructions. Am I doing something wrong?
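For reference, this is roughly what I run (a minimal sketch following the standard Transformers loading pattern; the exact snippet in the model card may differ, and the example sentence is just made up):

```python
from transformers import AutoTokenizer, AutoModel

# Load tokenizer and encoder weights from the Hub. The pooler layer is not
# part of the checkpoint, so it gets randomly initialized -- that is what
# triggers the warning above.
tokenizer = AutoTokenizer.from_pretrained("GerMedBERT/medbert-512")
model = AutoModel.from_pretrained("GerMedBERT/medbert-512")

inputs = tokenizer("Der Patient klagt über Kopfschmerzen.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```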
You should fine-tune the model. Using it out of the box for, e.g., classification will likely not work well.
Thanks @kbressem. Can you perhaps point me to any resources for fine-tuning GerMedBERT? That would help me get started!
We do not have a specific tutorial on how to fine-tune our model, but it is compatible with Hugging Face Transformers. So any tutorial on how to fine-tune a BERT model will probably do.
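For example, a minimal classification fine-tuning sketch with the Transformers Trainer could look like this (a hypothetical setup assuming CSV files with "text" and "label" columns; adjust the data loading and hyperparameters to your task):

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Hypothetical data: replace with your own files/columns.
dataset = load_dataset("csv", data_files={"train": "train.csv"})

tokenizer = AutoTokenizer.from_pretrained("GerMedBERT/medbert-512")
# Loading with a classification head will again warn about newly
# initialized weights -- that is expected and fixed by training.
model = AutoModelForSequenceClassification.from_pretrained(
    "GerMedBERT/medbert-512", num_labels=2
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=512)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="medbert-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"])
trainer.train()
```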