Releases · jarib/bert-norne
bert-norne
f1 = 0.8427046263345196
loss = 0.03901922195237588
precision = 0.8297126839523475
recall = 0.8561099060014461
That is, this multilingual BERT model appears to perform worse on NER than jarib/spacy-nb.
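
For context, the f1 value above is the harmonic mean of the reported precision and recall, which is how entity-level NER scores are conventionally computed (for example by the seqeval library used in the Hugging Face token-classification examples). A minimal sketch, assuming seqeval-style entity-level scoring; the label sequences below are invented purely for illustration and are not from NorNE:

```python
# Minimal sketch of entity-level NER scoring (assumption: the scores above were
# produced by seqeval or an equivalent entity-level evaluation; the tag
# sequences here are made up for illustration).
from seqeval.metrics import f1_score, precision_score, recall_score

# Gold and predicted IOB2 tag sequences for two example sentences.
y_true = [["B-PER", "I-PER", "O", "B-ORG", "O"], ["B-LOC", "O", "O"]]
y_pred = [["B-PER", "I-PER", "O", "O", "O"],     ["B-LOC", "O", "O"]]

precision = precision_score(y_true, y_pred)
recall = recall_score(y_true, y_pred)

print("precision =", precision)
print("recall    =", recall)
# Entity-level F1 is the harmonic mean of precision and recall.
print("f1        =", 2 * precision * recall / (precision + recall))
print("f1 (seqeval) =", f1_score(y_true, y_pred))
```

Plugging the reported precision (0.8297) and recall (0.8561) into the same harmonic-mean formula reproduces the reported f1 of roughly 0.8427, so the numbers above are internally consistent.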