Batavia asked for advice: Pretrained language models for Named Entity Recognition in historical texts

Sophie Arnoult, Lodewijk Petram, Piek Vossen

Research output: Chapter in book/volume › Contribution to conference proceedings › Scientific › peer-review

3 Citations (Scopus)

Abstract

Pretrained language models like BERT have advanced the state of the art for many NLP tasks. For resource-rich languages, one has the choice between a number of language-specific models, while multilingual models are also worth considering. These models are well known for their crosslingual performance, but have also shown competitive in-language performance on some tasks. We consider monolingual and multilingual models from the perspective of historical texts, and in particular for texts enriched with editorial notes: how do language models deal with the historical and editorial content in these texts? We present a new Named Entity Recognition dataset for Dutch based on 17th- and 18th-century United East India Company (VOC) reports extended with modern editorial notes. Our experiments with multilingual and Dutch pretrained language models confirm the crosslingual abilities of multilingual models while showing that all language models can leverage mixed-variant data. In particular, language models successfully incorporate notes for the prediction of entities in historical texts. We also find that multilingual models outperform monolingual models on our data, but that this superiority is linked to the task at hand: multilingual models lose their advantage when confronted with more semantic tasks.
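
For readers who want a concrete picture of the kind of experiment the abstract describes, the sketch below shows how a token-classification (NER) fine-tuning step can be assembled with the Hugging Face transformers library. It is an illustration under stated assumptions, not the authors' pipeline: the model names (bert-base-multilingual-cased, GroNLP/bert-base-dutch-cased), the IOB2 label set, and the toy VOC-style sentence are all placeholders chosen for the example.

    # Minimal sketch (not the authors' exact setup): fine-tuning a pretrained
    # encoder for token-level NER with Hugging Face transformers.
    import torch
    from transformers import AutoTokenizer, AutoModelForTokenClassification

    # Swap between a multilingual and a Dutch monolingual encoder to compare them.
    MODEL_NAME = "bert-base-multilingual-cased"  # e.g. "GroNLP/bert-base-dutch-cased"

    # Hypothetical IOB2 label set for person, location, and organisation entities.
    LABELS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForTokenClassification.from_pretrained(
        MODEL_NAME, num_labels=len(LABELS)
    )

    # A toy VOC-style sentence split into words, with word-level gold labels.
    words = ["Batavia", "schrijft", "aan", "de", "Heren", "XVII", "."]
    word_labels = ["B-LOC", "O", "O", "O", "B-ORG", "I-ORG", "O"]

    # Tokenize into subwords and align word-level labels to subword tokens,
    # labelling only the first subword of each word and masking the rest (-100).
    encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    aligned = []
    previous_word = None
    for word_id in encoding.word_ids(batch_index=0):
        if word_id is None or word_id == previous_word:
            aligned.append(-100)  # special tokens and subword continuations
        else:
            aligned.append(LABELS.index(word_labels[word_id]))
        previous_word = word_id
    labels = torch.tensor([aligned])

    # One forward/backward pass; in practice this runs inside a training loop
    # or a Trainer over the full annotated corpus.
    outputs = model(**encoding, labels=labels)
    outputs.loss.backward()
    print(float(outputs.loss))

Comparing monolingual and multilingual settings then amounts to swapping MODEL_NAME and fine-tuning on historical text with and without the editorial notes, which is the kind of contrast the experiments above evaluate.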
Original language: English
Title of host publication: Proceedings of the 5th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature
Subtitle of host publication: LaTeCH-CLfL 2021, co-located with the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021
Editors: Stefania Degaetano-Ortlieb, Anna Kazantseva, Nils Reiter, Stan Szpakowicz
Publisher: Association for Computational Linguistics (ACL)
Pages: 21-30
ISBN (Electronic): 9781954085916
Publication status: Published - Nov 2021

Keywords

  • computational linguistics
  • history
  • pretrained language models
  • named entity recognition
  • VOC
