Files and code for English dictionaries, gold and silver standard corpora for biomedical natural language processing related to SARS-CoV-2 and COVID-19: Dataset record

Research output: Other contribution › Miscellaneous › Research

Abstract

BACKGROUND
Automated information extraction with natural language processing (NLP) tools is required to gain systematic insights from the large number of COVID-19 publications, reports and social media posts, which far exceed human processing capabilities. A key challenge for NLP is the extensive variation in terminology used to describe medical entities, which was especially pronounced for this newly emergent disease.

FINDINGS
Here we present an NLP toolbox comprising very large English dictionaries of synonyms for SARS-CoV-2 (including variant names) and COVID-19, which can be used with dictionary-based NLP tools. We also present a silver standard corpus generated with the dictionaries, and a gold standard corpus consisting of PubMed abstracts manually annotated for disease, virus, symptom, protein/gene, cell type, chemical and species terms, which can be used to train and evaluate COVID-19-related NLP tools. Code for annotation, which can be used to expand the silver standard corpus or for text mining, is also included. The toolbox is freely available on GitHub (https://github.com/Aitslab/corona) and via this record.
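To illustrate how such synonym dictionaries can drive dictionary-based annotation, the sketch below tags entity mentions in free text by matching synonyms against it. The file name, JSON layout and helper names are hypothetical assumptions for illustration and do not reflect the actual file formats or annotation code in the repository; matching longest synonyms first ensures multi-word terms are not split into partial matches.

```python
import json
import re

# Hypothetical file name and layout: a JSON object mapping each canonical
# entity name to a list of synonyms. The real dictionary files in
# Aitslab/corona may be organised differently.
with open("covid19_synonyms.json") as fh:
    synonyms = json.load(fh)  # e.g. {"COVID-19": ["covid19", "covid-19", "coronavirus disease 2019"]}

# Map every lower-cased synonym back to its canonical entity and build one
# case-insensitive pattern, longest synonyms first so multi-word terms win.
term_to_id = {s.lower(): entity for entity, terms in synonyms.items() for s in terms}
pattern = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in sorted(term_to_id, key=len, reverse=True)) + r")\b",
    re.IGNORECASE,
)

def annotate(text):
    """Return (start, end, surface form, canonical entity) for every match."""
    return [(m.start(), m.end(), m.group(0), term_to_id[m.group(0).lower()])
            for m in pattern.finditer(text)]

print(annotate("Patients with covid-19 were tested for coronavirus disease 2019."))
```

Output of this kind (character offsets plus canonical entity labels) is the form typically used to build a silver standard corpus from unannotated abstracts.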

CONCLUSIONS
The toolbox can be used for a variety of text analytics tasks related to the COVID-19 crisis and has already been used to create a COVID-19 knowledge graph, study the variability and evolution of COVID-19-related terminology and develop and benchmark text mining tools.
Original language: English
Short description: Dataset
Publisher: Zenodo
DOIs
Publication status: Published - 2022 Jun 14

Subject classification (UKÄ)

  • Infectious Medicine
