Transformers and the representation of biomedical background knowledge

Oskar Wysocki, Zili Zhou, Paul O'Regan, Deborah Ferreira, Magdalena Wysocka, Dónal Landers, André Freitas

Research output: Working paper › Preprint



BioBERT and BioMegatron are Transformer models adapted to the biomedical domain using publicly available biomedical corpora. As such, they have the potential to encode large-scale biological knowledge. We investigate the encoding and representation of biological knowledge in these models, and its potential utility to support inference in cancer precision medicine - namely, the interpretation of the clinical significance of genomic alterations. We compare the performance of different Transformer baselines; we use probing to determine the consistency of encodings for distinct entities; and we use clustering methods to compare and contrast the internal properties of the embeddings for genes, variants, drugs and diseases. We show that these models do indeed encode biological knowledge, although some of it is lost in fine-tuning for specific tasks. Finally, we analyse how the models behave with regard to biases and imbalances in the dataset.
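The clustering analysis described above can be illustrated with a minimal sketch. This is a hypothetical example, not the authors' code: it uses synthetic Gaussian vectors in place of real BioBERT/BioMegatron hidden states, and scikit-learn's KMeans and silhouette score as stand-ins for whatever clustering and evaluation choices the paper actually makes.

```python
# Hypothetical sketch of comparing entity-type embeddings via clustering.
# Synthetic vectors stand in for BioBERT/BioMegatron embeddings of
# genes, variants, drugs and diseases (real inputs would come from the
# models' hidden states, which are not reproduced here).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# Four synthetic "entity types", each a Gaussian blob in embedding space.
n_per_type, dim = 50, 32
centers = rng.normal(scale=5.0, size=(4, dim))
embeddings = np.vstack(
    [center + rng.normal(size=(n_per_type, dim)) for center in centers]
)
true_types = np.repeat(np.arange(4), n_per_type)

# Cluster the pooled embeddings and measure how cleanly the clusters
# separate; a high silhouette score suggests the embedding space keeps
# the entity types internally distinct.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(embeddings)
score = silhouette_score(embeddings, kmeans.labels_)
print(f"silhouette score: {score:.3f}")
```

With real model embeddings, the same procedure could be run per layer or before/after fine-tuning to observe the loss of structure the abstract reports.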
Original language: English
Publication status: Published - 4 Feb 2022


  • cs.CL
  • cs.AI
  • cs.LG


