Medical BigBERTa: An Optimized Transformer for Long Medical Documents
  • Chérubin Mugisha,
  • Incheon Paik
The University of Aizu

Corresponding Author: [email protected]

Abstract

In this research, we propose a biomedical language model trained on publicly available biomedical datasets from Kaggle, the PubMed abstracts baseline 2019, and MIMIC-III.