SANSformers: Self-Supervised Forecasting in Electronic Health Records with Attention-Free Models
  • Yogesh Kumar,
  • Alexander Ilin,
  • Henri Salo,
  • Sangita Kulathinal,
  • Maarit K Leinonen,
  • Pekka Marttinen
Yogesh Kumar (Aalto University)

Corresponding Author: [email protected]

Abstract

Large neural networks have demonstrated success in various predictive tasks using Electronic Health Records (EHR). However, their performance in small divergent patient cohorts, such as those with rare diseases, often falls short of simpler linear models due to the substantial data requirements of large models. To address this limitation, we introduce the SANSformers architecture, specifically designed for forecasting healthcare utilization within EHR. Distinct from traditional transformers, SANSformers utilize attention-free mechanisms, thereby reducing complexity. We also present Generative Summary Pretraining (GSP), a self-supervised learning technique that enables large neural networks to maintain predictive efficiency even with smaller patient subgroups. Through extensive evaluation across two real-world datasets, we provide a comparative analysis with existing state-of-the-art EHR prediction models, offering a new perspective on predicting healthcare utilization.
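To make the abstract's central idea concrete, below is a minimal, hypothetical sketch of an attention-free sequence block. It is not the actual SANSformer layer (the paper's architecture is not specified here); it only illustrates the general idea of replacing pairwise attention with a fixed linear mixing of tokens along the sequence axis, which removes the input-dependent attention weights and the softmax step. All names (`attention_free_block`, `W_mix`, `W_ff`) are illustrative assumptions.

```python
import numpy as np

def attention_free_block(x, W_mix, W_ff):
    """Illustrative attention-free block (assumption, not the published
    SANSformer layer).

    x     : (seq_len, d_model) token embeddings
    W_mix : (seq_len, seq_len) fixed token-mixing weights, replacing
            the input-dependent attention matrix
    W_ff  : (d_model, d_model) position-wise feed-forward weights
    """
    # Mix information across the sequence with a fixed linear map
    # instead of computing softmax(QK^T / sqrt(d)) V.
    mixed = W_mix @ x

    # Residual connection around the mixing step.
    h = x + mixed

    # Position-wise feed-forward with ReLU, applied to each token.
    ff = np.maximum(h @ W_ff, 0.0)

    # Second residual connection.
    return h + ff

# Tiny usage example: 4 tokens, embedding dimension 8.
x = np.ones((4, 8))
W_mix = 0.1 * np.eye(4)   # each token attends only to itself, scaled
W_ff = np.eye(8)
out = attention_free_block(x, W_mix, W_ff)
```

Because the mixing weights are fixed rather than computed from the input, such a block avoids the attention mechanism entirely, which is one way an architecture in this family can reduce complexity relative to a standard transformer.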