TechRxiv
MeditCom2021_SimonBos_CameraReady.pdf (129.96 kB)

Avoiding normalization uncertainties in deep learning architectures for end-to-end communication

preprint
posted on 2021-07-19, 15:26 authored by Simon Bos, Evgenii Vinogradov, Sofie Pollin
Recently, deep learning has been considered for optimizing the end-to-end performance of digital communication systems. Learning a digital communication scheme from data is attractive because it makes the scheme adaptable and precisely tunable to many scenarios and channel models. In this paper, we analyse a widely used neural network architecture and show that training the end-to-end architecture suffers from normalization errors introduced by an average power constraint. To solve this issue, we propose a modified architecture: shifting the batch slicing after the normalization layer. This approach meets the normalization constraint better, especially for small batch sizes. Finally, we experimentally demonstrate that our modified architecture leads to significantly improved performance of trained models, even for large batch sizes, where the normalization constraint is more easily met.
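The effect described in the abstract can be illustrated numerically. The sketch below is not the paper's code; it is a minimal NumPy illustration under the assumption that the encoder output is normalized to unit average power. When the normalization factor is computed over a small slice of symbols, that factor fluctuates from slice to slice; computing it once over the full set before slicing (as in the proposed modification) yields a single, stable factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder outputs: i.i.d. complex symbols with unit average
# power in expectation (stand-in for the network's constellation points).
symbols = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) / np.sqrt(2)

def norm_factor(x):
    # Scaling factor that enforces an average symbol power of 1 over x.
    return 1.0 / np.sqrt(np.mean(np.abs(x) ** 2))

# Slice first, then normalize each slice (original ordering):
# the factor varies noticeably for small slices.
small_scales = [norm_factor(s) for s in np.split(symbols, 256)]  # 16 symbols each

# Normalize once over all symbols, then slice (modified ordering):
# one stable factor close to 1 applies to every slice.
full_scale = norm_factor(symbols)

print(np.std(small_scales))  # spread of per-slice factors
print(full_scale)            # single factor, close to 1
```

The spread of `small_scales` shrinks as the slice size grows, which mirrors the paper's observation that the normalization errors matter most for small batch sizes.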

Funding

European Union's Horizon 2020, Grant agreement ID: 101017171

History

Email Address of Submitting Author

simon.bos@kuleuven.be

ORCID of Submitting Author

0000-0002-4780-1484

Submitting Author's Institution

KU Leuven

Submitting Author's Country

  • Belgium
