Deep Learning for Signal Processing with Predictions of Channel Profile, Doppler Shift and Signal-To-Noise Ratio
Preprint posted on 19.06.2021, authored by Thinh Ngo, Brian Kelley, Rad Paul
This paper proposes Deep Learning (DL) for Signal Processing, reviewing and discussing three recent DL application advancements in wireless communication that predict the channel profile, Doppler shift, and signal-to-noise ratio (SNR) of LTE and 5G systems. MATLAB simulations are performed on time-domain and frequency-domain signals emulating real wireless environments, with randomized payloads (i.e., non-data-aided operation), modulation types (QPSK, 16QAM, 64QAM), Doppler shifts (0, 50, ..., 550 Hz), and SNRs from -10 to 20 dB. Prediction accuracy is approximately 95%. The methodology consists of input diversity, which supplies multiple inputs per prediction; binary prediction, which reduces prediction complexity and uncertainty; and a hybrid convolutional neural network and long short-term memory (CNN-LSTM) model that learns features both within and across inputs. Additionally, the paper presents common lessons learned and future research directions. The designed methodology provides an effective backup scheme that enhances prediction accuracy when the traditional single-input single-output, multi-class prediction scheme underperforms. Furthermore, the review aims to extend the methodology's success in these three applications toward a universal DL prediction methodology for wireless communications (i.e., DL for Signal Processing) and other domains.
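The abstract's "input diversity" and "binary prediction" ideas can be illustrated with a minimal sketch. The function and variable names below are our own, not the paper's, and the score-fusion-by-averaging and per-class thresholding shown here are assumptions about one plausible realization: several independent inputs each yield per-class scores over the candidate Doppler shifts (0, 50, ..., 550 Hz), the scores are fused, and the 12-way decision is decomposed into one yes/no decision per class.

```python
import numpy as np

# Candidate Doppler-shift classes from the abstract: 0, 50, ..., 550 Hz.
DOPPLER_CLASSES = np.arange(0, 551, 50)

def fuse_inputs(per_input_scores):
    # Input diversity (assumed realization): combine softmax-like scores
    # from several independent inputs by averaging across inputs.
    return np.mean(np.asarray(per_input_scores), axis=0)

def binary_votes(fused_scores, threshold=0.5):
    # Binary prediction (assumed realization): one yes/no decision per
    # candidate class instead of a single 12-way multi-class decision.
    return fused_scores >= threshold

def predict_doppler(per_input_scores):
    fused = fuse_inputs(per_input_scores)
    predicted_hz = int(DOPPLER_CLASSES[np.argmax(fused)])
    return predicted_hz, binary_votes(fused)

# Toy scores from three inputs; class index 2 (100 Hz) dominates.
scores = np.full((3, 12), 0.05)
scores[:, 2] = [0.9, 0.8, 0.7]
pred_hz, votes = predict_doppler(scores)
print(pred_hz)        # 100
print(int(votes.sum()))  # 1 (only the 100 Hz class passes the threshold)
```

Averaging before thresholding is what makes this a backup scheme in spirit: a single noisy input may vote for the wrong class, but fused scores from several inputs are more likely to clear the binary threshold only for the true class.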