Deep Learning Forecasting and Statistical Modeling for Q/V-Band LEO Satellite Channels
This paper presents a practical approach to Q/V-band channel modeling for low Earth orbit (LEO) satellite links based on tools from machine learning and statistical modeling. The developed Q/V-band LEO satellite channel model is presented in two parts: (i) a real-time forecasting method using model-based deep learning, intended for the real-time operation of satellite terminals, and (ii) a statistical channel simulator that generates the path loss as a time-series random process, intended for system design and research. The approach capitalizes on real satellite measurements obtained from AlphaSat's Q/V-band transmitter at different geographic latitudes to model the radio channel. The results show that model-based deep learning forecasting can outperform conventional statistically derived prediction methods for varying rain and elevation-angle profiles, and that it achieves higher long-term prediction accuracy than current state-of-the-art machine learning approaches for radio channel prediction. The statistical channel simulator is shown to produce synthetic excess path loss values for varying satellite passes by capitalizing on empirical statistical models obtained from real measurements.
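For illustration, the simulator described above (path loss generated as a time-series random process) admits a minimal sketch. The snippet below assumes a first-order Gauss-Markov (AR(1)) process for the excess path loss in dB, a common construction for rain-attenuation time series (e.g., Maseng-Bakken style models); the function name `synth_excess_path_loss` and the parameters `mean_db`, `std_db`, and `beta` are hypothetical placeholders, not the paper's fitted empirical model.

```python
import numpy as np

def synth_excess_path_loss(n_steps, dt=1.0, mean_db=1.2, std_db=0.8,
                           beta=1e-2, seed=None):
    """Generate a synthetic excess-path-loss time series (dB) as a
    first-order Gauss-Markov (AR(1)) random process.

    mean_db, std_db, and beta are illustrative placeholders; in the
    paper's setting they would be fitted per satellite pass (elevation
    angle and rain profile) from measured statistics.
    """
    rng = np.random.default_rng(seed)
    rho = np.exp(-beta * dt)  # one-step autocorrelation coefficient
    x = np.empty(n_steps)
    x[0] = rng.normal()
    for k in range(1, n_steps):
        # AR(1) recursion; the noise scaling keeps the latent process
        # stationary with unit variance
        x[k] = rho * x[k - 1] + np.sqrt(1.0 - rho**2) * rng.normal()
    # Map the zero-mean, unit-variance latent process to dB statistics
    return mean_db + std_db * x

# Example: one 10-minute pass sampled at 1 s
loss_db = synth_excess_path_loss(600, dt=1.0, seed=42)
```

In practice, the mean, standard deviation, and dynamic rate would be derived from empirical statistics of the measured channel rather than the fixed defaults used here.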