SceDL: A Simultaneous Compression and Encryption Scheme for Deep Learning Models
  • Nivedita Shrivastava, IIT Delhi
  • Smruti Ranjan Sarangi

Corresponding author: [email protected]


Efficiently securing and compressing neural network models is a problem of significant interest, given the popularity of such models in machine learning and computer vision applications such as industrial automation, autonomous vehicles, surveillance, and medical imaging. Compact, protected models are needed both on resource-constrained edge devices and on cloud-based servers, because these models embody valuable intellectual property that must be protected. Traditional encryption ciphers can provide strong security guarantees, but their resource requirements are prohibitive for resource-constrained devices. In this paper, we present a simultaneous compression and encryption scheme for deep learning models, in which the model weights are encrypted using chaotic maps. We claim that combining multiple chaotic maps with a lossless compression method yields not only an efficient encryption scheme but also compresses the models effectively in a hardware-friendly manner. Our scheme reduces model storage overheads by 1.51× compared to the closest competing work; additionally, it is 70% faster and provides much stronger security guarantees.
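To illustrate the general idea of chaotic-map-based weight encryption mentioned above, the sketch below uses a logistic map to derive a keystream that is XORed with quantized weight bytes. This is a minimal, hypothetical illustration: the abstract does not specify which maps, parameters, or compression method the paper uses, so the logistic map, the parameter `r = 3.99`, the seed `x0`, and the byte-extraction step are all illustrative assumptions, not the authors' actual scheme.

```python
# Hypothetical sketch of chaotic-map encryption of quantized weights.
# The specific map (logistic), its parameter r, the seed x0, and the
# byte-extraction rule are illustrative assumptions; the paper's actual
# scheme (multiple maps + lossless compression) is not detailed here.

def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Generate n keystream bytes from the logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)            # one chaotic iteration, x stays in (0, 1)
        out.append(int(x * 256) % 256)   # quantize the state to a single byte
    return bytes(out)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(d ^ k for d, k in zip(data, key))

# Stand-in for a few quantized model weights (bytes).
weights = bytes([12, 250, 7, 99, 42])
ks = logistic_keystream(x0=0.3141, r=3.99, n=len(weights))
cipher = xor_bytes(weights, ks)
recovered = xor_bytes(cipher, ks)        # XOR with the same keystream inverts it
assert recovered == weights
```

Because the keystream depends sensitively on the seed `x0`, even a tiny change in the key produces a completely different ciphertext, which is the property chaotic-map ciphers rely on; the lossless compression step described in the paper would be applied alongside this encryption and is omitted here.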