Deep Learning for Channel Coding via Neural Mutual Information Estimation
Fritschek, Rick ; Schaefer, Rafael F. ; Wunder, Gerhard
arXiv, Volume 2019 (2019), no. 0
End-to-end deep learning for communication systems, i.e., systems whose encoder and decoder are learned, has attracted significant interest recently, owing to performance that comes close to that of well-developed classical encoder-decoder designs. However, one drawback of current learning approaches is that a differentiable channel model is needed to train the underlying neural networks. In real-world scenarios, such a channel model is rarely available, and often the channel density is not known at all. Some works therefore focus on a generative approach, i.e., generating the channel from samples, or rely on reinforcement learning to circumvent this problem. We present a novel approach which utilizes a recently proposed neural estimator of mutual information. We use this estimator to optimize the encoder for maximized mutual information, relying only on channel samples. Moreover, we show that our approach achieves the same performance as state-of-the-art end-to-end learning with perfect channel model knowledge.
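The core idea in the abstract, training an encoder by maximizing a neural lower bound on the mutual information between channel input and channel output, can be sketched roughly as below. This is a minimal illustration of the general MINE-style (Donsker-Varadhan) approach under assumed conditions: a toy AWGN channel used only to draw samples, random vectors as stand-ins for message embeddings, and illustrative layer sizes and hyperparameters. It is not the authors' actual architecture or training procedure.

import math
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Maps a message representation to a channel-input codeword.
    def __init__(self, k=4, n=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(k, 32), nn.ReLU(), nn.Linear(32, n))

    def forward(self, m):
        x = self.net(m)
        return x / x.norm(dim=1, keepdim=True)  # illustrative power normalization

class StatisticsNet(nn.Module):
    # Statistics network T(x, y) used in the Donsker-Varadhan lower bound.
    def __init__(self, n=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * n, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1))

def dv_bound(T, x, y):
    # I(X;Y) >= E_{p(x,y)}[T(x,y)] - log E_{p(x)p(y)}[exp(T(x,y))]
    joint = T(x, y).mean()
    y_shuffled = y[torch.randperm(y.size(0))]   # shuffling breaks the pairing -> product of marginals
    marginal = torch.logsumexp(T(x, y_shuffled), dim=0) - math.log(y.size(0))
    return joint - marginal

enc, stats = Encoder(), StatisticsNet()
opt = torch.optim.Adam(list(enc.parameters()) + list(stats.parameters()), lr=1e-3)

for step in range(2000):
    m = torch.randn(256, 4)              # stand-in for message embeddings (one-hot messages in practice)
    x = enc(m)                           # codewords
    y = x + 0.5 * torch.randn_like(x)    # channel samples; toy AWGN stand-in, an assumption of this sketch
    loss = -dv_bound(stats, x, y)        # ascend the mutual-information lower bound
    opt.zero_grad(); loss.backward(); opt.step()

In this sketch the encoder and the statistics network are updated jointly on batches of channel samples; the estimator itself needs only input-output sample pairs, which is the property the abstract emphasizes.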
Published: 2019-03-07
Classification:  Computer Science - Information Theory,  Computer Science - Machine Learning
@article{1903.02865,
     author = {Fritschek, Rick and Schaefer, Rafael F. and Wunder, Gerhard},
     title = {Deep Learning for Channel Coding via Neural Mutual Information
  Estimation},
     journal = {arXiv},
     volume = {2019},
     number = {0},
     year = {2019},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1903.02865}
}
Fritschek, Rick; Schaefer, Rafael F.; Wunder, Gerhard. Deep Learning for Channel Coding via Neural Mutual Information Estimation. arXiv, Volume 2019 (2019), no. 0. http://gdmltest.u-ga.fr/item/1903.02865/