Shannon Theory
Capacity Computation

Capacity of Continuous Channels with Memory via Directed Information Neural Estimator

Ziv Aharoni, Dor Tsur, Haim Permuter, Ziv Goldfeld


Calculating the capacity (with or without feedback) of channels with memory and continuous alphabets is a challenging task. It requires optimizing the directed information (DI) rate over all channel input distributions. The objective is a multi-letter expression whose analytic solution is known only for a few specific cases. When no analytic solution is available or the channel model is unknown, there is no unified framework for calculating or even approximating capacity. This work proposes a novel capacity estimation algorithm that treats the channel as a 'black box', whether or not feedback is present. The algorithm has two main ingredients: (i) a neural distribution transformer (NDT) model that shapes a noise variable into the channel input distribution, from which we can sample, and (ii) the DI neural estimator (DINE) that estimates the communication rate of the current NDT model. These models are trained by an alternating maximization procedure to both estimate the channel capacity and obtain an NDT for the optimal input distribution. The method is demonstrated on the moving average additive Gaussian noise channel, where it is shown that both the capacity and the feedback capacity are estimated without knowledge of the channel transition kernel. The proposed estimation framework opens the door to a myriad of capacity approximation results for continuous alphabet channels that were inaccessible until now.
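The alternating scheme described above can be sketched in a deliberately simplified form. Everything in this snippet is illustrative, not the authors' method: the channel is reduced to a memoryless AWGN channel, so the DI rate collapses to a mutual information with a closed form that stands in for DINE; the "NDT" is a single scaling parameter `g` shaping a standard-normal noise variable; and the names `P`, `noise_var`, `ndt`, and `di_rate` are hypothetical placeholders for the neural models in the paper.

```python
import math

P = 1.0          # input power constraint (assumption for this toy example)
noise_var = 1.0  # channel noise variance (assumption)

def ndt(g, z):
    # Toy "NDT": shapes a standard-normal sample z into the channel input x = g*z,
    # so the input distribution is Gaussian with variance g^2.
    return g * z

def di_rate(g):
    # Stand-in for DINE: for the memoryless AWGN channel y = x + n, the DI rate
    # equals the mutual information, which has the closed form below (in nats).
    return 0.5 * math.log(1.0 + (g * g) / noise_var)

# "Alternating" maximization degenerates here to projected gradient ascent on g,
# since the rate estimator is exact rather than a trained network.
g, lr = 0.1, 0.05
for _ in range(500):
    grad = g / (noise_var + g * g)   # d/dg of di_rate(g)
    g = min(g + lr * grad, math.sqrt(P))  # project back onto the power constraint

capacity_est = di_rate(g)  # converges to 0.5*log(1 + P/noise_var) nats
```

In the actual algorithm both ingredients are neural: the DI estimate is itself maximized over DINE's parameters in one step, and over the NDT's parameters in the alternating step, which is what this closed-form toy elides.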


Ziv Aharoni

Ben-Gurion University of the Negev

Dor Tsur

Ben-Gurion University of the Negev

Haim Permuter

Ben-Gurion University of the Negev

Ziv Goldfeld

Cornell University

Session Chair

Tobias Koch

Universidad Carlos III de Madrid