PROBLEMS OF INFORMATION TRANSMISSION
A translation of Problemy Peredachi Informatsii
On the Reliability Function for a BSC with
Noisy Feedback
M. V. Burnashev and H. Yamamoto
pp. 103–121
Abstract: A binary symmetric channel is used for information transmission. There is also another noisy binary symmetric channel (feedback channel), and the transmitter observes without delay all outputs of the forward channel via the feedback channel. Transmission of an exponential number of messages is considered (i.e., the transmission rate is positive). The achievable decoding error exponent for this combination of channels is studied. It is shown that if the crossover probability of the feedback channel is less than a certain positive value, then the achievable error exponent is better than the decoding error exponent of a channel without feedback.
On Computation of Information via Variation and
Inequalities for the Entropy Function
V. V. Prelov
pp. 122–126
Abstract: This paper supplements the author's paper [1]. We obtain an explicit formula which in a special case allows us to calculate the maximum of mutual information of several random variables via the variational distance between the joint distribution of these random variables and the product of their marginal distributions. We establish two new inequalities for the binary entropy function, which are related to the problem considered here.
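For reference, the quantities named in the abstract have the following standard definitions (generic notation, not necessarily the paper's):

```latex
\[
h(p) = -p\log p - (1-p)\log(1-p), \qquad 0 \le p \le 1,
\]
\[
V(P,Q) = \sum_x \bigl| P(x) - Q(x) \bigr|,
\]
\[
I(X_1;\dots;X_n) = D\bigl( P_{X_1\cdots X_n} \,\big\|\, P_{X_1}\times\cdots\times P_{X_n} \bigr),
\]
```

where $h$ is the binary entropy function, $V$ is the variational distance between distributions $P$ and $Q$, and the (multi-)information of $X_1,\dots,X_n$ is the divergence between their joint distribution and the product of their marginals.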
Multitrial Decoding of Concatenated Codes Using
Fixed Thresholds
C. Senger, V. R. Sidorenko, M. Bossert, and V. V. Zyablov
pp. 127–141
Abstract: For decoding concatenated codes up to half their designed distance, generalized minimum distance (GMD) decoding can be used. GMD decoding applies multitrial error/erasure decoding of the outer code, where erased symbols depend on some reliability measure stemming from the inner decoders. We consider the case where the outer decoder is able to decode beyond half the minimum distance of the outer code. For a given number of outer decoding trials, we derive achievable decoding radii for GMD decoding. Conversely, we give a lower bound on the number of outer decoding trials required to attain the greatest possible decoding radius.
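As a rough illustration of the multitrial idea (a generic sketch, not the authors' construction), the following Python fragment runs a fixed number of error/erasure decoding trials, erasing progressively more of the least reliable symbols. Here `decoder` stands for a hypothetical outer error/erasure decoder supplied by the caller, and the erasing schedule (2t erasures in trial t) is one common choice, since a bounded-distance decoder pays half as much for an erasure as for an error.

```python
def gmd_multitrial(received, reliabilities, decoder, num_trials):
    """Multitrial GMD decoding sketch (illustrative only).

    received      -- list of received outer-code symbols
    reliabilities -- one reliability value per symbol (larger = more reliable)
    decoder       -- hypothetical error/erasure decoder: takes a word with
                     erased positions set to None, returns a codeword or None
    num_trials    -- number of outer decoding trials

    Trial t erases the 2*t least reliable positions and returns the first
    codeword that any trial produces.
    """
    # positions sorted from least to most reliable
    order = sorted(range(len(received)), key=lambda i: reliabilities[i])
    for t in range(num_trials):
        erased = set(order[:2 * t])
        trial = [None if i in erased else s for i, s in enumerate(received)]
        candidate = decoder(trial)
        if candidate is not None:
            return candidate
    return None
```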
Asymptotic Estimation of the Fraction of Errors
Correctable by $q$-ary LDPC Codes
A. A. Frolov and V. V. Zyablov
pp. 142–159
Abstract: We consider an ensemble of random $q$-ary LDPC codes. As constituent codes, we use $q$-ary single-parity-check codes with $d=2$ and Reed–Solomon codes with $d=3$. We propose a hard-decision iterative decoding algorithm with the number of iterations of the order of the logarithm of the code length. We show that under this decoding algorithm there are codes in the ensemble with the number of correctable errors linearly growing with the code length. We weaken a condition on the vertex expansion of the Tanner graph corresponding to the code.
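The flavor of hard-decision iterative decoding can be seen in its simplest binary analogue, Gallager-style bit flipping (the paper itself treats $q$-ary constituent codes; this sketch is only an illustration of the hard-decision iterative principle):

```python
def bit_flip_decode(H, received, max_iters):
    """Hard-decision bit-flipping decoding for a binary LDPC code
    (binary analogue, illustrative only).

    H         -- parity-check matrix as a list of 0/1 rows
    received  -- hard-decision received vector (list of 0/1)
    max_iters -- iteration cap (the paper uses O(log n) iterations)
    """
    w = list(received)
    n = len(w)
    # which checks each bit participates in
    checks_of = [[j for j, row in enumerate(H) if row[i]] for i in range(n)]
    for _ in range(max_iters):
        syndrome = [sum(row[i] * w[i] for i in range(n)) % 2 for row in H]
        if not any(syndrome):
            return w  # all parity checks satisfied
        # flip every bit for which unsatisfied checks form a strict majority
        for i in range(n):
            unsat = sum(syndrome[j] for j in checks_of[i])
            if 2 * unsat > len(checks_of[i]):
                w[i] ^= 1
    return w  # may still violate some checks if max_iters was reached
```

Expander-type arguments of the kind mentioned in the abstract are what guarantee that each iteration of such an algorithm strictly reduces the number of errors, so that a logarithmic number of iterations suffices.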
Large Deviations for Distributions of Sums of
Random Variables: Markov Chain Method
V. R. Fatalov
pp. 160–183
Abstract: Let $\{\xi_k\}_{k=0}^\infty$ be a sequence of i.i.d. real-valued random variables, and let $g(x)$ be a continuous positive function. Under rather general conditions, we prove results on sharp asymptotics of the probabilities $\mathbf{P}\bigl\{\frac{1}{n}\sum_{k=0}^{n-1} g(\xi_k) < d\bigr\}$ as $n\to\infty$, and also of their conditional versions. The results are obtained by a new method developed in the paper, namely, the Laplace method for sojourn times of discrete-time Markov chains. We consider two examples: standard Gaussian random variables with $g(x)=|x|^p$, $p>0$, and exponential random variables with $g(x)=x$ for $x\ge 0$.
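For context, a standard large-deviations fact (Cramér's theorem, not the paper's sharp result): when $d$ lies below $\mathbf{E}\,g(\xi_0)$ and the exponential moment below is finite for some $\lambda<0$, the rough (logarithmic) asymptotics are

```latex
\[
\mathbf{P}\Bigl\{\frac{1}{n}\sum_{k=0}^{n-1} g(\xi_k) < d\Bigr\}
  = e^{-n\Lambda^*(d) + o(n)},
\qquad
\Lambda^*(d) = \sup_{\lambda \le 0}
  \bigl(\lambda d - \log \mathbf{E}\, e^{\lambda g(\xi_0)}\bigr).
\]
```

The paper goes further than this logarithmic estimate, establishing sharp asymptotics of the probabilities themselves.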
On Ergodic Algorithms in Random Multiple Access
Systems with Success-Failure Feedback
A. M. Turlikov and S. G. Foss
pp. 184–200
Abstract: We consider a decentralized multiple access system with binary success-failure feedback. We introduce a family of algorithms (protocols) called algorithms with delayed intervals and study stability conditions for one of them. We then discuss some numerical results and a number of related open problems and conjectures.
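To make the feedback model concrete, here is a toy simulation of decentralized slotted access with binary success/failure feedback (a generic illustration with a hypothetical fixed transmission probability, not the delayed-interval algorithm of the paper):

```python
import random

def simulate_slotted_access(num_users, tx_prob, num_slots, seed=0):
    """Toy decentralized slotted multiple access with binary feedback.

    In each slot every backlogged user transmits independently with
    probability tx_prob; the shared feedback is 'success' iff exactly one
    user transmitted, and 'failure' otherwise (both collisions and empty
    slots are reported as failures).  Returns the empirical throughput
    (successes per slot).
    """
    rng = random.Random(seed)
    successes = 0
    for _ in range(num_slots):
        transmitters = sum(1 for _ in range(num_users) if rng.random() < tx_prob)
        if transmitters == 1:
            successes += 1
    return successes / num_slots
```

With $N$ users transmitting with probability $p$, the per-slot success probability is $Np(1-p)^{N-1}$, which is maximized near $p = 1/N$; the difficulty studied in the paper is that with only success/failure feedback the users cannot distinguish collisions from empty slots when adapting their behavior.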