PROBLEMS OF INFORMATION TRANSMISSION
A translation of Problemy Peredachi Informatsii


Volume 32, Number 1, January–March, 1996

CONTENTS

 

Review of Scientific Achievements of M. S. Pinsker
pp. 3–14

 

Queueing System with Selection of the Shortest of Two Queues: An Asymptotic Approach
N. D. Vvedenskaya, R. L. Dobrushin, and F. I. Karpelevich
pp. 15–27

Abstract—Consider a system that consists of $N$ servers with a Poisson input flow of demands of intensity $N \lambda$. Each demand arriving at the system randomly selects two servers and is instantly sent to the one with the shorter queue. The service time is exponentially distributed with mean $1$. It turns out that for $\lambda <1$ it is possible to investigate the asymptotic distribution of the queue lengths as $N \rightarrow \infty$. In the limit, the queue-length probability decreases superexponentially as the queue length increases.
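The selection rule described in this abstract (now widely known as the "power of two choices") can be simulated directly. A minimal sketch, with all function and parameter names my own: the continuous-time chain is driven by a race between the arrival stream of rate $N\lambda$ and the unit-rate services of the currently busy servers, and the length of each joined queue is sampled after a warm-up period.

```python
import random

def simulate(n_servers=100, lam=0.7, t_end=2000.0, seed=1):
    """Simulate join-the-shortest-of-two-queues with Exp(1) service.

    Arrivals form a Poisson process of rate n_servers * lam; each arrival
    picks two servers uniformly at random (with replacement, for simplicity)
    and joins the shorter queue.  By memorylessness, the next event is an
    exponential race between the arrival rate and the number of busy servers.
    Returns the empirical distribution of the joined queue's length
    (post-arrival), sampled after a warm-up of t_end / 2.
    """
    rng = random.Random(seed)
    queues = [0] * n_servers
    t, counts, samples = 0.0, {}, 0
    arr_rate = n_servers * lam
    while t < t_end:
        busy = sum(1 for q in queues if q > 0)   # total service rate
        total = arr_rate + busy
        t += rng.expovariate(total)
        if rng.random() < arr_rate / total:      # arrival event
            i, j = rng.randrange(n_servers), rng.randrange(n_servers)
            k = i if queues[i] <= queues[j] else j
            queues[k] += 1
            if t > t_end / 2:                    # sample after warm-up
                counts[queues[k]] = counts.get(queues[k], 0) + 1
                samples += 1
        else:                                    # departure: uniform busy server
            b = rng.randrange(busy)
            for idx, q in enumerate(queues):
                if q > 0:
                    if b == 0:
                        queues[idx] -= 1
                        break
                    b -= 1
    return {k: v / samples for k, v in sorted(counts.items())}
```

For moderate $N$ the sampled distribution already falls off much faster than the geometric $\lambda^k$ tail of independent M/M/1 queues, consistent with the superexponential decay stated in the abstract.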

 

Inequalities and Algorithms for Universal Data Compression
J. Ziv
pp. 28–32

Abstract—We describe nonasymptotic uniform bounds on the performance of data-compression algorithms in cases where the length $N$ of the training sequence (“history”) available to the encoder is not large enough to yield the ultimate compression, namely, the entropy of the source. Two characteristic ultimate goals are considered: the $\ell$th-order entropy $H(X_1^\ell )$ and the associated conditional entropy $H(X_1^{\ell-k}\,|\, X_{-k+1}^0)$. The bounds are based on classical information-theoretic convexity arguments. Nevertheless, it is demonstrated that convexity arguments that work in some cases are totally useless in others. Furthermore, these classical convexity arguments, when properly used, lead to efficient universal data-compression algorithms for each of the cases considered. We limit the discussion to fixed-to-variable-length, uniquely decipherable codes.
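The target quantity $H(X_1^\ell)$ can be estimated empirically from a training sequence. A minimal illustration (the function and its names are my own, not the paper's), estimating the per-symbol $\ell$th-order entropy of a memoryless binary source from overlapping blocks:

```python
import random
from collections import Counter
from math import log2

def block_entropy_rate(seq, ell):
    """Empirical estimate of H(X_1^ell) / ell in bits per symbol,
    computed from overlapping length-ell blocks of a training sequence."""
    blocks = Counter(tuple(seq[i:i + ell]) for i in range(len(seq) - ell + 1))
    n = sum(blocks.values())
    return -sum((c / n) * log2(c / n) for c in blocks.values()) / ell

rng = random.Random(0)
fair = [rng.randrange(2) for _ in range(100000)]  # i.i.d. fair bits: true entropy 1 bit/symbol
print(block_entropy_rate(fair, 1))
print(block_entropy_rate(fair, 4))
```

With a short history the empirical block counts become unreliable, which is exactly the regime the paper's nonasymptotic bounds address.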

 

Authentication, Identification, and Pairwise Separated Measures
L. A. Bassalygo and M. V. Burnashev
pp. 33–39

Abstract—A certain equivalence between the authentication and identification problems is shown. A new upper bound is obtained on the maximum number of probability measures on a finite alphabet that are pairwise separated in the $L_1$ metric; this bound is at the same time an upper bound for both problems.

 

Almost Independence and Secrecy Capacity
I. Csiszár
pp. 40–47

Abstract—In information-theoretic models of secure transmission or cooperative secret-key generation, the usual secrecy criterion is per-letter mutual information. For familiar models, the same secrecy capacity is shown to be achievable under a much stronger criterion. The main tool is a general theorem about a function of a random variable that is almost uniformly distributed on a large set and almost independent of another random variable, a consequence of a recent result of Ahlswede and the author on robust uniform randomness.

 

Bounds on Distances for Unit Memory Concatenated Codes
V. Zyablov, J. Justesen, C. Thommesen, and S. Shavgulidze
pp. 48–57

Abstract—Two schemes of concatenated coding based on an outer unit-memory code are considered. Constructive lower bounds are obtained for the free distance, as well as for the extended row, column, and segment distances. It is proved that the random ensemble of unit-memory concatenated codes contains codes that asymptotically meet the best known bounds on distances for unit-memory codes.

 

On Parameter Estimation from Indirect Observations
A. V. Skorokhod and R. Z. Khasminskii
pp. 58–68

Abstract—We consider the problem of parameter estimation in the situation where the observation $Y_\varepsilon(t)$ is the sum of a known function $\Phi$ of an unobservable process $X_\varepsilon(t)$ and Gaussian white noise of small intensity $\varepsilon$. The process $X_\varepsilon(t)$ is in turn the sum of a signal depending on an unknown parameter $\theta\in \mathbb{R}^d$ and a Gaussian process with a small correlation operator. The property of local asymptotic normality (LAN), lower and upper bounds on the estimation quality, and the construction of an asymptotically efficient estimator are established for this model.

 

The Strong Converse for Source Coding with a Fidelity Criterion
T. S. Han and H. Oishi
pp. 69–77

Abstract—A source coding problem with a fidelity criterion, where the sources are discrete, stationary, and memoryless, is considered at low rates. The reliability function for source coding without a fidelity criterion has been determined by Jelinek and by Csiszár and Longo for rates either above or below the source entropy. On the other hand, the reliability function for source coding with a fidelity criterion at rates above the rate-distortion function has been determined by Marton. The present paper determines the reliability function for source coding with a fidelity criterion at lower rates, thereby establishing the strong converse. It is also shown that this reliability function can be attained by a universal encoder.

 

Channel Assignment in Cellular Networks
S. P. Fedortsov and B. S. Tsybakov
pp. 78–85

Abstract—In cellular radio networks, spectrum utilization can be increased by using dense frequency-reuse schemes. Previously known reuse schemes were developed for base stations with circular service areas. In this paper, we suggest three methods of constructing static frequency-reuse schemes for base stations with more general service areas. The methods are compared with each other and with known results on graph coloring. The suggested methods can be used in designing effective channel-assignment schemes for cellular networks.

 

The Exponential Distribution in Information Theory
S. Verdú
pp. 86–95

Abstract—It is shown that the exponential distribution leads to information-theoretic formulas that are strikingly similar to their Gaussian counterparts.
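One instance of the Gaussian/exponential parallel alluded to here (these are standard maximum-entropy facts, not a statement of the paper's specific formulas): the Gaussian maximizes differential entropy under a variance constraint, while the exponential maximizes it under a mean constraint on the nonnegative half-line, with closely analogous closed forms:

```latex
h\bigl(\mathcal{N}(0,\sigma^2)\bigr) = \tfrac{1}{2}\log\bigl(2\pi e\,\sigma^2\bigr),
\qquad
h\bigl(\mathrm{Exp}(\mu)\bigr) = \log\bigl(e\,\mu\bigr),
```

where $\mathrm{Exp}(\mu)$ denotes the exponential density with mean $\mu$, supported on $[0,\infty)$.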

 

Typicality of a Good Rate-Distortion Code
A. Kanlis, S. Khudanpur, and P. Narayan
pp. 96–103

Abstract—We consider a good code for a discrete memoryless source with a specified distortion level to be one whose rate is close to the corresponding rate-distortion function and which, with large probability, reproduces the source within the allowed distortion level. We show that any good code must contain an exponentially large set of codewords, of effectively the same rate, which are all typical with respect to the output distribution induced by the rate-distortion-achieving channel. Furthermore, the output distribution induced by a good code is asymptotically singular with respect to the i.i.d. output distribution induced by the rate-distortion-achieving channel. However, the normalized (Kullback–Leibler) divergence between these output distributions converges to the conditional entropy of the output under the rate-distortion-achieving channel.

 

On the Theory of List Decoding of Convolutional Codes
K. Sh. Zigangirov and R. Johannesson
pp. 104–111

Abstract—Lower bounds on the list minimum weight and list weight are obtained for ensembles of systematic and nonsystematic encoders. From these bounds it follows that the necessary list size grows exponentially with the number of corrected errors. We obtain an expurgated upper bound on the probability of the correct path loss for the ensemble of time-invariant binary convolutional codes. The notion of an $l$-list weight enumerator is introduced and used to obtain an upper bound on the probability of the correct path loss for list decoding of a fixed convolutional code.

 

Causal Interpretations of Random Variables
J. L. Massey
pp. 112–116

Abstract—A causal interpretation of random variables corresponds to the successive generation of these random variables by a sequence of random experiments, each of which uses the results of previous experiments only. Causality graphs are introduced to describe such a causal representation. It is shown that although the order of the random variables in a causal interpretation is completely arbitrary, causality graphs are nonetheless useful in deducing independences among random variables.
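The kind of deduction the abstract describes can be checked exactly on a toy causal interpretation (all distributions below are my own illustrative choices, not the paper's): $X$ is generated first, then $Y$ and $Z$ are each generated from $X$ with independent noise, so the causality graph has edges $X \to Y$ and $X \to Z$ only.

```python
from itertools import product
from math import log2

def mutual_information(pxy):
    """I(U;V) in bits from a joint pmf given as a dict {(u, v): p}."""
    pu, pv = {}, {}
    for (u, v), p in pxy.items():
        pu[u] = pu.get(u, 0) + p
        pv[v] = pv.get(v, 0) + p
    return sum(p * log2(p / (pu[u] * pv[v])) for (u, v), p in pxy.items() if p > 0)

# Causal generation order: X first, then Y = X xor A, then Z = X xor B,
# with X, A, B mutually independent (a hypothetical toy example).
p_x = {0: 0.5, 1: 0.5}
p_a = {0: 0.8, 1: 0.2}   # noise driving Y
p_b = {0: 0.7, 1: 0.3}   # noise driving Z

joint = {}               # exact pmf of (x, y, z) by enumeration
for x, a, b in product([0, 1], repeat=3):
    key = (x, x ^ a, x ^ b)
    joint[key] = joint.get(key, 0) + p_x[x] * p_a[a] * p_b[b]

# Marginally, Y and Z are dependent: X is a common cause.
pyz = {}
for (x, y, z), p in joint.items():
    pyz[(y, z)] = pyz.get((y, z), 0) + p
print(mutual_information(pyz) > 1e-6)

# Given X, the graph implies Y and Z are independent: I(Y;Z|X) = 0.
i_cond = 0.0
for xv in (0, 1):
    cond = {(y, z): p / p_x[xv] for (x, y, z), p in joint.items() if x == xv}
    i_cond += p_x[xv] * mutual_information(cond)
print(abs(i_cond) < 1e-12)
```

The exact computation confirms what the graph lets one read off without computing anything: a common cause creates marginal dependence, while conditioning on it restores independence.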

 

On Fourier-Invariant Partitions of Finite Abelian Groups and the MacWilliams Identity for Group Codes
V. A. Zinoviev and T. Ericson
pp. 117–122

Abstract—The concept of an $F$-partition of a finite Abelian group is introduced. For such $F$-partitions, the MacWilliams identity for an arbitrary group code is established.
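The classical special case being generalized here, the binary MacWilliams identity $W_{C^\perp}(x,y) = |C|^{-1}\, W_C(x+y,\,x-y)$, can be verified by brute force for a small code (a sketch only; the paper's setting of finite Abelian groups and $F$-partitions is more general):

```python
from itertools import product

def weight_enum(code, x, y):
    """Evaluate W_C(x, y) = sum over codewords c of x^(n - wt(c)) * y^wt(c)."""
    n = len(next(iter(code)))
    return sum(x ** (n - sum(c)) * y ** sum(c) for c in code)

def dual(code, n):
    """Brute-force dual of a binary linear code of length n."""
    return {v for v in product([0, 1], repeat=n)
            if all(sum(vi * ci for vi, ci in zip(v, c)) % 2 == 0 for c in code)}

# The [3,1] repetition code; its dual is the even-weight code of length 3.
C = {(0, 0, 0), (1, 1, 1)}
Cd = dual(C, 3)

# MacWilliams identity: W_{C^perp}(x, y) = (1/|C|) * W_C(x + y, x - y),
# checked numerically at a few sample points.
for x, y in [(1.0, 0.5), (2.0, 3.0), (1.0, 1.0)]:
    assert abs(weight_enum(Cd, x, y) - weight_enum(C, x + y, x - y) / len(C)) < 1e-9
```

Here $W_C(x,y) = x^3 + y^3$ and $W_{C^\perp}(x,y) = x^3 + 3xy^2$, and the substitution reproduces one from the other exactly.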

 

Identification under Random Processes
R. Ahlswede and V. B. Balakirsky
pp. 123–138

Abstract—Recently it has been understood how the common randomness of a sender and receivers increases the identification capacity. Here we study the effect of randomness of a more general structure, namely, a correlated source, where the sender has access to one marginal source and the receivers have access to the other. The typical phenomena are explained for binary symmetric correlated sources.