PROBLEMS OF INFORMATION TRANSMISSION
A translation of Problemy Peredachi Informatsii


Volume 38, Number 1, January–March, 2002

CONTENTS

 

Epsilon-entropy of an Ellipsoid in a Hamming Space
I. I. Dumer, M. S. Pinsker, and V. V. Prelov
pp. 1–15

Abstract—The asymptotic behavior of the $\varepsilon$-entropy of an ellipsoid in a Hamming space is investigated as the dimension of the space grows.
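
For reference, a standard covering formulation of the quantity studied (the normalization here is generic and may differ from the authors'): for a set $A$ in the Hamming space $\{0,1\}^n$,
$$\mathcal{H}_\varepsilon(A) = \log_2 N_\varepsilon(A),$$
where $N_\varepsilon(A)$ is the minimal number of Hamming balls of radius $\varepsilon$ whose union covers $A$. The paper studies the asymptotics of this quantity for ellipsoids as $n$ grows.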

 

New Approach to Estimation of the Decoding Error Probability
V. M. Blinovsky
pp. 16–19

Abstract—We suggest a new approach to constructing a lower bound on the decoding error probability in a discrete memoryless channel at zero transmission rate. This approach considerably simplifies the derivation of the bound, makes it more natural, and at the same time improves the remainder term in the estimate of the decoding error exponent.
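
For context, the classical zero-rate benchmark is the expurgated exponent of a discrete memoryless channel with transition probabilities $P(j\,|\,k)$ and input distribution $\mathbf{p} = (p_1, \dots, p_K)$ (this standard expression is given only for orientation; the paper's contribution concerns the lower bound on the error probability and its remainder term, not this leading term):
$$E_{\mathrm{ex}}(0) = \max_{\mathbf{p}} \left( -\sum_{k,l} p_k p_l \ln \sum_{j} \sqrt{P(j\,|\,k)\,P(j\,|\,l)} \right).$$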

 

An Efficient Generation Method for Uniformly Distributed Random Numbers
B. Ya. Ryabko and E. P. Matchikina
pp. 20–25

Abstract—We consider the problem of constructing efficient methods for generating uniformly distributed random numbers from nonuniformly distributed ones with a given, arbitrarily small error. The complexity of these methods is estimated as a function of the error, which is measured as the deviation of the generated numbers from the uniform distribution. Methods whose complexity is of a lower order than that of known methods are proposed.
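
The construction of the paper is not reproduced here. Purely as an illustration of the underlying task (producing uniform output from nonuniform randomness), the sketch below implements the classic von Neumann extractor, which yields exactly unbiased bits from i.i.d. biased bits at the cost of a reduced output rate; it is not the authors' method, whose point is a lower order of complexity for a prescribed deviation from uniformity.

    import random

    def von_neumann_extract(bits):
        # Classic von Neumann procedure: scan disjoint pairs of i.i.d. input bits;
        # output 0 for the pair (0, 1), output 1 for (1, 0), discard (0, 0) and (1, 1).
        # For i.i.d. inputs the two kept patterns are equally likely, so the output
        # bits are exactly uniform (baseline illustration only, not the paper's method).
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:
                out.append(a)
        return out

    # Usage: heavily biased coin flips in, uniform bits out (at a reduced rate).
    biased = [1 if random.random() < 0.8 else 0 for _ in range(10000)]
    uniform_bits = von_neumann_extract(biased)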

 

Compatible Information as a Natural Information Measure of a Quantum Channel
B. A. Grishanin
pp. 26–35

Abstract—As a natural quantitative characteristic of the mutual information contained in two compatible sets of quantum states considered as input and output, we introduce the Shannon amount of information corresponding to two independent general measurements of all possible quantum states of the input and output. We analyze the physical content of this information measure and its relation to other measures such as the Holevo information and coherent information. In the example of two two-level systems, the most important features of compatible information in the absence and in the presence of selection of the measured states are revealed and discussed.
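
Read schematically (a hedged paraphrase of the abstract in generic notation not taken from the paper): if the two independent measurements of input and output yield outcomes with joint probabilities $p_{ij}$ and marginals $p_i = \sum_j p_{ij}$, $q_j = \sum_i p_{ij}$, the quantity in question is the ordinary Shannon mutual information of that outcome distribution,
$$I = \sum_{i,j} p_{ij} \log \frac{p_{ij}}{p_i\, q_j},$$
which is then compared with measures such as the Holevo information and the coherent information.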

 

On the Relation between Cardinalities of 1-Orbits and 1-Information of Words
S. N. Tronin
pp. 36–40

Abstract—We correct the results of Chapter 5 of [Goppa, V.D., Introduction to Algebraic Information Theory, Moscow: Nauka, 1995], namely, Theorems 5.1 and 5.2. The proofs given in that book rely on an imprecisely formulated (and therefore effectively unproved) statement concerning a relation between words and Euler graphs. We give precise formulations and detailed proofs.

 

Encoder and Distance Properties of Woven Convolutional Codes with One Tailbiting Component Code
M. Handlery, R. Johannesson, and V. V. Zyablov
pp. 41–49

Abstract—Woven convolutional codes with one tailbiting component code are studied and their generator matrices are given. It is shown that, if the constituent encoders are identical, a woven convolutional encoder with an outer convolutional warp and one inner tailbiting encoder (WIT) generates the same code as a woven convolutional encoder with one outer tailbiting encoder and an inner convolutional warp (WOT). However, for tailbiting encoders of rate $R_{\mathrm{tb}} < 1$, the WOT cannot be an encoder realization with a minimum number of delay elements. Lower bounds on the free distance and active distances of woven convolutional codes with a tailbiting component code are given. These bounds coincide with those for woven codes consisting exclusively of unterminated convolutional codes; however, for woven convolutional codes with one tailbiting component code, the conditions under which the bounds hold are less strict.

 

New One-Generator Quasi-cyclic Codes over $\operatorname{\it GF}(7)$
R. Daskalov and P. Hristov
pp. 50–54

Abstract—Let $[n,k,d]_q$ codes be linear codes of length $n$, dimension $k$, and minimum Hamming distance $d$ over $\operatorname{\it GF}(q)$. Seventeen new one-generator quasi-cyclic codes over $\operatorname{\it GF}(7)$ that improve the known lower bounds on the minimum distance are constructed.

 

A Note on Systematic Tailbiting Encoders
P. Ståhl and R. Johannesson
pp. 55–64

Abstract—Tailbiting codes encoded by convolutional encoders are studied. An explanation is given for the fact that, at low signal-to-noise ratios, a systematic feedback encoder results in fewer decoding bit errors than a nonsystematic feedforward encoder for the same tailbiting code. The analysis is based on a recently introduced code property, namely, the weight density of distance-$d$ codewords. For a given distance-$d$ weight density, the decoding bit error probability depends on an encoder property, viz., the number of taps in the tap-minimal encoder pseudoinverse. Among all convolutional encoders that encode a given tailbiting code, the systematic one has the tap-minimal encoder pseudoinverse with fewest taps and, hence, gives the smallest bit error probability.

 

Reconstruction of Sparse Vectors in White Gaussian Noise
G. K. Golubev
pp. 65–80

Abstract—We consider the problem of reconstruction of a sparse vector observed against a background of white Gaussian noise. The sparsity is assumed to be unknown. Two approaches to statistical estimation in this case are discussed, namely, the model selection method and threshold estimators. We propose a method of selecting a threshold estimator based on the principle of empirical complexity minimization with minimal conservative penalization.
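
For orientation (generic notation, not necessarily the authors'): in the standard formulation one observes $y_i = \theta_i + \sigma \xi_i$, $i = 1, \dots, n$, with $\xi_i$ i.i.d. standard Gaussian noise and most coordinates $\theta_i$ equal to zero, and a threshold estimator has the form
$$\hat{\theta}_i(t) = y_i\, \mathbf{1}\{|y_i| > t\},$$
the classical conservative choice being $t = \sigma\sqrt{2\log n}$; the paper is concerned with a data-driven choice of the threshold when the sparsity is unknown.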

 

On Guaranteed Estimation of the Spectral Density of an Autoregression–Moving Average Process
V. V. Konev and D. V. Shapovalov
pp. 81–95

Abstract—An estimator of the spectral density of a stationary autoregression–moving average process that guarantees a given mean-square accuracy is proposed. In constructing the estimator, we use the sequential analysis approach, which involves a special choice of the observation termination instant depending on the required estimation accuracy. An asymptotic formula for the average number of observations is obtained.
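
For reference, the object being estimated has the standard form (generic ARMA$(p,q)$ notation, not necessarily the authors'): for a stationary process satisfying $X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \dots + \theta_q \varepsilon_{t-q}$ with white-noise variance $\sigma^2$, the spectral density is
$$f(\lambda) = \frac{\sigma^2}{2\pi}\, \frac{\bigl|1 + \theta_1 e^{-i\lambda} + \dots + \theta_q e^{-iq\lambda}\bigr|^2}{\bigl|1 - \phi_1 e^{-i\lambda} - \dots - \phi_p e^{-ip\lambda}\bigr|^2},$$
and the sequential procedure chooses a data-dependent observation termination instant so that the estimate of $f$ attains the prescribed mean-square accuracy.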

 

On the 100th Anniversary of the Birth of Andrei Nikolaevich Kolmogorov
pp. 96–97