PROBLEMS OF INFORMATION TRANSMISSION
A translation of Problemy Peredachi Informatsii


Volume 13, Number 2, April–June, 1977

CONTENTS

 

The Economic Effect of Source Coding and Channel Coding
H. Ohnsorge
pp. 83–88

Abstract—We analyze the economic effectiveness of error-correcting systems as a function of the error probability. We show that error-correcting systems for data transmission can be very effective from the economic point of view, despite their high equipment costs.

 

Bounds for the Codes Correcting Errors and Defects
B. S. Tsybakov
pp. 89–97

Abstract—We consider a memory with defective cells and random errors. To store data in such a memory, one can use codes that correct defects and errors. We construct upper and lower bounds on the probability of erroneous decoding for the optimal code.
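
As an orienting illustration (our own assumption about the usual setup, not taken from the paper), a defective cell is commonly modeled as stuck at a fixed value regardless of what is written into it, while a non-defective cell may still flip its contents with a small error probability. A minimal Python sketch of this assumed stuck-at-plus-errors model, with hypothetical parameters p_defect and p_error:

import random

def write_and_read(word, p_defect=0.1, p_error=0.01, seed=0):
    """Store a binary word in a memory whose cells may be defective
    (stuck at a fixed value, independent of the input) or may suffer
    independent random bit flips; return the word read back.
    Illustrative channel model only, not the paper's construction."""
    rng = random.Random(seed)
    out = []
    for bit in word:
        if rng.random() < p_defect:
            out.append(rng.randint(0, 1))      # stuck-at cell
        elif rng.random() < p_error:
            out.append(1 - bit)                # random error
        else:
            out.append(bit)                    # faithful cell
    return out

print(write_and_read([1, 0, 1, 1, 0, 0, 1, 0]))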

 

Carrying Capacity for Parallel Broadcast Channels with Degraded Components
G. Sh. Poltyrev
pp. 97–107

Abstract—Parallel broadcast channels (BC) with two receivers and independent components are examined. Each component is a degraded BC. Of particular interest is the case when the components are degraded in different directions, so that the parallel BC itself is not degraded. For a parallel BC with components degraded in different directions, and for the case when only individual data are transmitted to each of the receivers, the domain of carrying capacities is characterized. For the case when both individual and common data are transmitted, lower and upper bounds on the boundary of the carrying-capacity domain are constructed; these bounds coincide for symmetric channels.
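
For orientation, a standard fact about a single degraded component (not the paper's result): if $X\to Y_1\to Y_2$ is a degraded BC, its carrying-capacity domain is the set of rate pairs $(R_1,R_2)$ satisfying
$$R_1\le I(X;Y_1\mid U),\qquad R_2\le I(U;Y_2),$$
taken over auxiliary variables $U$ for which $U\to X\to Y_1\to Y_2$ forms a Markov chain. The question addressed here is how such regions combine when independent components are degraded in different directions, so that no single ordering of the receivers applies to the parallel channel as a whole.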

 

Lower Bound for Error Probability in Channels with Feedback
E. A. Haroutunian
pp. 107–114

Abstract—The greatest lower bound for the error probability is obtained in the case of discrete memoryless channels with feedback. For channels whose matrix of transition probabilities possesses a specific symmetry [R.G. Gallager, Information Theory and Reliable Communication, Wiley, New York (1968)], this lower bound coincides with the sphere-packing bound. List decoding is also considered.
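
For reference, in Gallager's notation (assumed here for concreteness; the abstract does not restate it), the sphere-packing bound asserts that $P_e\gtrsim\exp\{-nE_{\mathrm{sp}}(R)\}$ for block length $n$ and rate $R$, where
$$E_{\mathrm{sp}}(R)=\max_{\rho\ge 0}\Bigl[\max_{\mathbf q}E_0(\rho,\mathbf q)-\rho R\Bigr],\qquad
E_0(\rho,\mathbf q)=-\ln\sum_{j}\Bigl(\sum_{k}q_k P(j\mid k)^{1/(1+\rho)}\Bigr)^{1+\rho}.$$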

 

Optimal Recognition of Jump-Like Components of Markov Signals
N. S. Demin
pp. 114–121

Abstract—Formulas are obtained for statistics that are sufficient for nonrandomized decision rules recognizing the type of the jump-like components of a random Markov signal that has both continuous and jump-like components. Estimation of the signal components is also considered, as is the possibility of realizing suboptimal recognition and estimation procedures in specific problems.

 

On a Class of Estimates of the Location Parameters
R. Z. Khas'minskii and Yu. A. Sheverdyaev
pp. 121–126

Abstract—A class of estimates of a signal in additive noise with zero mean that have the form $\overline{X}-\overline{\Phi(X_i-\overline{X})}$ is studied. An asymptotic formula is obtained for the mean-square error of such estimates, and it is shown that an asymptotically optimal (in the minimax sense) estimate can always be obtained in such a form by appropriate choice of the function $\Phi$ if the noise distribution density $p(x)$ is not exactly known, and only finitely many integrals of the form $\int\varphi_j(x)p(x)\,dx$ are known.
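
Reading the overlines as sample means over observations $X_1,\dots,X_n$ (our interpretation of the notation, spelled out for clarity), these estimates take the form
$$\hat\theta=\overline X-\frac1n\sum_{i=1}^n\Phi\bigl(X_i-\overline X\bigr),\qquad \overline X=\frac1n\sum_{i=1}^n X_i,$$
i.e., the sample mean corrected by the empirical average of $\Phi$ applied to the centered observations.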

 

Calculation of Statistical Characteristics of an Adaptive Quantizer
V. S. Molodtsov, N. I. Pilipchuk, and V. P. Yakovlev
pp. 126–132

Abstract—A method is presented for calculating the principal characteristics (namely, the approximation error and the amount of discrete data) of an adaptive quantizer with an adjustable quantization interval in the stationary mode.
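
A minimal sketch of one common adaptation rule (multiplicative step adjustment), offered only to illustrate the kind of device meant; the paper's specific quantizer and its statistical analysis are not reproduced here, and the function name and parameters below are our own. Python:

def adaptive_quantize(samples, step=1.0, levels=8, grow=1.5, shrink=0.75):
    """Uniform quantizer whose interval is adjusted after each sample:
    the step grows when an outermost level is hit and shrinks otherwise,
    so the quantizer tracks the signal's scale.  Illustrative adaptation
    rule only (assumed, not taken from the paper)."""
    half = levels // 2
    codes, reconstructed = [], []
    for x in samples:
        # index of the quantization cell, clipped to the available levels
        idx = max(-half, min(half - 1, int(x // step)))
        codes.append(idx)
        reconstructed.append((idx + 0.5) * step)   # cell midpoint
        # adjust the quantization interval before the next sample
        step = step * grow if idx in (-half, half - 1) else step * shrink
    return codes, reconstructed

codes, rec = adaptive_quantize([0.1, 0.3, 2.5, -4.0, 0.2, 0.05])
print(codes, [round(r, 3) for r in rec])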

 

Potential Possibilities of Detection of Random Paths
K. Sh. Zigangirov
pp. 132–140

Abstract—The testing of two hypotheses in Gaussian noise is considered, namely, of the hypothesis $\mathbf H_0$ that there is no particle in the observation field, and of the hypothesis $\mathbf H_1$ of presence of a particle in the field, with the distribution of the paths of the particle being known. An upper bound is obtained for signal-to-noise ratios for which it is impossible in principle to test the hypotheses with arbitrarily small error probabilities, and a lower bound is obtained for signal-to-noise ratios for which this is possible.

 

Equivalence of Circuits with One Feedback
M. A. Roitberg
pp. 141–147

Abstract—The so-called circuit equivalence of automata is investigated. An algorithm is indicated for verifying the circuit equivalence of finite automata, and the complexity of the experiment establishing circuit equivalence is estimated.

 

On Connection Probability of a Random Subgraph of an $n$-Dimensional Cube
Yu. D. Burtin
pp. 147–152

Abstract—The following random procedure is considered. Each edge of an $n$-dimensional cube is removed with a specified probability, independently of the remaining edges. It is shown that if the edge-removal probability $q$ is fixed while the dimension $n$ of the cube tends to infinity, then the probability that the graph formed by the unremoved edges is connected tends to $1$ when $q<1/2$ and to $0$ when $q>1/2$. By virtue of the upper bound obtained in [M.V. Lomonosov and V.P. Polesskii, Probl. Peredachi Inf., 7, No. 4, 78–81 (1971)] for the connection probability of a random graph, this assertion proves that the graph of the $n$-dimensional cube is asymptotically optimal, in the sense of reliability, among graphs with the same numbers of vertices and edges.
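
The threshold is easy to observe empirically. A small Monte Carlo sketch (illustrative only; the dimension, trial count, and values of $q$ below are our own choices): build the $n$-cube on $2^n$ vertices, delete each edge independently with probability $q$, and check connectivity by breadth-first search.

import random
from collections import deque

def random_subcube_connected(n, q, rng):
    """Delete each edge of the n-cube independently with probability q
    and report whether the surviving graph is connected (BFS check)."""
    keep = {}
    for v in range(1 << n):
        for i in range(n):
            u = v ^ (1 << i)
            if v < u:                      # store each edge once
                keep[(v, u)] = rng.random() >= q
    seen, queue = {0}, deque([0])
    while queue:
        v = queue.popleft()
        for i in range(n):
            u = v ^ (1 << i)
            if u not in seen and keep[(min(v, u), max(v, u))]:
                seen.add(u)
                queue.append(u)
    return len(seen) == 1 << n

rng = random.Random(1)
for q in (0.3, 0.45, 0.55, 0.7):
    hits = sum(random_subcube_connected(10, q, rng) for _ in range(50))
    print(f"q={q}: connected in {hits}/50 trials")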

 

Symmetric Large Systems of Finite Automata
A. R. Rotenberg
pp. 152–164

Abstract—A definition is introduced for the convergence as $n\to\infty$ of symmetric systems consisting of a large number $n$ of stochastic finite automata, depending on the time intervals on which this convergence is examined. Sufficient conditions are derived for the existence, in a specified sense, of a complete hierarchy of intervals tending to infinity on which the behavior of the systems has a limit as $n\to\infty$. A class of systems is introduced in which the automata interact by forming random groups that behave independently of one another; for such systems, sufficient conditions for convergence are given, together with formulas for the limit systems.