PROBLEMS OF INFORMATION TRANSMISSION
A translation of Problemy Peredachi Informatsii
CONTENTS
Linear Transmission of Gaussian Messages over a Nonstationary Gaussian Channel with Feedback
I. A. Ovseevich and M. S. Pinsker
pp. 85–99
Abstract: Necessary and sufficient existence conditions are found for linear transmission of a Gaussian message over a discrete (in time) memoryless Gaussian channel with complete feedback, under the condition that the transmitted and reproduced messages have a specified covariance function. An algorithm for this transmission is given; a transmission algorithm that minimizes the mean-square error (MSE) is indicated, and the value of this error is written out.
Bounds on the Error Probability for Certain Ensembles of Random Codes
A. G. D'yachkov
pp. 99–108
Abstract: Two classes of random codes are considered, for which (unlike the standard situation in information theory) the code words are not independent. These codes arise in investigating Boolean and linear models of the design of screening experiments. Upper and lower bounds are obtained for the mean error probability over the ensemble of codes, for transmission over a memoryless channel and with maximum-likelihood decoding.
Pseudostochastic Coding for Error Detection
L. M. Fink and S. A. Mukhametshina
pp. 108–110
Abstract: It is shown that the random sequences used for universal coding in data transmission systems with feedback can be replaced by pseudorandom sequences.
Capacity for a Sum of Broadcast Channels
G. Sh. Poltyrev
pp. 111–114
Abstract: The article considers a discrete broadcast channel with two receivers, each constituent of which is a sum of some number of discrete memoryless channels (components). The components form broadcast channels that degrade in different directions. The capacity is determined for a broadcast channel of this type.
Coding of Sources on the Basis of Observations with Incomplete Information
S. I. Gel'fand and M. S. Pinsker
pp. 115–125
Abstract: The article considers the problem of recovering the messages of a source on the basis of the encoded messages of several sources that are correlated with it.
Correction of Error Bursts and Independent Errors Using Generalized Concatenated Codes
V. A. Zinov'ev and V. V. Zyablov
pp. 125–134
Abstract: It is shown that a cascade decoding algorithm that realizes its code distance makes it possible to simultaneously correct both error bursts and independent errors that occur in addition to bursts. Lower bounds are given for the guaranteed lengths of the bursts to be corrected and for the guaranteed number of independent errors that can be corrected.
Coding of a Source with Unknown but Ordered Probabilities
B. Ya. Ryabko
pp. 134–138
Abstract: The article deals with the problem of optimum coding of a source for whose symbols it is known only that they are arranged in decreasing order of probability. On the basis of the resultant code, a design for a universal retrieval system is proposed and a hypothesis that accounts for Zipf's law is advanced.
Weight Spectra of Some Classes of Binary Cyclic Codes
V. I. Tairyan
pp. 139–144
Abstract: Formulas are derived for the weight spectra and code distances of some classes of binary cyclic codes.
Efficiency of a Discrimination Algorithm for Orthogonal Signals with Unknown Parameters
A. P. Trifonov
pp. 144–152
Abstract: The probabilities of incorrect decisions in discriminating signals with unknown nonenergy parameters in Gaussian noise are determined. The efficiency loss resulting from lack of knowledge of the parameters is estimated. Results of experimental verification of the formulas are given.
Asymptotic Behavior of the Number of Types of Equivalent Binary Connection Matrices
Yu. L. Sagalovich
pp. 152–158
Abstract: An asymptotic formula is derived for the number of types of binary matrices that are equivalent relative to permutations of rows and columns and/or inversion of columns. The asymptotic behavior is also obtained for the case of nonsingular matrices, for which an exact formula is as yet unknown.
On Error Probability in Recognition Problems with a Random Vocabulary
V. S. Fain
pp. 159–160
Abstract: For recognition problems in which the set of objects to be recognized is not fixed in advance, but only their type is known (e.g., in the handwriting-analysis problem), it is not possible to specify a priori the value of the error probability inherent in the algorithm employed. On the basis of the analogy between such problems and the problem of transmission of information over a noisy channel with random coding, it is shown that it is possible to set up an upper bound for the mean error probability (over the sets) when the maximum-likelihood method of recognition is employed. A working formula for this bound is given for one particular case.
Vladimir Ivanovich Siforov's Seventy-Fifth Birthday
pp. 161–162