PROBLEMS OF INFORMATION TRANSMISSION
A translation of Problemy Peredachi Informatsii
Asymptotic Upper Bound for the Rate of $(w,r)$ Cover-Free Codes
V. S. Lebedev
pp. 317–323
Abstract: A binary code is called a $(w,r)$ cover-free code if it is the incidence matrix of a family of sets where the intersection of any $w$ of the sets is not covered by the union of any other $r$ sets. Such a family is called a $(w,r)$ cover-free family. We obtain a new recurrent inequality for the rate of $(w,r)$ cover-free codes, which improves previously known upper bounds on the rate.
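For readers who want to experiment with the definition, here is a minimal brute-force check of the $(w,r)$ cover-free property (an illustrative sketch of the definition only, not the paper's method; the function name and the convention that columns represent sets are ours):

```python
from itertools import combinations

def is_cover_free(matrix, w, r):
    """Brute-force check of the (w,r) cover-free property.

    Columns of the binary matrix are read as incidence vectors of a
    family of sets; rows index the ground-set elements.  The family is
    (w,r) cover-free if for every choice of w sets and r other sets,
    some element lies in all w of them but in none of the r.
    """
    rows = range(len(matrix))
    cols = range(len(matrix[0]))
    for W in combinations(cols, w):
        rest = [c for c in cols if c not in W]
        for R in combinations(rest, r):
            witness = any(
                all(matrix[i][c] for c in W) and not any(matrix[i][c] for c in R)
                for i in rows
            )
            if not witness:
                return False
    return True

# Example: the identity matrix is (1, r) cover-free for any r < n.
identity = [[int(i == j) for j in range(4)] for i in range(4)]
print(is_cover_free(identity, 1, 2))  # True
```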
Higher-Order Asymptotics of Mutual Information for Nonlinear Channels with Non-Gaussian Noise
V. V. Prelov and E. C. van der Meulen
pp. 324–340
Abstract: Nonlinear channels with non-Gaussian noise in which the transmitted signal is a random function of the input signal are considered. Under some assumptions on the smoothness and tail behavior of the noise density function, the higher-order asymptotics of the mutual information between the input and output signals of such channels is obtained as the mean power of the input signal (or, equivalently, the signal-to-noise ratio) tends to zero.
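As a point of reference for what "higher-order asymptotics" means here, the classical linear Gaussian channel admits a closed form whose low-power expansion is easy to inspect; the snippet below shows only this baseline special case (unit noise variance assumed), not the paper's general nonlinear, non-Gaussian setting:

```python
import math

# For the linear Gaussian baseline Y = X + Z, with Gaussian input of
# mean power P and unit-variance noise, the mutual information (nats)
# is I(P) = 0.5 * log(1 + P), whose expansion at P -> 0 is
# I(P) = P/2 - P**2/4 + O(P**3): the first- and higher-order terms.
for P in (1e-1, 1e-2, 1e-3):
    exact = 0.5 * math.log1p(P)
    second_order = P / 2 - P**2 / 4
    print(f"P={P:7.0e}  exact={exact:.8f}  2nd-order={second_order:.8f}")
```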
On the Ranks and Kernels Problem for Perfect Codes
S. V. Avgustinovich, F. I. Solov'eva, and O. Heden
pp. 341–345
Abstract: A construction is proposed which, for $n$ large enough, allows one to build perfect binary codes of length $n$ and rank $r$ with a kernel of dimension $k$, for any admissible pair $(r,k)$ within the limits of the known bounds.
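The two invariants in question are easy to compute for small codes. Below is a small illustrative computation for the classical linear perfect code, the [7,4] Hamming code (being linear, it has rank 4 and kernel dimension 4; the paper's construction targets the other admissible pairs $(r,k)$, which require nonlinear codes):

```python
from itertools import product
import math

def gf2_rank(words):
    """Rank over GF(2), via an XOR basis on integer bit-patterns."""
    basis = []
    for w in words:
        x = int("".join(map(str, w)), 2)
        for b in basis:
            x = min(x, x ^ b)
        if x:
            basis.append(x)
    return len(basis)

def kernel(code):
    """Kernel of a binary code C containing 0: {x : x + C = C}.
    Since 0 is a codeword, the kernel is a subset of C, so it
    suffices to search within the code itself."""
    cs = set(code)
    return [x for x in cs
            if all(tuple(a ^ b for a, b in zip(x, c)) in cs for c in cs)]

# Generator matrix of the [7,4] Hamming code; the code is the GF(2)
# span of its rows.
G = [(1,0,0,0,0,1,1),
     (0,1,0,0,1,0,1),
     (0,0,1,0,1,1,0),
     (0,0,0,1,1,1,1)]
code = {tuple(sum(c * row[j] for c, row in zip(coef, G)) % 2
              for j in range(7))
        for coef in product((0, 1), repeat=4)}

print("rank =", gf2_rank(code))                           # 4
print("kernel dim =", int(math.log2(len(kernel(code)))))  # 4
```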
Enumeration of Optimal Ternary Constant-Composition Codes
G. T. Bogdanova and S. N. Kapralov
pp. 346–351
Abstract: The problem of classifying optimal ternary constant-composition codes is considered. Using a combined combinatorial and computer method, the number of inequivalent codes is determined for $3 \le d \le n \le 10$.
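A hedged sketch of the two defining constraints being classified, constant composition and minimum distance (the equivalence testing needed for the actual enumeration, under coordinate and symbol permutations, is beyond this checker):

```python
from itertools import combinations
from collections import Counter

def is_constant_composition(code, d):
    """Check that all ternary codewords share one composition (the
    same count of each symbol) and that the minimum pairwise Hamming
    distance is at least d."""
    compositions = {tuple(sorted(Counter(w).items())) for w in code}
    if len(compositions) != 1:
        return False
    return all(sum(a != b for a, b in zip(u, v)) >= d
               for u, v in combinations(code, 2))

# Toy example: length 4, composition (two 0s, one 1, one 2),
# pairwise distance >= 3.
code = ["0012", "1200", "0120"]
print(is_constant_composition(code, 3))  # True
```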
On Some Singularities in Parameter Estimation Problems
S. Amari and M. V. Burnashev
pp. 352–372
Abstract: Assume that there are two identical remote objects, and we want to estimate the distance to the closer of them and the distance between the objects. For this purpose, a device such as a radar is used, and the observed signal is the sum of the signals reflected by the two objects, corrupted by white Gaussian noise. A singularity (i.e., very poor estimation accuracy) arises in this problem when the objects are very close to each other. Our aim is to demonstrate this inevitable singularity using the maximum likelihood estimate as an example, and to show that it occurs for any other estimate as well.
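One standard way to see this singularity numerically is through the Fisher information matrix for the two delays, which degenerates as the separation shrinks. The sketch below assumes an illustrative Gaussian pulse shape with known equal amplitudes; this toy parametrization is ours, not the paper's model:

```python
import numpy as np

def fisher_det(delta, sigma=1.0, n=2001, span=10.0):
    """Determinant of the 2x2 Fisher information matrix for the delays
    (tau1, tau2) of two identical Gaussian pulses observed in white
    Gaussian noise, as a function of the separation delta = tau2 - tau1."""
    t = np.linspace(-span, span, n)
    dt = t[1] - t[0]
    def dpulse(tau):  # d/d(tau) of the pulse exp(-(t - tau)**2 / 2)
        return (t - tau) * np.exp(-(t - tau) ** 2 / 2)
    grads = np.stack([dpulse(-delta / 2), dpulse(delta / 2)])
    J = grads @ grads.T * dt / sigma ** 2
    return np.linalg.det(J)

# As the objects merge, the two gradient directions become collinear,
# so det J -> 0 and the Cramer-Rao bound blows up.
for delta in (1.0, 0.3, 0.1, 0.03):
    print(f"separation {delta:5.2f}:  det J = {fisher_det(delta):.3e}")
```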
Large Deviations in Quantum Information Theory
R. Ahlswede and V. M. Blinovsky
pp. 373–379
Abstract: We obtain asymptotic estimates for the probabilities of events of special types which are useful in quantum information theory, especially for identification via noisy channels.
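The quantum setting does not fit in a short snippet, but the classical prototype of such estimates is the Chernoff-type exponential decay $\Pr\{S_n \ge an\} \approx e^{-nD(a\|p)}$ for i.i.d. coin flips. The sketch below (a classical analogue only, not the paper's quantum result) shows the empirical exponent approaching the divergence:

```python
import math
from math import comb

def kl(a, p):
    """Binary KL divergence D(a || p) in nats."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

# For n fair coin flips, P(S_n >= a*n) decays like exp(-n * D(a || 1/2)),
# so -log(P)/n should approach D(a || p) as n grows.
p, a = 0.5, 0.7
for n in (50, 200, 800):
    k0 = math.ceil(a * n)
    tail = sum(comb(n, k) for k in range(k0, n + 1)) / 2 ** n
    print(f"n={n:4d}  -log(P)/n = {-math.log(tail)/n:.4f}  "
          f"D(a||p) = {kl(a, p):.4f}")
```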
Extremal Relations between Additive Loss Functions and the Kolmogorov Complexity
V. V. V'yugin and V. P. Maslov
pp. 380–394
Abstract: Conditions are presented under which the maximum of the Kolmogorov complexity (algorithmic entropy) $K(\omega_1\ldots\omega_N)$ is attained, given the cost $\sum_{i=1}^N f(\omega_i)$ of a message $\omega_1\ldots\omega_N$. Various extremal relations between the message cost and the Kolmogorov complexity are also considered; in particular, the minimization problem for the function $\sum_{i=1}^N f(\omega_i) - \theta K(\omega_1\ldots\omega_N)$ is studied. Here, $\theta$ is a parameter, called the temperature by analogy with thermodynamics. We also study domains of small variation of this function.
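The finite-alphabet combinatorial counterpart of this extremal problem is the familiar Gibbs maximizer: among letter frequencies with a given mean cost, entropy (and hence, to first order, the complexity of typical messages) is maximized by $p(\omega) \propto e^{-\beta f(\omega)}$ with $\beta = 1/\theta$. A small numerical sketch with illustrative costs of our choosing:

```python
import math

def gibbs(costs, beta):
    """Gibbs distribution p(w) ~ exp(-beta * f(w)) over a finite alphabet;
    beta plays the role of the inverse temperature 1/theta."""
    weights = [math.exp(-beta * f) for f in costs]
    z = sum(weights)
    return [w / z for w in weights]

def entropy(p):
    """Shannon entropy in nats."""
    return -sum(q * math.log(q) for q in p if q > 0)

# Illustrative per-letter costs f(w) over a 3-letter alphabet: sweeping
# beta traces out the achievable (mean cost, max entropy) pairs.
costs = [0.0, 1.0, 3.0]
for beta in (0.5, 1.0, 2.0):
    p = gibbs(costs, beta)
    mean_cost = sum(q * f for q, f in zip(p, costs))
    print(f"beta={beta:.1f}  mean cost={mean_cost:.3f}  "
          f"entropy={entropy(p):.3f}")
```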
Systems of Strings with High Mutual Complexity
M. V. Vyugin
pp. 395–399
Abstract: Consider a binary string $x_0$ of Kolmogorov complexity $K(x_0)\ge n$. The question is whether there exist two strings $x_1$ and $x_2$ such that the approximate equalities $K(x_i\,|\,x_j)\approx n$ and $K(x_i\,|\,x_j,x_k)\approx n$ hold for all pairwise distinct indices $0\le i,j,k\le 2$. We prove that the answer is positive if the equalities are required to hold up to an additive term $O(\log K(x_0))$, and negative for the better accuracy $O(\log n)$.
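Kolmogorov complexity is uncomputable, so no program can verify the theorem itself; the snippet below only illustrates why the naive XOR construction fails the triple condition and why the question is therefore nontrivial: pairwise, each string looks random given one other, yet each is a trivial function of the other two:

```python
import secrets

n = 16  # illustrative length in bytes
x0 = secrets.token_bytes(n)
x1 = secrets.token_bytes(n)
x2 = bytes(a ^ b for a, b in zip(x0, x1))  # x2 = x0 XOR x1

# Each pair leaves the third string fully undetermined, so the pairwise
# conditions K(x_i | x_j) ~ n plausibly hold; but the triple condition
# K(x_i | x_j, x_k) ~ n fails badly, since any one string is recovered
# from the other two by a constant-size program.
recovered_x0 = bytes(a ^ b for a, b in zip(x1, x2))
print(recovered_x0 == x0)  # True: K(x0 | x1, x2) = O(1), not ~ n
```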
INDEX
pp. 400–403