BOUNDARY VALUES OF ENTROPY WITH CONSTANT DURATION OF CODE WORDS

Authors

  • Н В Захарченко
  • А В Кочетков
  • А А Русаловская
  • Д O Шпак
  • В В Гордейчук

Abstract

The paper analyzes the information parameters of positional and timer codes, as well as methods for increasing the information content (entropy) of code words at a constant duration of the realization interval (m = const Nyquist elements), and determines the boundary values of entropy for timer signals. It is shown that the information capacity of a single Nyquist element under timer coding is tens of times greater than under positional coding, and that the number of realizations obtained when synthesizing timer signal constructions (TSC) is much larger than the number of positional code realizations. It is also shown that for TSC, as the realization interval m increases, the total entropy and the information capacity per Nyquist element increase, the coefficient of the relative rate of decrease of the total information capacity becomes smaller, and the total power of the ensembles grows.
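To make the comparison concrete, the following minimal Python sketch (not from the paper) shows how the entropy of an ensemble of equiprobable code words and the information capacity per Nyquist element are computed for a fixed realization interval of m Nyquist elements. The TSC ensemble size n_tsc below is a placeholder assumption; the paper's actual counting formula for timer signal constructions is not reproduced here.

```python
from math import log2


def entropy_bits(n_realizations: int) -> float:
    """Entropy of an ensemble of n equiprobable code words: H = log2(n) bits."""
    return log2(n_realizations)


def capacity_per_nyquist(n_realizations: int, m: int) -> float:
    """Information capacity carried by one Nyquist element of the interval."""
    return entropy_bits(n_realizations) / m


if __name__ == "__main__":
    m = 8                  # realization interval, in Nyquist elements (m = const)
    n_positional = 2 ** m  # binary positional code: 2^m distinct words

    # Hypothetical TSC ensemble size on the same interval (assumption for
    # illustration only; the real count depends on the synthesis parameters).
    n_tsc = 10 ** 6

    print(f"positional: H = {entropy_bits(n_positional):.2f} bits, "
          f"per Nyquist element = {capacity_per_nyquist(n_positional, m):.2f}")
    print(f"TSC (assumed ensemble): H = {entropy_bits(n_tsc):.2f} bits, "
          f"per Nyquist element = {capacity_per_nyquist(n_tsc, m):.2f}")
```

With these assumed numbers, the positional code yields 1 bit per Nyquist element, while the larger (assumed) TSC ensemble yields a proportionally higher capacity per element, which is the kind of comparison the abstract describes.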

Section

Radio Engineering and Telecommunications