Coding Theorems of Information Theory

Coding Theorems of Information Theory. By J. WOLFOWITZ. Ergebnisse der Mathematik und ihrer Grenzgebiete, N.F., Heft 31. Springer-Verlag, Berlin-Göttingen-Heidelberg; Prentice-Hall, Englewood Cliffs, N.J., 1961. ix + 125 pp. $9.35.

Fairly recently, two joint authors made a distinction between "information theory in the strict sense" and "information theory in the wide sense." In the first category they included that body of research which has its direct origins in the 1947-8 paper of C. E. Shannon. I do not recall precisely their definition of the second category; presumably it consisted of the complement of the first in whatever the reader chooses to embrace within the term "information theory," unqualified. The book under review lends itself admirably, and, in the reviewer's opinion, commendably, to this classification scheme; it is strictly strict-sense information theory.

This book has been written (to quote the preface) to provide, for mathematicians of some maturity, an easy introduction to the ideas and principal known theorems of a certain body of coding theory. The first comment to be made is that the word "introduction" is overly modest; a reader who absorbs the full content of each of the 124 pages will be in possession of at least 70%, say, of the known results on coding theorems. Second, the requirement of mathematical maturity should by no means be misunderstood to mean a very extensive knowledge of a number of mathematical disciplines. A reader capable of working through a rigorous course in advanced calculus, say, will need only a modest knowledge of elementary probability theory in order to read most of the book and comprehend all of its essential ideas. He may, of course, feel that what he is reading is not quite an easy introduction; however, his persistence will quickly bring substantial rewards.

Let us now examine the contents in some detail. Chapter 1 contains a succinct but adequate heuristic introduction to the discrete memoryless channel. Chapter 2 (Combinatorial Preliminaries) discusses the properties of generated sequences, which were introduced by the author several years ago and which play a leading role in many of the proofs. The results presented are polished versions of basic facts which have been used previously in the author's publications. Standard properties of the entropy function are also presented. Chapter 3 discusses the discrete memoryless channel. The coding theorem and strong converse are proved, as well as a sharper form of the converse for the binary symmetric channel.
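
The entropy function and the binary symmetric channel just mentioned admit a very short computational illustration, recorded here only for orientation; it is not taken from the book, and the function names below are purely illustrative. The binary entropy function is $h(p) = -p\log_2 p - (1-p)\log_2(1-p)$, and the capacity of the binary symmetric channel with crossover probability $p$ is the well-known $1 - h(p)$ bits per channel use.

```python
import numpy as np

def binary_entropy(p):
    """Entropy h(p) of a Bernoulli(p) variable, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# A channel that flips each bit with probability 0.11 has capacity of about 0.5 bits per use.
print(bsc_capacity(0.11))
```
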
Also included in this chapter is the finite-state channel, but defined somewhat differently from what the reviewer had considered customary. In the author's definition, to each state of the channel there corresponds a channel probability function (cpf), and the state of the system, at any instant, is a deterministic function of the preceding state and the most recently transmitted symbol.
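
Read literally, this definition amounts to the following mechanism, sketched here only for concreteness; the alphabets, the transition matrices, and the state-update rule below are invented for the purpose of illustration and are not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# One cpf per state: cpf[s][x][y] = P(output y | input x, current state s).
# Two states and a binary alphabet; the numbers are purely illustrative.
cpf = {
    0: np.array([[0.9, 0.1], [0.1, 0.9]]),  # state 0: a mildly noisy symmetric channel
    1: np.array([[0.6, 0.4], [0.4, 0.6]]),  # state 1: a noisier one
}

def next_state(state, x):
    """Deterministic state update: the new state depends only on the preceding
    state and the most recently transmitted symbol (an arbitrary illustrative rule)."""
    return (state + x) % 2

def transmit(word, state=0):
    """Send a word through the channel; return the received word and the final state."""
    received = []
    for x in word:
        y = rng.choice(2, p=cpf[state][x])  # output drawn from the current state's cpf
        received.append(int(y))
        state = next_state(state, x)        # the state then evolves deterministically
    return received, state

print(transmit([0, 1, 1, 0, 1]))
```
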

The more usual definition, which leads to a channel with memory, is considered in Chapter 6. Chapter 4 is devoted to compound channels. By compound channels is meant channels whose behavior may be governed by many different cpf's. First, systems in which the cpf remains fixed during the transmission of a word are considered, in the cases where neither transmitter nor receiver knows which cpf governs the system, or when one or both do. In all cases, the coding theorem and strong converse are proved. Then, systems in which the cpf is stochastically determined are considered; here, in addition to studying the aforementioned cases, the author also considers the result of the transmitter knowing the cpf just at the moment of transmission, and not in advance. Coding theorems and converses are proven in all cases, the converse being strong in all cases but one, for which the strong converse was obtained (independently by H. Kesten and J. L. C. Kemperman) too late for inclusion. Finally, by proving the weak converse, the use of feedback is shown to effect no increase in the capacity of a discrete memoryless channel; again, the strong converse was obtained (by Kesten and Kemperman) too late for inclusion. We should remark, for those who are in the habit of giving an author's casual remark greater weight than their own considered judgement, that the assignment of credit to Dobrusin is, at this point, more generous than accurate, as the context of Dobrusin's result is substantially different from the author's.

Chapter 5 considers the discrete finite-memory channel; the coding theorem and strong converse are proved. Chapter 6 discusses channels "with a past history," i.e., with various types of memory. For channels of this sort, of course, every probabilistic statement concerning the future behavior of the channel must be conditioned by some assumption concerning the past behavior of the channel. After some discussion of this point, several examples are considered, not with an eye towards maximum generality, but rather to illustrate how earlier methods, suitably modified, can be of considerable value in these more involved cases. Let us mention here that we have been informed by the author that he has been able to establish the strong converse for the finite-state channel of Section 6.6. Chapter 7, General Discrete Channels, discusses other general techniques for proving coding theorems: that of the reviewer, as polished and extended by Blackwell, Breiman, and Thomasian as well as the author, and Shannon's random coding method. Fano's weak converse is also presented, and then a brief discussion to clarify the difference between a strong and weak converse.
It is at this point that the reviewer disagrees with the author's assertion that one ought not to speak of capacity without having proved a strong converse. If only for the fact that the term "capacity" antedates "strong converse" by about a decade, the preceding dictum would appear arbitrary. But in fact, a coding theorem and corresponding weak converse uniquely define a number which it is reasonable (and traditional) to call capacity. Furthermore, the weak converse says that that technique (block coding) which was successful, below capacity, in reducing the probability of error to any positive level, fails above capacity. The strong converse simply adds that the failure is, in the limit of large block length, as bad as it can be. Granted that the strong converse permits the capacity to be defined as a limit rather than as a limit of a limit supremum, the reviewer does not feel that one is unjustified in speaking of a capacity in its absence.
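
To pin the distinction down, let $N(n, \lambda)$ denote the maximum number of codewords of block length $n$ attainable with maximal probability of error at most $\lambda$; the notation is a standard one, though not necessarily that of the book. The three statements at issue may then be recorded, roughly, as

\[
\begin{aligned}
\text{coding theorem:} \quad & \text{for every } \lambda \in (0,1) \text{ and } \epsilon > 0,\; N(n,\lambda) \ge e^{n(C-\epsilon)} \text{ for all sufficiently large } n;\\
\text{weak converse:} \quad & \text{for every } \epsilon > 0 \text{ there is a } \lambda_0 > 0 \text{ such that } N(n,\lambda) \le e^{n(C+\epsilon)} \text{ for all sufficiently large } n \text{ whenever } \lambda \le \lambda_0;\\
\text{strong converse:} \quad & \text{for every } \lambda \in (0,1) \text{ and } \epsilon > 0,\; N(n,\lambda) \le e^{n(C+\epsilon)} \text{ for all sufficiently large } n.
\end{aligned}
\]

With only the first two statements one already has $C = \lim_{\lambda \downarrow 0} \limsup_{n \to \infty} \frac{1}{n} \log N(n,\lambda)$, the "limit of a limit supremum" referred to above; the strong converse upgrades this to $C = \lim_{n \to \infty} \frac{1}{n} \log N(n,\lambda)$ for every fixed $\lambda \in (0,1)$.
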

We should mention that there are examples (due to the author!) where the strong converse does not obtain, although the weak converse does.

Chapter 8 deals with the semicontinuous memoryless channel, proving the coding theorem, the strong converse, and a sharper version of the latter, due in this instance to Kemperman. Chapter 9 considers a fully continuous channel, i.e., one in which there is a continuum both of input and output symbols. The specific channel which is discussed has as input alphabet the unit interval and output alphabet the real line. The noise appears as an additive Gaussian random variable; the time variable is discrete. The coding theorem and strong converse are proven in four cases: no restriction on the input sequences; $\sum_{i=1}^{n} x_i^2 \le n$, where $(x_1, \dots, x_n)$ is a typical input sequence; $n(1-\delta) \le \sum_{i=1}^{n} x_i^2 \le n$; and finally, $\sum_{i=1}^{n} x_i^2 = n$. The proofs are carried out first by the author's methods, and then by an adaptation of Shannon's techniques. Chapter 10, entitled Mathematical Miscellanea, contains a proof, following Thomasian, of the asymptotic equipartition property, and of the admissibility of an ergodic input for a discrete finite-memory channel, although neither is used elsewhere.
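
The channel of Chapter 9 is easy to mimic numerically. The sketch below is not from the book: the noise variance, the block length, and the value of $\delta$ are arbitrary choices, made only to show the channel model and the four input constraints side by side.

```python
import numpy as np

rng = np.random.default_rng(1)

def send(x, sigma=0.5):
    """Discrete-time additive Gaussian channel: each received letter is the
    transmitted letter plus independent N(0, sigma^2) noise (sigma is arbitrary here)."""
    return x + rng.normal(0.0, sigma, size=len(x))

def power_case(x, delta=0.1):
    """Classify which of the four input constraints the word x = (x_1, ..., x_n) satisfies."""
    s = float(np.sum(x ** 2))
    n = len(x)
    if np.isclose(s, n):
        return "sum x_i^2 = n"
    if n * (1 - delta) <= s <= n:
        return "n(1 - delta) <= sum x_i^2 <= n"
    if s <= n:
        return "sum x_i^2 <= n"
    return "no restriction only"

x = rng.uniform(0.0, 1.0, size=8)  # letters drawn from the unit interval, as in the text
print(power_case(x))
print(send(x))
```
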

University of Illinois, Urbana, Illinois