This book provides a basic introduction to both information theory and data compression. Although the two topics are related, this unique treatment allows readers to explore either topic independently. The authors' presentation of information theory is pitched at an elementary level, making the book less daunting than most other texts. The second edition includes a detailed history of information theory that provides a solid background for the quantification of the topic as developed by Claude Shannon. It also covers the information rate of a code and the trade-off between error correction and rate of information transmission, probabilistic finite state source automata, and wavelet methods.
"synopsis" may belong to another edition of this title.
"Statisticians, applied mathematicians, engineers, and computer scientists will find this well-written book useful." -Journal of Statistical Computation and Simulation
This book covers the basics of information theory and data compression in two separate but related sections. The information theory section provides a logical foundation for the subsequent presentation of data compression from a practical, engineering perspective, explaining each technique and its application. This section covers lossless compression techniques and introduces lossy compression along with general procedures for using various transforms.
"About this title" may belong to another edition of this title.
Seller: Coffee Cat Books, Chapel Hill, NC, U.S.A.
Hardcover. Condition: Good. First Edition, 1997. Chapman and Hall/CRC. Text with scattered markings, underlining & formula notations throughout (some pages heavy). Name stamp of prior owner to endpapers (provenance copy, from the personal collection of a former Computer, Engineering & Electrical Dept. professor). Cover clean & bright, a bit of edge and shelf wear. Binding solid & square. 332 pp. Illustrated. Dual-topic textbook covering information theory (entropy, channel capacity, Shannon's theorems, coding theory, error correction) and data compression (lossless methods: Huffman, arithmetic coding, Lempel-Ziv, adaptive methods, dictionary methods; lossy methods: JPEG, transform methods, wavelets). Topics can be studied independently. Includes elementary probability, Bernoulli trials, prefix-condition codes, the Kraft-McMillan inequality, maximum likelihood decoding, probabilistic finite state automata, and the Gallager/Knuth algorithms. Ships quickly and with care. Seller Inventory # 0Gsamwr020526G13
Seller: BennettBooksLtd, Los Angeles, CA, U.S.A.
Hardcover. Condition: New. In shrink wrap. Looks like an interesting title! Seller Inventory # Q-0849339855