Entropy and Information Theory

Robert M. Gray (Author)

Book | Hardcover
409 pages
2011 | 2nd ed. 2011
Springer-Verlag New York Inc.
978-1-4419-7969-8 (ISBN)

181.89 incl. VAT
This fully updated new edition of the classic work on information theory presents a detailed analysis of the Shannon source- and channel-coding theorems before moving on to sources, channels, codes, and the properties of information and distortion measures.
This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:

Expanded treatment of stationary or sliding-block codes and their relation to traditional block codes
Expanded discussion of results from ergodic theory relevant to information theory
Expanded treatment of B-processes: processes formed by stationary coding of memoryless sources
New material on trading off information and distortion, including the Marton inequality
New material on the properties of optimal and asymptotically optimal source codes
New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and the Institute for Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.

Preface.- Introduction.- Information Sources.- Pair Processes: Channels, Codes, and Couplings.- Entropy.- The Entropy Ergodic Theorem.- Distortion and Approximation.- Distortion and Entropy.- Relative Entropy.- Information Rates.- Distortion vs. Rate.- Relative Entropy Rates.- Ergodic Theorems for Densities.- Source Coding Theorems.- Coding for Noisy Channels.- Bibliography.- References.- Index

Publication date (per publisher) 3 Feb 2011
Additional info XXVII, 409 p.
Place of publication New York, NY
Language English
Dimensions 155 x 235 mm
Subject area Mathematics / Computer Science → Computer Science → Theory / Studies
Mathematics / Computer Science → Mathematics
Engineering → Electrical Engineering / Power Engineering
Engineering → Communications Engineering
ISBN-10 1-4419-7969-7 / 1441979697
ISBN-13 978-1-4419-7969-8 / 9781441979698
Condition New