Information-Theoretic Methods in Data Science

  • £44.99



Cambridge University Press, 4/8/2021
EAN 9781108427135, ISBN10: 1108427138

Hardcover, 560 pages, 24.9 x 17.5 x 3.3 cm
Language: English
Originally published in English

Learn about the state-of-the-art at the interface between information theory and data science with this first unified treatment of the subject. Written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, it shows how information-theoretic methods are being used in data acquisition, data representation, data analysis, and statistics and machine learning. Coverage is broad, with chapters on signal acquisition, data compression, compressive sensing, data communication, representation learning, emerging topics in statistics, and much more. Each chapter includes a topic overview, definition of the key problems, emerging and open problems, and an extensive reference list, allowing readers to develop in-depth knowledge and understanding. Providing a thorough survey of the current research area and cutting-edge trends, this is essential reading for graduate students and researchers working in information theory, signal processing, machine learning, and statistics.

1. Introduction Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar
2. An information-theoretic approach to analog-to-digital compression Alon Kipnis, Yonina Eldar and Andrea Goldsmith
3. Compressed sensing via compression codes Shirin Jalali and H. Vincent Poor
4. Information-theoretic bounds on sketching Mert Pilanci
5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data Zahra Shakeri, Anand Sarwate and Waheed Bajwa
6. Uncertainty relations and sparse signal recovery Erwin Riegler and Helmut Bölcskei
7. Understanding phase transitions via mutual information and MMSE Galen Reeves and Henry Pfister
8. Computing choice: learning distributions over permutations Devavrat Shah
9. Universal clustering Ravi Kiran Raman and Lav Varshney
10. Information-theoretic stability and generalization Maxim Raginsky, Alexander Rakhlin and Aolin Xu
11. Information bottleneck and representation learning Pablo Piantanida and Leonardo Rey Vega
12. Fundamental limits in model selection for modern data analysis Jie Ding, Yuhong Yang and Vahid Tarokh
13. Statistical problems with planted structures: information-theoretic and computational limits Yihong Wu and Jiaming Xu
14. Distributed statistical inference with compressed data Wenwen Zhao and Lifeng Lai
15. Network functional compression Soheil Feizi and Muriel Médard
16. An introductory guide to Fano's inequality with applications in statistical estimation Jonathan Scarlett and Volkan Cevher.