Concepts in geometry often have parallels in information theory: volume and entropy, surface area and Fisher information, sphere-packing and channel coding, and Euclidean balls and Gaussian distributions, to name a few. These parallels provide a simple way to posit theorems in one area by translating the corresponding theorems in the other. However, the analogy does not extend fully, and the proof techniques often do not carry over without substantial modification. In this talk, I will try to bridge this gap by interpreting information-theoretic problems through the lens of high-dimensional geometry. This approach makes it possible to create new mathematical tools in information theory using existing tools in geometry. I will focus on two applications of these tools: analyzing the Shannon capacity of energy-harvesting channels, and obtaining a generalization of differential entropy for log-concave distributions. I will also describe some open problems and conjectures related to this line of work.
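As one standard illustration of the volume-entropy parallel (the talk may develop different examples), the Brunn-Minkowski inequality for volumes of Minkowski sums has the entropy power inequality as its information-theoretic counterpart:

```latex
% Brunn-Minkowski: for compact sets A, B in R^n, with Minkowski sum
% A + B = {a + b : a in A, b in B},
%   vol(A + B)^{1/n} >= vol(A)^{1/n} + vol(B)^{1/n}.
% Entropy power inequality (Shannon): for independent random vectors
% X, Y in R^n with differential entropies h(X), h(Y),
%   N(X + Y) >= N(X) + N(Y),
% where N(X) = (2*pi*e)^{-1} exp(2 h(X)/n) is the entropy power of X.
\[
  \operatorname{vol}(A+B)^{1/n} \;\ge\; \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}
  \qquad\longleftrightarrow\qquad
  N(X+Y) \;\ge\; N(X) + N(Y),
\]
\[
  \text{where } N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}.
\]
```

In both inequalities, Euclidean balls and Gaussian distributions play the role of extremizers, which is one reason they correspond to each other in the analogy.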
Discovery Building, Orchard View Room
Varun Jog