Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
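As a small taste of the first topic, the entropy of a discrete random variable can be computed in a few lines. This is just an illustrative sketch, not material from the course itself:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing (0 * log 0 is taken as 0).
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information:
print(entropy([0.5, 0.5]))        # 1.0
# A biased coin is more predictable, so each toss carries less:
print(entropy([0.9, 0.1]))        # about 0.469 bits
# A certain outcome carries no information at all:
print(entropy([1.0]))             # 0.0
```

The source coding theorem says, roughly, that H(X) bits per symbol is the best achievable rate for losslessly compressing a stream of independent draws of X.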
This will be an informal course. All are welcome to attend. The level of presentation is intended to be appropriate for graduate students and final year undergraduates.
For more information, click here: A Short Course in Information Theory