What you will learn
Concept of Information
Entropy and Mutual Information
Communication Channels and Channel Capacity
Concept of Data Compression
Limits of Data Compression
Today’s communication technology can be seen as the fruit of Shannon’s work published in 1948. The paper A Mathematical Theory of Communication, in which Shannon defined entropy, mutual information, and channel capacity, was a milestone in communication science. In this lecture, we cover the fundamental topics of information theory. Students of this course should have a background in probability and random variables; without that knowledge, it is not possible to grasp the fundamentals of the course.
In the course, we first explain the meaning of information and how to measure it. We introduce the concept of entropy and solve various examples that clarify its meaning. Next, we explain mutual information and define channel capacity, and we calculate the capacities of some discrete channels. We then define the capacity of the AWGN channel and derive the channel capacity expression. We also discuss typical sequences, explain the philosophy behind data compression, and present the limits of data compression. This course can be taken by anyone interested in the fundamentals of communication theory, and it is especially useful for those working in the communication/telecommunication area.
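As a small taste of the quantities treated in the course, the sketch below (not part of the course materials; function names are illustrative) computes Shannon entropy, the capacity of a binary symmetric channel, and the AWGN channel capacity from their standard formulas:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits.
    Terms with p = 0 contribute nothing by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use."""
    return 1.0 - entropy([p, 1.0 - p])

def awgn_capacity(snr):
    """Capacity of the AWGN channel: C = (1/2) * log2(1 + SNR)
    bits per channel use (SNR as a linear ratio, not dB)."""
    return 0.5 * math.log2(1.0 + snr)

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))      # 1.0
# A noiseless BSC (p = 0) attains the full 1 bit per use.
print(bsc_capacity(0.0))
# At unit SNR, the AWGN capacity is 0.5 bit per use.
print(awgn_capacity(1.0))
```

Running these few lines alongside the lectures is a convenient way to check hand-calculated entropy and capacity values.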