Course: Information Theory

Course title Information Theory
Course code UAI/500
Organizational form of instruction Lecture + Lesson
Level of course Master
Year of study not specified
Frequency of the course In each academic year, in the winter semester.
Semester Winter
Number of ECTS credits 4
Language of instruction English
Status of course Compulsory
Form of instruction Face-to-face
Work placements This is not an internship
Recommended optional programme components None
Course availability The course is available to visiting students
Lecturer(s)
  • Bukovský Ivo, doc. Ing. Ph.D.
  • Beránek Ladislav, prof. Ing. CSc., MBA
Course content
1. Probability, entropy
2. Elements of data compression, the source coding theorem
3. Symbol codes, stream codes
4. Noisy-channel coding, communication over a noisy channel
5.-6. Error-correcting codes and real channels
7. Coding: hash codes, binary codes
8. Good linear codes, message passing
9.-10. Communication over constrained noiseless channels
11. Elements of probability: exact marginalization (in trellises, in graphs)
12.-13. Graph codes: low-density parity-check codes, convolutional codes, turbo codes
The content of the seminars follows the lectures; practical tasks relating to the lecture topics are discussed at the tutorials.
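As a small, hedged illustration of topic 1 (probability and entropy), the following Python sketch computes the Shannon entropy of a discrete distribution; the function name and the example distributions are illustrative choices, not part of the course materials.

    import numpy as np

    def shannon_entropy(p, base=2):
        """Shannon entropy of a discrete probability distribution p (bits by default)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                      # ignore zero-probability outcomes (0 * log 0 = 0)
        return -np.sum(p * np.log(p)) / np.log(base)

    # A fair coin carries 1 bit of information per toss; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))    # 1.0
    print(shannon_entropy([0.9, 0.1]))    # ~0.469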

Learning activities and teaching methods
Monologic (reading, lecture, briefing), Work with text (with textbook, with book), Individual preparation for exam, Work with multi-media resources (texts, internet, IT technologies), Individual tutoring, Blended learning
  • Preparation for exam - 26 hours per semester
Learning outcomes
This course deals with information theory and its applications. The aim of the course is for students to gain knowledge of concepts such as entropy and information, lossless data compression, communication in the presence of noise, channel capacity, channel coding, and coding in graphs. The course includes presentations of practical applications of these concepts.
By studying the course lectures, solving lab tasks, and passing the exam, students demonstrate the ability to orient themselves in information theory and related methods and to understand their principles in connection with probability theory and machine learning, with applications to evaluating the information content of data, data relationships, data flows, coding, and data compression. With regard to machine learning, students gain fundamental knowledge for applying probabilistic and information-theoretic approaches to data evaluation and to validating the performance of machine learning algorithms. Students will be able to define these methods mathematically, apply them to case studies, and implement them in Python, either with their own code or with existing libraries (with emphasis on understanding the mathematical nature of the methods used in the libraries).
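As one possible illustration of the library-based workflow mentioned above, the following sketch estimates entropy and mutual information from a small labelled data set; the use of SciPy and scikit-learn and the toy data are assumptions made for this example, not requirements of the course.

    import numpy as np
    from scipy.stats import entropy                 # Shannon entropy of a distribution
    from sklearn.metrics import mutual_info_score   # MI between two discrete labelings

    # Toy data: class labels and a discretized feature (illustrative only).
    labels  = np.array([0, 0, 1, 1, 1, 0, 1, 0])
    feature = np.array([0, 0, 1, 1, 0, 0, 1, 1])

    # Entropy of the empirical label distribution, in bits.
    _, counts = np.unique(labels, return_counts=True)
    print(entropy(counts, base=2))                  # 1.0 for a balanced binary labeling

    # Mutual information between feature and labels (measured in nats).
    print(mutual_info_score(labels, feature))       # ~0.13 for this toy data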
Prerequisites
Knowledge of probability and statistics

Assessment methods and criteria
Oral examination, Student performance assessment, Test, Seminar work, Interim evaluation

At least 50 points are required in a scoring system consisting of:
a) max. 20 points for voluntary homework assigned during the semester and solved by the set deadline,
b) max. 20 points for a first test taken before the main test,
c) max. 70 points for the exam.
Alternatively, a) and b) can be replaced by a semester project whose topic is agreed with the teacher. During the semester, students can thus earn at most 40 points.
Grading scale (total points): F < 50, 50 < E < 60, 60 < D < 70, 70 < C < 80, 80 < B < 90, 90 < A.
A student who is not active during the semester, i.e. who does not solve the voluntary homework a) and does not take test b), can therefore obtain at best grade D from the exam alone. A student who cannot complete the homework during the semester for serious reasons may still combine a semester project with the exam and can still achieve grade A.
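A minimal sketch of the scoring arithmetic, assuming the strict grade boundaries exactly as written above; the function name is illustrative only.

    def grade(points):
        # Strict boundaries as stated above: F < 50, 50 < E < 60, 60 < D < 70,
        # 70 < C < 80, 80 < B < 90, 90 < A.
        if points > 90: return "A"
        if points > 80: return "B"
        if points > 70: return "C"
        if points > 60: return "D"
        if points > 50: return "E"
        return "F"

    # With no semester activity a student can score at most the 70 exam points,
    # which under the strict boundaries maps to D, as noted in the criteria above.
    print(grade(70))              # D
    print(grade(20 + 20 + 55))    # A: full semester points plus 55 exam points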
Recommended literature
  • COVER, T. M. and THOMAS, J. A. Elements of Information Theory. 2nd ed. Hoboken: Wiley-Interscience, 2006. ISBN 978-0-471-24195-9.
  • EL GAMAL, A. and KIM, Y.-H. Network Information Theory. Cambridge: Cambridge University Press, 2011. ISBN 978-1-107-00873-1.
  • HOST, S. Information and Communication Theory. Hoboken, NJ: Wiley-IEEE Press, 2019. ISBN 978-1-119-43378-1.
  • MACKAY, D. J. C. Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press, 2003. ISBN 978-0-521-64298-9.

