Course: Computer Architecture II

Course title Computer Architecture II
Course code UAI/682
Organizational form of instruction Lecture + Lesson
Level of course Master
Year of study not specified
Frequency of the course In each academic year, in the winter semester.
Semester Winter
Number of ECTS credits 4
Language of instruction Czech
Status of course Compulsory-optional
Form of instruction Face-to-face
Work placements This course does not include a work placement
Recommended optional programme components None
Lecturer(s)
  • Bukovský Ivo, doc. Ing. Ph.D.
  • Budík Ondřej, Ing.
Course content
1. Basic characteristics and trends in computer architecture for AI
2. Possibilities and limitations of large language models (LLM)
3. Data preparation for neural networks
4. CPU, GPU, TPU, and FPGA
5. DDR, GDDR, HBM
6. HDD, SSD, SATA, NVMe
7. CUDA and ROCm
8. Types of neural architectures
9. SW tools for neural networks
10. Principles of neural network learning
11. What are optimizers, schedulers, precision, and loss
12. Metrics in neural networks (MSE, MAE, RMSE, mIoU, etc.; see the sketch below)
13. Edge deployment options for AI
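A minimal sketch of the regression metrics named in item 12 (MSE, MAE, RMSE), assuming a plain NumPy environment; the function names and sample values are illustrative assumptions only, not part of the course materials:

import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: average of squared residuals.
    return float(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    # Mean absolute error: average of absolute residuals.
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true, y_pred):
    # Root mean squared error: square root of the MSE.
    return float(np.sqrt(mse(y_true, y_pred)))

# Illustrative values only (assumed for this sketch).
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.7])
print(mse(y_true, y_pred), mae(y_true, y_pred), rmse(y_true, y_pred))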

Learning activities and teaching methods
Laboratory, Practical training, Case studies
  • Preparation for exam - 20 hours per semester
  • Preparation for classes - 100 hours per semester
  • Preparation for credit - 30 hours per semester
Learning outcomes
To introduce students to computer architectures for artificial intelligence computing: the differences between CPU, GPU, TPU, and FPGA; what CUDA and ROCm are; a practical introduction to the principles of neural network learning; and deployment options on edge devices.

Prerequisites
  • UAI/698 Architektura počítačů I (Computer Architecture I)
  • UMB/564 Matematická analýza I (Mathematical Analysis I)
  • UAI/693 Softwarové modelování (Software Modelling)

Assessment methods and criteria
Oral examination, Interim evaluation

Active participation in the exercises. Elaboration and presentation of an independent project. Detailed rules for completing the course are always listed in Moodle (e-learning) for the given academic year.
Recommended literature
  • C. C. Aggarwal. Neural Networks and Deep Learning: A Textbook, 1st ed. Cham, Switzerland: Springer, 2018.
  • C. M. Bishop and H. Bishop. Deep Learning: Foundations and Concepts. Cham, Switzerland: Springer, 2024.
  • J. L. Hennessy and D. A. Patterson. Computer Architecture: A Quantitative Approach, 6th ed. Cambridge, MA: Morgan Kaufmann, 2017.
  • M. Justice. How Computers Really Work: A Hands-On Guide to the Inner Workings of the Machine. San Francisco: No Starch Press, 2020.

