FI:PV021 Neural Networks - Course Information
PV021 Neural Networks
Faculty of Informatics, Autumn 2024
- Extent and Intensity
- 2/0/2. 4 credit(s) (plus extra credits for completion). Recommended Type of Completion: zk (examination). Other types of completion: k (colloquium).
In-person direct teaching
- Teacher(s)
- doc. RNDr. Tomáš Brázdil, Ph.D. (lecturer)
- Mgr. Tomáš Foltýnek, Ph.D. (assistant)
- Mgr. Matej Gallo (assistant)
- Mgr. Adam Bajger (assistant)
- Mgr. Adam Ivora (assistant)
- Mgr. Petr Zelina (assistant)
- Bc. Jaroslav Kubín (assistant)
- Bc. Jozef Kraus (assistant)
- Lukáš Lejdar (assistant)
- Bc. Andrej Šimurka (assistant)
- Guaranteed by
- doc. RNDr. Tomáš Brázdil, Ph.D.
Department of Machine Learning and Data Processing – Faculty of Informatics
Contact Person: doc. RNDr. Tomáš Brázdil, Ph.D.
Supplier department: Department of Machine Learning and Data Processing – Faculty of Informatics
- Timetable
- Tue 24. 9. to Tue 17. 12. Tue 8:00–9:50 D2
- Prerequisites
- Recommended: knowledge corresponding to the courses IB031, MB152, and MB153.
- Course Enrolment Limitations
- The course is also offered to students of fields other than those with which it is directly associated.
- fields of study / plans the course is directly associated with
- there are 37 fields of study the course is directly associated with
- Course objectives
- Introduction to neural networks.
- Learning outcomes
- At the end of the course, the student will have a comprehensive knowledge of neural networks and related areas of machine learning; will be able to independently learn about and explain neural-network problems; will be able to solve practical problems using neural-network techniques, both independently and as part of a team; and will be able to critically interpret third-party solutions based on neural networks.
- Syllabus
- Basics of machine learning and pattern recognition: classification and regression problems; supervised and unsupervised learning; simple examples
- Perceptron: biological motivation; geometry
- Linear models: least squares (gradient descent, Widrow-Hoff rule); connection with maximum likelihood
- Multilayer neural networks: multilayer perceptron; loss functions; backpropagation
- Practical considerations: basic data preparation; practical techniques for learning optimization; overfitting & regularization; feature selection; applications
- Deep learning: learning in deep neural networks (vanishing gradient, pretraining with autoencoders)
- Convolutional networks
- Recurrent networks: Elman and Jordan networks, LSTM
- Transformer networks
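As a small taste of the early syllabus topics above (least squares trained by gradient descent, i.e. the Widrow-Hoff/delta rule), here is a minimal self-contained sketch; the data and all names are illustrative and are not course material.

```python
import random

# Illustrative sketch of least-squares linear regression trained by batch
# gradient descent (the Widrow-Hoff / delta rule). Not course material.

random.seed(0)

# Synthetic 1-D data drawn from y = 3x - 1 with a little Gaussian noise.
xs = [random.uniform(-1, 1) for _ in range(100)]
ys = [3.0 * x - 1.0 + random.gauss(0, 0.05) for x in xs]

w, b = 0.0, 0.0   # model: y_hat = w*x + b
lr = 0.1          # learning rate

for _ in range(500):
    # Gradients of the mean squared error over the whole batch.
    grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w   # delta-rule update
    b -= lr * grad_b

mse = sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
print(f"learned w={w:.2f}, b={b:.2f}, mse={mse:.4f}")
```

The learned parameters approach the generating values w = 3, b = -1; the multilayer-perceptron and backpropagation topics generalize exactly this update to networks with hidden layers.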
- Literature
- recommended literature
- GOODFELLOW, Ian, Yoshua BENGIO and Aaron COURVILLE. Deep Learning. MIT Press, 2016.
- not specified
- ŠÍMA, Jiří and Roman NERUDA. Teoretické otázky neuronových sítí (Theoretical Questions of Neural Networks). 1st ed. Praha: Matfyzpress, 1996, 390 pp. ISBN 80-85863-18-9.
- Teaching methods
- Theoretical lectures, group project
- Assessment methods
- Lectures, class discussion, projects. Oral examination.
- Language of instruction
- English
- Further Comments
- The course is taught annually.
- Study Materials
- Listed among pre-requisites of other courses
- Enrolment Statistics (recent)
- Permalink: https://is.muni.cz/course/fi/autumn2024/PV021