FI:PV021 Neural Networks - Course Information
PV021 Neural Networks
Faculty of Informatics, Autumn 2018
- Extent and Intensity
- 2/0/2. 4 credit(s) (plus extra credits for completion). Recommended Type of Completion: zk (examination). Other types of completion: k (colloquium).
- Teacher(s)
- doc. RNDr. Tomáš Brázdil, Ph.D. (lecturer)
Mgr. Jiří Vahala (assistant)
- Guaranteed by
- prof. RNDr. Mojmír Křetínský, CSc.
Department of Computer Science – Faculty of Informatics
Contact Person: doc. RNDr. Tomáš Brázdil, Ph.D.
Supplier department: Department of Computer Science – Faculty of Informatics
- Timetable
- Fri 8:00–9:50 A217
- Prerequisites
- Recommended: knowledge corresponding to the courses MB102 and MB103.
- Course Enrolment Limitations
- The course is also offered to students of fields other than those it is directly associated with.
- Fields of study / plans the course is directly associated with
- Applied Informatics (programme FI, B-AP)
- Applied Informatics (programme FI, N-AP)
- Information Technology Security (eng.) (programme FI, N-IN)
- Information Technology Security (programme FI, N-IN)
- Bioinformatics (programme FI, B-AP)
- Bioinformatics (programme FI, N-AP)
- Information Systems (programme FI, N-IN)
- Informatics with another discipline (programme FI, B-EB)
- Informatics with another discipline (programme FI, B-FY)
- Informatics with another discipline (programme FI, B-GE)
- Informatics with another discipline (programme FI, B-GK)
- Informatics with another discipline (programme FI, B-CH)
- Informatics with another discipline (programme FI, B-IO)
- Informatics with another discipline (programme FI, B-MA)
- Informatics with another discipline (programme FI, B-TV)
- Public Administration Informatics (programme FI, B-AP)
- Mathematical Informatics (programme FI, B-IN)
- Parallel and Distributed Systems (programme FI, B-IN)
- Parallel and Distributed Systems (programme FI, N-IN)
- Computer Graphics and Image Processing (programme FI, B-IN)
- Computer Graphics (programme FI, N-IN)
- Computer Networks and Communication (programme FI, B-IN)
- Computer Networks and Communication (programme FI, N-IN)
- Computer Systems and Data Processing (programme FI, B-IN)
- Computer Systems (programme FI, N-IN)
- Embedded Systems (eng.) (programme FI, N-IN)
- Programmable Technical Structures (programme FI, B-IN)
- Embedded Systems (programme FI, N-IN)
- Service Science, Management and Engineering (eng.) (programme FI, N-AP)
- Service Science, Management and Engineering (programme FI, N-AP)
- Social Informatics (programme FI, B-AP)
- Theoretical Informatics (programme FI, N-IN)
- Upper Secondary School Teacher Training in Informatics (programme FI, N-SS)
- Artificial Intelligence and Natural Language Processing (programme FI, B-IN)
- Artificial Intelligence and Natural Language Processing (programme FI, N-IN)
- Image Processing (programme FI, N-AP)
- Course objectives
- At the end of the course, students will have a comprehensive knowledge of neural networks and related areas of machine learning. They will be able to independently study and explain problems in neural networks, to solve practical problems using neural-network techniques both independently and as part of a team, and to critically interpret third-party solutions based on neural networks.
- Syllabus
- Basics of machine learning and pattern recognition: classification and regression problems; cluster analysis; supervised and unsupervised learning; simple examples
- Perceptron: biological motivation; geometry
- Linear models: least squares (pseudoinverse, gradient descent, Widrow-Hoff rule); connection with Bayes classifier; connection with maximum likelihood; regularization; bias-variance decomposition (see the least-squares sketch after the syllabus)
- Multilayer neural networks: multilayer perceptron; loss functions; backpropagation (see the backpropagation sketch after the syllabus)
- Practical considerations: basic data preparation; practical techniques for improving backpropagation; bias & variance tradeoff; overfitting; feature selection; applications
- Hopfield network: Hebb's rule; energy; capacity (see the Hopfield sketch after the syllabus)
- Deep learning: restricted Boltzmann machines (sampling, maximum-likelihood learning, contrastive divergence learning); learning in deep neural networks (vanishing gradient, pretraining with autoencoders, deep belief networks)
- Convolutional networks
- Recurrent networks: Elman and Jordan networks, LSTM
- Clustering: density estimation; self-organizing maps (see the self-organizing map sketch after the syllabus)
- Project: Software implementation of particular models and their simple applications.
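- A minimal sketch of the least-squares item above, assuming NumPy: the same linear model is fitted once with the Moore-Penrose pseudoinverse and once with the online Widrow-Hoff (LMS) rule; the synthetic data, learning rate and epoch count are illustrative choices.

```python
# Fit a linear model by least squares in two ways:
# (1) closed form via the Moore-Penrose pseudoinverse,
# (2) online gradient descent with the Widrow-Hoff (LMS) rule.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 samples, 3 features (synthetic)
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)   # targets with a little noise

# (1) Closed form: w = pinv(X) @ y
w_pinv = np.linalg.pinv(X) @ y

# (2) Widrow-Hoff rule: w <- w + eta * (y_i - w.x_i) * x_i, one sample at a time
w = np.zeros(3)
eta = 0.01
for epoch in range(50):
    for x_i, y_i in zip(X, y):
        w += eta * (y_i - w @ x_i) * x_i

print(w_pinv)   # both estimates should be close to true_w
print(w)
```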
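- A minimal sketch of backpropagation in a one-hidden-layer perceptron, assuming NumPy; the XOR data, layer sizes, activations and learning rate are illustrative, not prescribed by the course.

```python
# One-hidden-layer perceptron trained by backpropagation on XOR.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])            # XOR targets

W1 = rng.normal(scale=1.0, size=(2, 4))           # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1))           # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 0.5
for step in range(10000):
    # forward pass
    h = np.tanh(X @ W1 + b1)                      # hidden activations
    out = sigmoid(h @ W2 + b2)                    # network output
    # backward pass: gradients of the cross-entropy loss
    d_out = out - y                               # error at the output pre-activation
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)         # backpropagated through tanh
    W2 -= eta * h.T @ d_out
    b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * X.T @ d_h
    b1 -= eta * d_h.sum(axis=0)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```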
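- A minimal sketch of a Hopfield network, assuming NumPy: two bipolar patterns are stored with Hebb's rule and a corrupted one is recalled by asynchronous updates, with the energy reported; the patterns and the number of sweeps are illustrative.

```python
# Hopfield network: store two bipolar patterns with Hebb's rule,
# then recall a corrupted pattern by asynchronous updates.
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
n = patterns.shape[1]

# Hebb's rule: W = (1/n) * sum_p x_p x_p^T, with a zero diagonal
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

def energy(s):
    # E(s) = -1/2 * s^T W s; it never increases under asynchronous updates
    return -0.5 * s @ W @ s

def recall(s, sweeps=5, rng=np.random.default_rng(2)):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):      # update units one at a time, random order
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

noisy = patterns[0].copy()
noisy[7] = -noisy[7]                      # flip one bit of the first pattern
recovered = recall(noisy)
print(recovered, energy(recovered))       # should match patterns[0]
```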
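- A minimal sketch of a one-dimensional self-organizing map, assuming NumPy; the data, grid size and the learning-rate and neighbourhood schedules are illustrative.

```python
# One-dimensional self-organizing map fitted to points in the unit square.
import numpy as np

rng = np.random.default_rng(3)
data = rng.uniform(size=(500, 2))              # 2-D inputs (synthetic)
k = 10                                         # number of map units on a line
W = rng.uniform(size=(k, 2))                   # codebook vectors

T = 2000
for t in range(T):
    x = data[rng.integers(len(data))]
    winner = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
    eta = 0.5 * (1.0 - t / T)                        # decaying learning rate
    sigma = max(0.5, 3.0 * (1.0 - t / T))            # shrinking neighbourhood width
    h = np.exp(-((np.arange(k) - winner) ** 2) / (2 * sigma ** 2))
    W += eta * h[:, None] * (x - W)                  # pull units toward the sample

print(W)   # neighbouring units should end up with similar codebook vectors
```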
- Literature
- recommended literature
- GOODFELLOW, Ian, Yoshua BENGIO and Aaron COURVILLE. Deep Learning. MIT Press, 2016.
- Teaching methods
- Theoretical lectures, group project
- Assessment methods
- Lectures, class discussion, and group projects (approx. 3 people per project); several midterm progress reports on the respective projects, a final project presentation, and an oral examination.
- Language of instruction
- English
- Further Comments
- Study Materials
The course is taught annually.
- Listed among pre-requisites of other courses
- Enrolment Statistics (Autumn 2018, recent)
- Permalink: https://is.muni.cz/course/fi/autumn2018/PV021