Neural Nets (596)
Type: | Lecture with Exercises |
ECTS: | 6 points |
Lectures: | Fridays, 08:30-12:00 |
Venue: | BIN 2.A.01 |
Lecturers: | Prof. Rolf Pfeifer |
Target audience: | Recommended for MSc students. The course is interdisciplinary and also targeted at students from fields other than computer science, e.g. economics, biology, the natural sciences, and psychology. |
Language: | English |
Assessment: | Written exam. Date: Friday, 14.06.2013, 8:00 - 9:30am |
Assistants: | Nico Schmidt, Matthias Weyland |
Systematic introduction to neural networks and their biological foundations; important network classes and learning algorithms: supervised models (perceptrons, adalines, multi-layer perceptrons), support vector machines, echo-state networks, unsupervised networks (competitive learning, Kohonen maps, Hebbian learning), recurrent networks (Hopfield nets, CTRNNs - continuous-time recurrent neural networks), spiking neural networks, spike-timing-dependent plasticity, and applications. Special consideration will be given to neural networks embedded in adaptive systems that have to interact with the real world, such as embodied systems (in particular robots): the cooperation of neural control, morphology, materials, and environment. Further topics are evolutionary approaches to designing autonomous systems, the interaction of learning and evolution, and network theory applied to brain networks (motifs).
Additional case studies will be discussed to deepen the understanding of neural networks, e.g. neural interfacing - coupling neural systems with technology (in particular robotic devices), neural imaging studies, Distributed Adaptive Control (DAC), neural gas, DRNNs - Dynamically Rearranging Neural Networks (neuromodulator-based networks), and neural network models of memory.
This is an elementary, interdisciplinary introduction to neural networks, suited not only for computer scientists, but also for economists, biologists, psychologists, etc.
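As a flavour of the simplest supervised model treated in the course, here is a minimal Python sketch of a single perceptron trained with the delta rule on the logical AND function. The data set, learning rate, and variable names are illustrative choices for this sketch, not course material.

```python
import numpy as np

# Toy data: logical AND, with a constant bias input of 1 appended to each pattern.
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)   # target outputs

w = np.zeros(3)   # weights; the last entry acts as the bias weight
eta = 0.1         # learning rate

for epoch in range(20):
    for x, target in zip(X, t):
        y = 1.0 if np.dot(w, x) > 0 else 0.0   # threshold unit
        w += eta * (target - y) * x            # delta rule: w <- w + eta * (t - y) * x

print(w)
print([1.0 if np.dot(w, x) > 0 else 0.0 for x in X])   # should reproduce the AND targets
```

Because AND is linearly separable, the weights converge after a few epochs; multi-layer perceptrons and back-propagation, covered later in the course, remove this restriction.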
If you wish, you can do the exercises in groups of two; in that case, please hand in only one task sheet per group. You can also do them on your own if you prefer.
Important
This page will change over time. Please stay up to date by checking it periodically.
Final Exam
List of topics relevant for the final exam (PDF, 69 KB)
The final exam will take place on Friday, 14 June 2013, from 8:00 to 9:30am in lecture room BIN 2.A.01.
To be admitted to the final exam, you must achieve at least 50% of the total possible points of all exercises (Task Sheets 1-4) combined (not 50% on each sheet).
It will not be an open-book exam, so you are not allowed to use your books or notes. However, you do not have to learn any formulas by heart: we will provide a sheet containing all formulas (but you do have to know which formula applies to which type of neural net). No laptops or cell phones are allowed.
Please bring along:
- simple pocket calculator (no cell phones or other advanced programmable devices)
- your student ID card (required)
We wish you a lot of success.
Date | Topic | Lecture | Exercises |
---|---|---|---|
22 February | Introduction, Linear Algebra | For an intuitive, 50-minute introduction to artificial neural networks, please consult this video, which was recorded in the context of the ShanghAI Lectures. All the points raised in this video will be taken up again and discussed in more detail later in the class. Please also consult the PDF of the slides for this lecture. Neural networks require relatively little prior knowledge in mathematics, just some linear algebra and a bit of elementary calculus. For those who are not confident about their linear algebra skills, we will provide an introductory tutorial - including a set of exercises - during the first lecture on 22 February, starting at approximately 10:00. If you are confident that you already master basic linear algebra, you do not need to attend this tutorial. | |
1 March | Supervised models | Perceptron, Adaline, delta rule | |
8 March | No lecture | Special event: Robots on Tour www.robotsontour.com | |
15 March | Supervised models | Back-propagation: examples, properties; Error surfaces, Momentum term, Other improvements; N-fold cross-validation, VC dimension | (Due date: 12 April) |
22 March | Supervised models | Cascade correlation, Support vector machines (SVMs) | |
29 March | No lecture | | |
5 April | No lecture | | |
12 April | Supervised models | Cascade correlation, Support vector machines (SVMs) | hand in Task Sheet 1 (Due date: 26 April) |
19 April | Recurrent neural networks | Hopfield nets (see the Hopfield sketch below the schedule), Stochastic models, CTRNNs (Continuous-Time Recurrent Neural Networks) | |
26 April | Hybrid models | Guest lecture: Naveen Kuppuswamy: Reservoir computing | hand in Task Sheet 2; Videos of Lectures by Andrew Ng on SVMs: 1, 2, 3 |
3 May | Unsupervised models | Nico Schmidt on Hebbian Learning, PCA, Oja's rule, Sanger's rule | |
10 May | Biologically more plausible models | Guest lecture: Pascal Kaufmann: Basic neurophysiology, Spiking neurons, Cyborgs, Lamprey experiment, Brain imaging | Simulator (ZIP, 315 KB) |
17 May | Unsupervised models | Competitive learning, SOM, Kohonen algorithm, Extended Kohonen map (robot arm), Adaptive light compass | It is OK to hand in Task Sheet 3 on 17 May! |
24 May | Application of recurrent networks | Morphological Computation, Evolutionary Robotics, Co-evolution of morphology and control | hand in Task Sheet 4 |
31 May | Wrap-up | Wrap-up session, questions, final discussion | |
14 June | Exam | | |
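To complement the 19 April session on recurrent networks, here is a minimal Python sketch of Hopfield-style pattern completion: a few patterns are stored with a Hebbian rule, and a corrupted pattern is recovered by asynchronous threshold updates. The network size, number of patterns, and noise level are illustrative assumptions, not taken from the lecture material.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                                        # number of binary (+1/-1) units
patterns = rng.choice([-1, 1], size=(3, n))    # three random patterns to store

# Hebbian storage: W = (1/n) * sum_p x_p x_p^T, with self-connections removed.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)

# Start from a corrupted copy of the first pattern (20 of the 100 bits flipped).
state = patterns[0].copy()
state[rng.choice(n, size=20, replace=False)] *= -1

# Asynchronous updates: each unit takes the sign of its weighted input.
for _ in range(5):
    for i in rng.permutation(n):
        state[i] = 1 if W[i] @ state >= 0 else -1

print(np.mean(state == patterns[0]))           # should be (close to) 1.0
```

With only three stored patterns in a 100-unit network, the load is well below the capacity of roughly 0.14 n patterns, so the corrupted input is almost certainly completed back to the stored pattern.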
Recommended Literature:
- J. Hertz, A. Krogh, R. Palmer, "Introduction to the Theory of Neural Computation", Addison-Wesley Publishing Company. A "classic"; a bit mathematical, but sound, written by physicists. Recommended as a complement to the lecture script. It covers most, but not all, topics of the class (it does not include, e.g., support vector machines or spiking neurons).
- S. Haykin, "Neural Networks: A comprehensive foundation", Prentice Hall. Very comprehensive, covers most of the topics of the class. Can also be used as an introductory textbook and as a complement to the class. It also introduces quite a few topics that go beyond the class.
- N. Cristianini, J. Shawe-Taylor, "An Introduction to Support Vector Machines and other kernel-based learning methods", Cambridge University Press. Nice introduction to kernel-based learning machines. Mainly for the mathematically minded student. Support Vector Machines will be covered in class and are included in the book by Haykin.
- R. Rojas, "Neural Networks - A Systematic Introduction", Springer-Verlag, 1996. A nicely and comprehensively written overview of the field with robotics application examples. The book can be downloaded for free here: http://www.inf.fu-berlin.de/inst/ag-ki/rojas_home/pmwiki/pmwiki.php?n=Books.NeuralNetworksBook
Materials that are useful for understanding the course:
- The Neural Network script (PDF, 8 MB)
- Lecture slides (PDF, 488 KB) from Qian Zhao about Cascade Correlation
- Paper (PDF, 171 KB) about Cascade Correlation
- NeuralNetworks Matlab demo (ZIP, 3 MB)
- GreekAlphabet (PDF, 28 KB)
- Linear Algebra Collection of formulas (PDF, 20 KB)
- Java NN-Simulator and Cascade Correlation applet (ZIP, 315 KB)
- NN-Simulator Manual (PDF, 1 MB)
- ART (adaptive resonance theory) article by Gail A. Carpenter and Stephen Grossberg
- Support Vector Machines: a nice and short introduction to SVMs (PDF, 941 KB)
- What is a support vector machine? (PDF, 242 KB)
- SOM2D Demo - Python implementation (ZIP, 2 KB) - simple code and visualization of a 2D SOM.
- Tutorial on training recurrent neural networks (echo state network) (PDF, 1 MB)
- MLP demo: Recognition of handwritten digits (ZIP, 391 KB) (using the Matlab Neural Network Toolbox)
- PCA demo (ZIP, 1 KB): use Hebbian learning to find the principal components (Matlab script); see the Python sketch after this list for the same idea
- PCA tutorial (PDF, 323 KB) by Jonathon Shlens
- Reservoir Computing Slides (PDF, 6 MB) by Naveen Kuppuswamy
- The Limits of Intelligence (PDF, 1 MB) by Douglas Fox
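For readers without Matlab, the following is a minimal Python sketch of the idea behind the PCA demo above: Oja's Hebbian learning rule applied to zero-mean 2-D data, which drives the weight vector of a single linear neuron towards the first principal component. The toy data set, learning rate, and number of passes are illustrative assumptions and are not taken from the course demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2-D toy data: most of the variance lies along the 45-degree direction.
n = 1000
raw = rng.normal(size=(n, 2)) * np.array([1.0, 0.3])
angle = np.pi / 4
rot = np.array([[np.cos(angle), -np.sin(angle)],
                [np.sin(angle),  np.cos(angle)]])
data = raw @ rot.T
data -= data.mean(axis=0)

w = rng.normal(size=2)              # weights of a single linear neuron
eta = 0.01                          # learning rate

for epoch in range(5):
    for x in data:
        y = np.dot(w, x)            # neuron output
        w += eta * y * (x - y * w)  # Oja's rule: Hebbian term plus weight decay

# The weight vector should align (up to sign) with the first principal component.
eigvals, eigvecs = np.linalg.eigh(np.cov(data.T))
print(w / np.linalg.norm(w))
print(eigvecs[:, np.argmax(eigvals)])
```

Up to sign, the learned weight vector matches the leading eigenvector of the data covariance matrix; Sanger's rule extends the same idea to several output neurons and several principal components.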
Links
Some links on the internet that are useful for understanding the course:
- Cascade-Correlation Tutorial
- Cascade-Correlation Wikipedia
- Pattern recognition applet (in Japanese)
- Multilayer perceptron applet
- Support Vector Machines video tutorial 1 by Colin Campbell
- Support Vector Machines video tutorial 2 by Chih-Jen Lin
- Support Vector Machine Java Applet 1
- Support Vector Machine Java Applet 2
- Pattern completion using a Hopfield net
- Pattern recognition applet using a Hopfield net
- Travelling Salesman Problem: using a Hopfield network to find possible solutions
- Self-Organizing Maps: Kohonen Network (see the Python sketch at the end of this page for the basic update rule)
- Travelling Salesman Problem: With a Kohonen network you can get quite satisfying results
- Self-Organizing Maps: Kohonen Network in 3D
- NERO: Neuro Evolving Robotics Operatives
- SVM Toy, from National Taiwan University
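To complement the Kohonen/SOM demos linked above, here is a minimal Python sketch of the Kohonen update rule: a one-dimensional chain of units adapts to inputs drawn uniformly from the unit square, so that neighbouring units end up responding to nearby inputs. The map size, learning-rate schedule, and neighbourhood schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 20                                  # one-dimensional chain of map units
weights = rng.uniform(size=(n_units, 2))      # each unit has a 2-D weight vector
n_steps = 5000

for t in range(n_steps):
    x = rng.uniform(size=2)                                    # input from the unit square
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))    # best-matching unit

    # Learning rate and neighbourhood width both shrink over time.
    eta = 0.5 * (0.01 / 0.5) ** (t / n_steps)
    sigma = 5.0 * (0.5 / 5.0) ** (t / n_steps)

    # Gaussian neighbourhood on the chain: units close to the winner move more.
    dist = np.arange(n_units) - winner
    h = np.exp(-dist**2 / (2 * sigma**2))
    weights += eta * h[:, None] * (x - weights)   # Kohonen update rule

print(weights[:5])   # neighbouring units should end up with similar weight vectors
```

The shrinking Gaussian neighbourhood is what produces the topology-preserving ordering along the chain; with a vanishing neighbourhood the algorithm reduces to plain competitive learning.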