
Pattern Recognition – Neural Networks

  • Code: 5611
  • Semester: 6th
  • Type: Scientific Field Course (SFC)
  • Category: Expertise Course (EC)
  • Character: Compulsory Selective (CS), Specialization Course (SC)
  • Specialization: Hardware Engineering

Module Description

CONTENTS
• Patterns, Pattern Recognition, Classifiers.
• Artificial neuron model.
• Supervised Learning.
• Linear Classifiers, Perceptron, Delta rule, Adaline.
• Non-linear classifiers, MLP – back propagation algorithm.
• Unsupervised learning, Self-Organizing Maps (SOM).
• Recurrent artificial neural networks.
• Hopfield Networks.
• Hamming Network.
• Maxnet.
• Boltzmann machines.

DESCRIPTION

The course is an introduction to patterns and their characteristics, the objectives and importance of pattern recognition, the training of pattern vectors with or without a supervisor, and classification with neural-network classifiers.
• After presenting the biological neuron, it introduces students to the concept of the artificial neuron (architecture, layers, neurons, synapses, activation functions, learning, recall).
• It explains supervised learning and presents linear classifiers, such as the perceptron and Adaline networks, their learning and recall processes, the delta rule, and applications of these networks.
• It presents non-linear classifiers, such as the MLP network, its learning and recall processes, the back-propagation algorithm, and its applications.
• It explains unsupervised learning and presents competitive-learning networks, such as Self-Organizing Maps (SOM), and their applications.
• It explains recurrent artificial neural networks and presents Hopfield networks, Hamming networks, Maxnet, Boltzmann machines, and their applications.
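As an illustration of the delta-rule training mentioned above, the following is a minimal sketch of a single perceptron learning the (linearly separable) logical AND problem. Function names, the learning rate, and the epoch count are illustrative choices, not taken from the course materials.

```python
# Minimal perceptron trained with the delta rule on the AND problem.
# Illustrative sketch; names and hyperparameters are assumptions.

def train_perceptron(samples, targets, lr=0.1, epochs=50):
    """Return weights [w1, w2] and bias b after delta-rule updates."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0  # step activation
            err = t - y                                         # delta-rule error
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 0, 0, 1]  # logical AND: linearly separable, so the perceptron converges
w, b = train_perceptron(X, T)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees that this loop reaches a separating weight vector; the same loop would never converge on XOR, which motivates the MLP and back-propagation discussed next.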

EVALUATION

I. A written final examination (60%) on the theoretical part of the course, which includes:
– Questions on the structure of a neural network and on the training and recall algorithms.
– Questions comparing the performance of the different neural-network architectures.
– Exercises on designing simple pattern-classification networks and identifying them.
II. A progress exam (20%).
III. Two optional assignments on selected network architectures, implemented in Matlab or Java (20%).
IV. Optional projects on other methods known in the literature that cannot be covered within the course, presented in the last lecture, counting for 20% of the final grade.

Alternative Evaluation Methods

Presentation of one of the topics not covered in detail, implementation in a programming language such as Matlab or Java, comparison of selected methods and their results, presentation of the work, and an oral examination.

Module Objectives

Upon successful completion of the course the student will be able to:

1) Know the basic linear and non-linear classifiers for pattern recognition that are based on artificial neural network architectures.
2) Describe, schematically and in algorithmic form, the training and recall processes of these neural networks using a training set and recall patterns.
3) Distinguish the capabilities of each network for successful pattern recognition, depending on linear separability.
4) Distinguish between the different neural-network architectures according to the mode of training (supervised or unsupervised) and the mode of propagation: feed-forward networks learning by error correction (multi-layer perceptron, back-propagation algorithm), recurrent networks (Hopfield networks, Boltzmann machines), and competitive-learning networks (Kohonen maps).
5) Distinguish the differences, advantages, and disadvantages among the supervised-learning architectures (Perceptron, Adaline, MLP).
6) Implement the training algorithms of two different neural-network architectures in simulation applications, and evaluate the algorithms' performance on two separate sets, a training set and a test set.
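For the recurrent architectures listed in objective 4, a tiny Hopfield network illustrates Hebbian storage and recall from a corrupted probe. This is a sketch under my own assumptions (one stored pattern, synchronous updates); the names are not from the course materials.

```python
import numpy as np

# Tiny Hopfield network: store bipolar patterns with Hebbian weights,
# then recover a stored pattern from a corrupted probe.
# Illustrative sketch; function names are assumptions.

def hebbian_weights(patterns):
    """Hebbian outer-product weight matrix with no self-connections."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for p in patterns:
        p = np.asarray(p, dtype=float)
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)  # a Hopfield net has no self-feedback
    return W / len(patterns)

def recall(W, probe, steps=5):
    """Synchronous sign updates; ties broken toward +1."""
    s = np.asarray(probe, dtype=float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0
    return s

stored = [1, 1, -1, -1, 1, -1]
W = hebbian_weights([stored])
noisy = [1, -1, -1, -1, 1, -1]  # one flipped component
out = recall(W, noisy)
```

With a single stored pattern and one flipped bit, every unit's net input agrees in sign with the stored pattern, so the network settles back to it; this energy-minimizing recall is what distinguishes recurrent networks from the feed-forward classifiers above.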

Bibliography

a) Greek
• K. Diamantaras, “Artificial Neural Networks”, Prentice Hall, 2007.
b) International
• Simon O. Haykin, “Neural Networks and Learning Machines”, 3rd Edition, Prentice Hall, 2009.
• John Hertz, Anders Krogh, Richard G. Palmer, “Introduction to the Theory of Neural Computation”, Addison-Wesley, 1991.
