CMPE-452/3.0 (36L; 84P)

Neural and Genetic Computing

Fall 2022 (Sep 6 - Dec 5)

 

Prerequisites: Linear Algebra, CISC 235/3.0 or ELEC 278/3.0, and programming experience.

Recommended knowledge: Any of the COGS courses (100 or 201), or PSYC 100, PSYC 221, or PHIL 270

Exclusion(s): COGS 400/3.0 and CISC 452/3.0 (same as COGS 400 and CISC 452)

 

Conflicts with other course schedules will not be allowed

 

Instructor:

Farhana H. Zulkernine, PhD, PEng

Associate Professor

Coordinator, Cognitive Science Program

Director, Big Data Analytics and Management (BAM) Laboratory
637 Goodwin Hall, School of Computing, Queen's University
Kingston, Ontario, Canada K7L 2N8

E-mail: farhana dot zulkernine at queensu dot ca

Website: http://research.cs.queensu.ca/home/farhana/
Tel: (613)533-6426


 

Humans continuously learn and demonstrate intelligent behaviour. We use various mechanisms to collect information, analyze it, and produce output; if the output is not satisfactory, corrections are made the next time to produce a better one. Artificial Neural Networks (ANNs) are computational network structures based on this concept and on the structure of brain cells called neurons. An ANN is composed of artificial neurons that work like biological neurons: each performs a very simple task, but when connected as a network they are able to learn gradually, as humans do, enabling machine intelligence. A Genetic Algorithm (GA) is based on the concepts of evolution, where better solutions are derived over many iterations, or generations, of trial and error. The process starts with a random set of solutions that is gradually refined by creating new solutions from the old ones until no further significant improvement can be achieved.

These machine learning techniques show promising results in problem domains where all the data are not known in advance and where machines must continuously learn to adapt to changing stimuli or situations. Brain-like ANNs have been used to model human cognition, perception, and memory. Recently, these techniques have also been applied to train and refine the knowledge of robots used in many critical applications, such as speech, text, and sensor data analytics and computer vision. In this course you will learn different ANNs and GAs, models of ANNs for human cognition, and how these techniques are used in problem solving such as classification, clustering, optimization, and data reduction. You will need to develop software programs (in any programming language, such as Python, C, C++, Matlab, or Java) to implement ANN models and to train and test them.

By the end of the course, you will know the significant ANN and GA techniques, have experience writing software programs that implement some of these algorithms to solve practical problems, and have an idea of the ongoing research in this area across various application domains.
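As a small taste of the kind of program you will write, the sketch below trains a single artificial neuron (a perceptron) on the logical AND function by error correction, in Python. This is an illustrative example only, not course material; the learning rate, epoch count, and variable names are arbitrary choices.

```python
# A single perceptron learning logical AND by error correction.

def step(x):
    """Threshold activation: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

# Training data: ((input1, input2), target) pairs for logical AND
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for epoch in range(20):
    for (x1, x2), target in data:
        y = step(w[0] * x1 + w[1] * x2 + b)
        error = target - y
        # Error-correction rule: nudge weights toward the target output
        w[0] += lr * error * x1
        w[1] += lr * error * x2
        b += lr * error

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, a few epochs of this rule are enough for the weights to converge; a full course assignment would of course involve larger networks and datasets.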

 

Intended Student Learning Outcomes

Please refer to https://www.cs.queensu.ca/students/undergraduate/CLO.php#CISC452

 

Syllabus

For standard syllabus elements, please refer to School of Computing website (https://www.cs.queensu.ca/students/undergraduate/syllabus/year2021-22.php)

Unit 1: Biological neurons and the evolution of Artificial Neural Network (ANN) models. Example of an ANN model applied to human cognition. General architecture and concepts behind ANN and its application in machine learning. Learning algorithms for different machine learning applications such as classification, clustering, associations, predictions, image processing and compression, and dimensionality reduction.
Unit 2: Supervised learning in ANNs. Learning algorithms primarily include error correction (back-propagation) and feedback learning in multilayer feed-forward, recurrent, Radial Basis Function, and adaptive networks. Engineering ANNs: design techniques including network pruning algorithms for adaptive networks, momentum, and weight decay.
Unit 3: Unsupervised learning in ANNs. Kohonen, Adaptive Resonance Theory, Principal Component Analysis networks. Application of competitive learning, and Hebbian or reinforcement learning.
Unit 4: Associative networks. Brain-State-in-a-Box (BSB), Hopfield, and Self-Organizing Map (SOM) networks.
Unit 5: Optimization algorithms for ANN such as Simulated Annealing and Genetic Algorithms.
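To make the evolutionary idea behind the Genetic Algorithms of Unit 5 concrete, here is a minimal sketch (illustrative only, not course material) that evolves a population of real numbers to maximize f(x) = -(x - 3)^2; the population size, averaging crossover, and Gaussian mutation scale are assumed design choices.

```python
# Minimal genetic-algorithm sketch: maximize f(x) = -(x - 3)^2.
import random

random.seed(0)  # fixed seed so the run is repeatable

def fitness(x):
    return -(x - 3.0) ** 2  # best possible value is 0, at x = 3

# Generation 0: a random set of candidate solutions
population = [random.uniform(-10, 10) for _ in range(20)]

for generation in range(100):
    # Selection: keep the fitter half as parents
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    # Crossover (averaging two parents) plus small Gaussian mutation
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        children.append((a + b) / 2 + random.gauss(0, 0.1))
    population = parents + children  # next generation

best = max(population, key=fitness)
print(best)  # close to 3.0
```

The refinement stops improving once the population clusters around the optimum, mirroring the "no further significant improvement" stopping idea described earlier in this outline.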

 

 

Class time and location
Days & Times              Room              Meeting Dates
Tuesday   11:30-12:30     HUMPHREY AUD      Sep 6 - Dec 5
Wednesday 13:30-14:30     ETHERINGTON AUD   Sep 6 - Dec 5
Friday    12:30-13:30     HUMPHREY AUD      Sep 6 - Dec 5

 

Course Details

Details about the course content can be found at the OnQ website.

 

Grade Distribution

 

Grade Categories                                                                              % of Final Grade
Coding Assignments (3)                                                                        35%
Deep Learning Group Project (includes literature review, code implementation,
reports, talk and demo)                                                                       35%
Written in-class quizzes (2)                                                                  30%


Textbook

Neural Networks and Deep Learning

By Charu C. Aggarwal

e-Textbook, Springer, 2018, https://link.springer.com/book/10.1007/978-3-319-94463-0

Other References

Neural Networks and Deep Learning

By Michael Nielsen

Free online e-book and reference study material, http://neuralnetworksanddeeplearning.com/


Deep Learning

By Ian Goodfellow, Yoshua Bengio and Aaron Courville

e-book (free) http://www.deeplearningbook.org/


Research Papers

Research papers for advanced reading and presentation can be selected from IEEE Transactions on Neural Networks and Learning Systems. You can also search Google Scholar and other reputable publication venues (IEEE and ACM journals and conference proceedings) for research papers in the areas of NNs and GAs.



Teaching Assistants

TBD

***Note: Students are advised to use the discussion forum on the OnQ site for general inquiries before making an appointment with a TA.

Last updated: Aug 13, 2021 by Farhana Zulkernine