Fall 2022 (Sep 6 - Dec 5)
Prerequisites: Linear Algebra and programming experience.
Token Type: Theory, Applications
Instructor:
Farhana H. Zulkernine, PhD, PEng
Associate Professor
Coordinator, Cognitive Science Program
Director, Big Data Analytics and Management (BAM) Laboratory
637 Goodwin Hall, School of Computing, Queen's University
Kingston, Ontario, Canada K7L 2N8
E-mail: farhana dot zulkernine at queensu dot ca
Website: http://research.cs.queensu.ca/home/farhana/
Tel: (613) 533-6426
Theoretical foundation and practical applications of Artificial Neural Networks (ANN) and Cognitive Computing (CC) models. Paradigms of neural computing algorithms using attention and context embedding models, applications in cognitive modeling, artificial intelligence, and machine learning with multi-stream data processing techniques.
Prerequisites: Knowledge of linear algebra
Preferred: Knowledge of Cognitive Science
Students are expected to spend 120 hours per term in lecture and practice.
Course Learning Outcomes:
Unit | Topics
---|---
Unit 1 | Biological neurons and the evolution of Artificial Neural Network (ANN) models. Example ANN models for human and machine cognition. General architecture and concepts behind ANNs, design challenges, activation functions, learning algorithms, optimization, and applications in machine learning. Special focus on computer vision, IoT, text, and voice analytics. (A brief illustrative code sketch follows this table.)
Unit 2 | Supervised learning in ANNs. Learning sequence and spatial feature extraction for decision making, and combining multiple data modalities. Radial Basis Function and adaptive networks, and advanced optimization using momentum and regularization. Application of genetic algorithms. Class presentation on recent work.
Unit 3 | Application of competitive learning to implement unsupervised learning in ANNs. Kohonen, Adaptive Resonance Theory, and Principal Component Analysis networks. Semi-supervised network models. Class presentation on recent work.
Unit 4 | Application of Hebbian or reinforcement learning, associative networks, memory modeling, memory decay, cognitive modeling, and deep neural networks for modeling attention. Class presentation on recent work.
Unit 5 | Project presentations.
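As a quick illustration of two Unit 1 topics (activation functions and gradient-based learning), here is a minimal sketch of a single artificial neuron with a sigmoid activation and one stochastic-gradient-descent update for a squared-error loss. It is not course material; all names and values are illustrative assumptions.

```python
import math

def sigmoid(z: float) -> float:
    """Logistic activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(weights, bias, inputs):
    """Weighted sum of inputs plus bias, passed through the activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

def sgd_step(weights, bias, inputs, target, lr=0.1):
    """One gradient-descent update for L = 0.5 * (y - target)^2."""
    y = neuron_output(weights, bias, inputs)
    # dL/dz = (y - target) * sigmoid'(z), with sigmoid'(z) = y * (1 - y)
    delta = (y - target) * y * (1.0 - y)
    new_weights = [w - lr * delta * x for w, x in zip(weights, inputs)]
    new_bias = bias - lr * delta
    return new_weights, new_bias

if __name__ == "__main__":
    # Illustrative values only: two inputs, target output 1.0
    w, b = [0.5, -0.3], 0.1
    x, t = [1.0, 2.0], 1.0
    for _ in range(5):
        w, b = sgd_step(w, b, x, t)
    print("output after 5 updates:", round(neuron_output(w, b, x), 4))
```

Repeated updates move the neuron's output toward the target; the course units above cover this idea in full generality for multi-layer networks and other learning paradigms.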
Class time and location
Days & Times | Room | Meeting Dates
---|---|---
Wednesday 4:30pm - 6:00pm | Kingston RM 112 | Sep 6 - Dec 5
Thursday 4:00pm - 5:30pm | Goodwin 247 | Sep 6 - Dec 5
Course Details
Details about the course content can be found at the OnQ website.
Grade Distribution (Weekly Syllabus)
Grade Categories | % of Final Grade
---|---
Coding assignments | 25%
Written in-class quizzes (2) | 30%
Deep Learning Group Project (includes literature review, code implementation, reports, talk, and demo) | 45%
Text Books
Elements of Artificial Neural Networks, by Mehrotra, Mohan, and Ranka, 2nd Edition (ebook available to order using the following link or from Amazon.ca)
Neural Networks and Deep Learning: A Textbook, by Charu C. Aggarwal, e-textbook, Springer, 2018, https://link.springer.com/book/10.1007/978-3-319-94463-0
Other References
Neural Networks and Deep Learning, by Michael Nielsen, free online e-book (reference study material), http://neuralnetworksanddeeplearning.com/
Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, free e-book, http://www.deeplearningbook.org/