CISC-874/3.0 (36L, 84P)

Foundations of Neural Networks

Winter 2021

Prerequisites:  Linear Algebra and programming experience.


Farhana H. Zulkernine, PhD, PEng

Associate Professor

Coordinator, Cognitive Science Program

Director, Bigdata Analytics and Management (BAM) Laboratory
637 Goodwin Hall, School of Computing, Queen's University
Kingston, Ontario, Canada K7L 2N8

E-mail: fhz at queensu dot ca

Tel: (613)533-6426


Course Description

Theoretical foundation and practical applications of Artificial Neural Networks (ANN) and Cognitive Computing (CC) models. Paradigms of neural computing algorithms using attention and context embedding models, applications in cognitive modeling, artificial intelligence, and machine learning with multi-stream data processing techniques.


Prerequisites: Linear algebra and programming experience


Time Commitment

Students are expected to spend 120 hours per term on lectures and practical work.


Learning Outcomes

Course Learning Outcomes:


  1. Explain foundational concepts such as the operation of biological neurons and learning in artificial intelligence (AI), as influenced by cognitive modeling theories such as perceptual bias, memory, attention, context embedding, and belief.
  2. Apply theoretical knowledge in developing computational models for cognitive modeling, language understanding, decision support, behavior analysis, question answering, image processing, and action recognition.
  3. Explore and critically analyze recent research on cognitive modeling to explain human cognition and memory using data from social networks, application of deep neural models in computer vision, language understanding in intelligent chatbots, and multi-sensor stream data processing for predictive analytics and decision support.
  4. Explain the power and limitations of neural cognitive systems.



Unit 1: Biological neurons and the evolution of Artificial Neural Network (ANN) models. Example of an ANN model applied to human cognition. General architecture and concepts behind ANN and its application in machine learning. Learning algorithms for different machine learning applications such as classification, clustering, associations, predictions, image processing and compression, and dimensionality reduction.
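The basic ANN building block introduced in Unit 1 can be sketched as a single artificial neuron: a weighted sum of inputs passed through an activation function. This is a minimal illustration only; the weights and inputs below are made up for demonstration.

```python
import numpy as np

def neuron(x, w, b):
    """A single artificial neuron: weighted sum of inputs plus a bias,
    passed through a sigmoid activation (output between 0 and 1)."""
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative values (not from the course): three inputs, fixed weights.
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.3, -0.1])
b = 0.1
print(neuron(x, w, b))
```

Biological neurons behave analogously: incoming signals are summed at the cell body and an output spike is produced once a threshold is crossed, which the smooth sigmoid approximates.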
Unit 2: Supervised learning in ANNs. Learning algorithms primarily include error correction (backpropagation) and feedback learning in multilayer feed-forward, recurrent, Radial Basis Function, and adaptive networks. Engineering ANNs: design techniques include network pruning for adaptive networks, momentum, and weight decay.
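The error-correction learning named in Unit 2 can be sketched for a single sigmoid unit trained by gradient descent on a squared error. This is a hedged minimal example, not the course's implementation; the learning rate, initial weights, and training pattern are made up.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(w, b, x, target, lr=0.5):
    """One error-correction update for a single sigmoid unit,
    minimizing E = 0.5 * (y - target)^2 by gradient descent."""
    y = sigmoid(np.dot(w, x) + b)
    err = y - target                 # dE/dy
    delta = err * y * (1.0 - y)      # chain rule through the sigmoid
    return w - lr * delta * x, b - lr * delta

# Illustrative data: drive the unit's output toward a target of 1.0.
w, b = np.array([0.2, -0.4]), 0.0
x, t = np.array([1.0, 1.0]), 1.0
for _ in range(200):
    w, b = train_step(w, b, x, t)
print(sigmoid(np.dot(w, x) + b))  # approaches the target of 1.0
```

Backpropagation generalizes this delta computation through multiple layers by applying the chain rule layer by layer, from the output error back to the input weights.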
Unit 3: Unsupervised learning in ANNs. Kohonen, Adaptive Resonance Theory, and Principal Component Analysis networks. Applications of competitive learning, and Hebbian or reinforcement learning.
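The Hebbian learning mentioned in Unit 3 ("neurons that fire together wire together") can be sketched for a single linear unit. A minimal illustration with made-up values, assuming the plain (unnormalized) Hebbian rule:

```python
import numpy as np

def hebbian_update(w, x, lr=0.1):
    """Plain Hebbian rule: increase each weight in proportion to the
    product of its input and the unit's output (delta_w = lr * y * x)."""
    y = np.dot(w, x)      # linear unit output
    return w + lr * y * x

# Repeatedly present one pattern: only the active input's weight grows.
w = np.array([0.1, 0.1])
x = np.array([1.0, 0.0])
for _ in range(10):
    w = hebbian_update(w, x)
print(w)  # first weight has grown; the inactive input's weight is unchanged
```

Note that the plain rule lets weights grow without bound, which is why practical variants (e.g. Oja's rule, used in PCA networks) add a normalization term.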
Unit 4: Associative networks, memory modeling, cognitive modeling and deep neural networks.
Unit 5: Seminars on recent research.



Class time and location
Days & Times                    Room               Meeting Dates
Monday    10:30am - 12:00pm     Virtual Classroom  Jan 11 - Apr 9
Wednesday 10:00am - 11:30am     Virtual Classroom  Jan 11 - Apr 9


Course Details

Details about the course content can be found on the onQ website.


Grade Distribution

Grade Categories           % of Final Grade
Quizzes                    30%
Project and presentations

Text Book

Elements of Artificial Neural Networks

By Mehrotra, Mohan, and Ranka

2nd Edition (ebook available to order online)

Neural Networks and Deep Learning

By Charu C. Aggarwal

e-Textbook, Springer, 2018

Other References

Neural Networks and Deep Learning, by Michael Nielsen, free online e-book (reference study material)

Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, e-book (free)