CISC-874/3.0 (36L, 84P)

Neural and Cognitive Computing

Fall 2022 (Sep 6 - Dec 5)

Prerequisites: Linear Algebra and programming experience.

Token Type: Theory, Applications



Farhana H. Zulkernine, PhD, PEng

Associate Professor

Coordinator, Cognitive Science Program

Director, Bigdata Analytics and Management (BAM) Laboratory
637 Goodwin Hall, School of Computing, Queen's University
Kingston, Ontario, Canada K7L 2N8

E-mail: farhana dot zulkernine at  queensu dot ca

Tel: (613)533-6426


Course Description

Theoretical foundations and practical applications of Artificial Neural Networks (ANN) and Cognitive Computing (CC) models. Paradigms of neural computing algorithms using attention and context-embedding models, with applications in cognitive modeling, artificial intelligence, and machine learning based on multi-stream data processing techniques.


Prerequisites: Knowledge of relational algebra

Preferred: Knowledge of Cognitive Science

Time Commitment

Students are expected to spend 120 hours per term in lecture and practice.

Learning Outcomes

Course Learning Outcomes:

  1. Explain foundational concepts, including the operation of biological neurons and learning in artificial intelligence (AI) as influenced by cognitive modeling theories such as perceptual bias, memory, attention, context embedding, and belief.
  2. Apply theoretical knowledge in developing computational models for cognitive modeling, language understanding, decision support, behavior analysis, question answering, image processing, and action recognition.
  3. Explore and critically analyze recent research on cognitive modeling to explain human cognition and memory using data from social networks, application of deep neural models in computer vision, language understanding in intelligent chatbots, and multi-sensor stream data processing for predictive analytics and decision support.
  4. Explain the power and limitations of neural cognitive systems.


Unit 1: Biological neurons and the evolution of Artificial Neural Network (ANN) models. Example ANN models for human and machine cognition. General architecture and concepts behind ANN, design challenges, activation functions, learning algorithms, optimization, and application in machine learning. Special focus on computer vision, IoT, text and voice analytics.
Unit 2: Supervised learning in ANNs. Learning sequence, spatial feature extraction for decision making, and combining multiple data modalities.  
Radial Basis Function networks, Adaptive networks, and advanced optimization using momentum and regularization. Application of genetic algorithms. Class presentation on recent work.
Unit 3: Application of competitive learning to implement unsupervised learning in ANNs. Kohonen, Adaptive Resonance Theory, Principal Component Analysis networks. Semi-supervised network models. Class presentation on recent work.
Unit 4: Application of Hebbian or reinforcement learning, associative networks, memory modeling, memory decay, cognitive modeling and deep neural networks for modeling attention.  Class presentation on recent work.
Unit 5: Project presentations.
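As a preview of the Unit 1 material on ANN architecture and activation functions, a single artificial neuron can be sketched in a few lines of Python. The inputs, weights, and bias below are illustrative values chosen for the example, not course material:

```python
import numpy as np

def sigmoid(z):
    """Logistic activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    """A single artificial neuron: weighted sum of inputs plus a bias,
    passed through a nonlinear activation function."""
    return sigmoid(np.dot(w, x) + b)

# Illustrative values only: two inputs, hand-picked weights and bias.
x = np.array([1.0, 0.5])
w = np.array([0.4, -0.6])
b = 0.1
y = neuron(x, w, b)  # output lies strictly between 0 and 1
```

Learning algorithms, covered later in the unit, amount to adjusting `w` and `b` so that outputs like `y` match training targets.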
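The momentum-based optimization named in Unit 2 can be illustrated on a toy one-dimensional quadratic. The learning rate, decay factor, and objective below are illustrative choices, not values prescribed by the course:

```python
def sgd_momentum_step(w, grad, velocity, lr=0.1, beta=0.9):
    """One gradient-descent step with momentum: the velocity is an
    exponentially decaying sum of past gradients, which smooths and
    accelerates the descent direction."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Toy objective f(w) = w^2, whose gradient is 2w; start away from the minimum.
w, v = 5.0, 0.0
for _ in range(100):
    w, v = sgd_momentum_step(w, grad=2.0 * w, velocity=v)
# w spirals in toward the minimizer at 0
```

With `beta = 0`, this reduces to plain gradient descent; the momentum term is what lets the iterate coast through flat regions and damp oscillations.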
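The competitive (winner-take-all) learning of Unit 3, which underlies Kohonen networks, can be sketched as follows. The two-cluster data and the initial weight vectors are invented for illustration:

```python
import numpy as np

def competitive_step(weights, x, lr=0.1):
    """Winner-take-all update: only the unit whose weight vector is
    closest to the input moves, and it moves toward that input."""
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    weights[winner] += lr * (x - weights[winner])
    return winner

rng = np.random.default_rng(0)
# Invented data: two well-separated 2-D clusters around (+2,+2) and (-2,-2).
data = np.vstack([rng.normal(2.0, 0.1, (50, 2)),
                  rng.normal(-2.0, 0.1, (50, 2))])
rng.shuffle(data)

# Two competing units, initialized on either side of the origin.
weights = np.array([[0.5, 0.5], [-0.5, -0.5]])
for x in data:
    competitive_step(weights, x)
# each unit ends up near the centre of one cluster
```

No labels are used anywhere: the units discover the cluster structure purely from the geometry of the inputs, which is what makes this an unsupervised method.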
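The Hebbian learning of Unit 4 ("neurons that fire together wire together") can be written as a one-line weight update. The constant post-synaptic drive below is a toy stand-in for activity arriving from the rest of a network:

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.01):
    """Hebb's rule: each weight grows in proportion to the product of
    its pre-synaptic input x and the post-synaptic output y."""
    return w + lr * y * x

w = np.zeros(3)
x = np.array([1.0, 0.0, -1.0])  # a repeated input pattern (illustrative)
for _ in range(10):
    y = np.dot(w, x) + 1.0  # constant drive stands in for external activity
    w = hebbian_update(w, x, y)
# weights align with the repeated pattern: positive where x is positive,
# zero where x is zero, negative where x is negative
```

This correlation-driven growth is also why plain Hebbian learning needs a decay or normalization term in practice, a point that connects to the unit's memory-decay topic.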



Class time and location
Days & Times               Room              Meeting Dates

Wednesday 4:30pm - 6:00pm  Kingston RM 112   Sep 6 - Dec 5

Thursday 4:00pm - 5:30pm   Goodwin 247       Sep 6 - Dec 5

Course Details

Details about the course content can be found at the OnQ website.

Grade Distribution (Weekly Syllabus)


Grade Categories                                          % of Final Grade

Coding assignments

Written in-class quizzes (2)                              30%

Deep Learning Group Project (includes literature review,
code implementation, reports, talk and demo)

Text Book

Elements of Artificial Neural Networks

By Mehrotra, Mohan, and Ranka

      2nd Edition (ebook available to order online)

Neural Networks and Deep Learning

By Charu C. Aggarwal

e-Textbook, 2018, Springer

Other References

Neural Networks and Deep Learning, by Michael Nielsen, free online e-book (reference study material)

Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, e-book (free)