
COMP9067 - Deep Learning

Title: Deep Learning
Long Title: Deep Learning
Module Code: COMP9067
 
Credits: 5
NFQ Level: Expert
Field of Study: Computer Science
Valid From: Semester 1 - 2018/19 (September 2018)
Module Delivered in no programmes
Module Coordinator: TIM HORGAN
Module Author: Ted Scully
Module Description: Deep learning techniques, a subfield of machine learning, have led to significant advances in challenging real-world problems such as natural language processing and image recognition. This module focuses on equipping students with both the theoretical and practical skills that will enable them to build and apply deep learning models to real-world problems.
Learning Outcomes
On successful completion of this module the learner will be able to:
LO1 Implement and evaluate a gradient descent-based machine learning algorithm.
LO2 Build, train and apply deep neural networks to problems such as computer vision.
LO3 Perform hyperparameter optimization, regularization and optimization for deep learning networks.
LO4 Create convolutional neural network models and apply to image datasets.
LO5 Build and train Recurrent Neural Networks (RNNs).
Pre-requisite learning
Module Recommendations
This is prior learning (or a practical skill) that is strongly recommended before enrolment in this module. You may enrol in this module if you have not acquired the recommended learning, but you will have considerable difficulty in passing (i.e. achieving the learning outcomes of) the module. While the prior learning is expressed as named CIT module(s), it also allows for learning (in another module or modules) which is equivalent to the learning specified in the named module(s).
No recommendations listed
Incompatible Modules
These are modules which have learning outcomes that are too similar to the learning outcomes of this module. You may not earn additional credit for the same learning and therefore you may not enrol in this module if you have successfully completed any modules in the incompatible list.
No incompatible modules listed
Co-requisite Modules
No Co-requisite modules listed
Requirements
This is prior learning (or a practical skill) that is mandatory before enrolment in this module is allowed. You may not enrol on this module if you have not acquired the learning specified in this section.
No requirements listed
Co-requisites
No Co Requisites listed
 

Module Content & Assessment

Indicative Content
Regression and Gradient Descent.
Introduction to linear regression and gradient descent. Multiple linear regression and metrics for evaluating regression models. Logistic regression and activation functions. Using a vectorized implementation.
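As a simple illustration of this content, the following is a minimal sketch of vectorized, gradient-descent-based linear regression. The use of NumPy and all parameter values are assumptions made for illustration only; the module does not prescribe a particular library.

import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=1000):
    """Batch gradient descent for multiple linear regression (vectorized update)."""
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])      # prepend a bias column
    w = np.zeros(n + 1)                       # parameters: bias + one weight per feature
    for _ in range(epochs):
        preds = Xb @ w                        # vectorized predictions for all examples
        grad = (Xb.T @ (preds - y)) / m       # gradient of the mean squared error
        w -= lr * grad                        # gradient descent update
    return w

# Example usage on synthetic data (values chosen only to check convergence)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=100)
print(gradient_descent(X, y))                 # approximately [3.0, 2.0, -1.5]

The vectorized update computes the gradient for the whole training set in one matrix operation, which is the implementation style referred to above.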
Build and evaluate deep neural networks.
Build and train shallow neural networks. Forward and backward propagation. Key parameters for neural networks. Create and train a fully connected deep learning model. Initialization, L2 and dropout regularization, gradient checking and batch normalization. Convergence algorithms. Best practice for evaluating performance and analysing bias and variance.
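As a simple illustration of this content, a minimal sketch of a fully connected network combining L2 regularization, dropout and batch normalization follows. TensorFlow/Keras is assumed here only because TensorFlow appears under Module Resources; the input shape, layer sizes and class count are illustrative, not prescribed.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),                        # e.g. flattened 28x28 images (assumed)
    tf.keras.layers.Dense(256, activation="relu",
                          kernel_regularizer=tf.keras.regularizers.l2(1e-4)),  # L2 weight penalty
    tf.keras.layers.BatchNormalization(),                       # batch normalization layer
    tf.keras.layers.Dropout(0.5),                               # dropout regularization
    tf.keras.layers.Dense(128, activation="relu",
                          kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),            # 10-class output (assumed)
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()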
Convolutional neural network.
Overview of convolutional neural networks. Methodology for stacking layers in a deep network to address multi-class image classification problems. Object detection and the YOLO algorithm. Deep residual learning for image recognition.
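As a simple illustration of stacking layers for multi-class image classification, a minimal convolutional network sketch follows. TensorFlow/Keras is assumed, and the input shape and number of classes are illustrative only; object detection (YOLO) and residual networks are not shown here.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),                   # small RGB images (assumed size)
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", padding="same"),  # convolutional layer
    tf.keras.layers.MaxPooling2D((2, 2)),                       # spatial down-sampling
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),                                  # flatten feature maps for dense layers
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),            # 10 image classes (assumed)
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()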
Recurrent Neural Networks (RNNs).
The basic recurrent unit (Elman unit) and LSTM (long short-term memory) unit. Overview of the GRU (gated recurrent unit). Build and train recurrent neural networks. Approaches for mitigating the vanishing gradient problem.
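As a simple illustration of this content, a minimal sketch of a recurrent model using an LSTM unit for sequence classification follows. TensorFlow/Keras is assumed, and the vocabulary size, sequence length and binary output are illustrative only. Gated units such as the LSTM and GRU are the standard mitigation for the vanishing gradient problem of the basic (Elman) recurrent unit.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(100,)),                        # sequences of 100 token ids (assumed)
    tf.keras.layers.Embedding(input_dim=10000, output_dim=64),  # vocabulary of 10,000 tokens (assumed)
    tf.keras.layers.LSTM(64),                                   # long short-term memory recurrent layer
    tf.keras.layers.Dense(1, activation="sigmoid"),             # one binary label per sequence (assumed)
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()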
Assessment Breakdown | %
Course Work | 100.00%
Course Work
Assessment Type | Assessment Description | Outcome addressed | % of total | Assessment Date
Project | Perform a comparative analysis between a basic gradient descent-based machine learning model and a deep neural network applied to a dataset from a specific application domain. | 1,2,3 | 50.0 | Week 7
Project | Build and train a convolutional or recurrent neural network and apply it to a dataset from a specific application domain. A comprehensive evaluation should be completed. | 4,5 | 50.0 | Week 13
No End of Module Formal Examination
Reassessment Requirement
Coursework Only
This module is reassessed solely on the basis of re-submitted coursework. There is no repeat written examination.

The institute reserves the right to alter the nature and timings of assessment.

 

Module Workload

Workload: Full Time
Workload Type | Workload Description | Hours | Frequency | Average Weekly Learner Workload
Lecture | Delivers the concepts and theories underpinning the learning outcomes. | 2.0 | Every Week | 2.00
Lab | Application of learning to case studies and project work. | 2.0 | Every Week | 2.00
Independent Learning | Student undertakes independent study: reads recommended papers and practices implementation. | 3.0 | Every Week | 3.00
Total Hours | 7.00
Total Weekly Learner Workload | 7.00
Total Weekly Contact Hours | 4.00
Workload: Part Time
Workload Type | Workload Description | Hours | Frequency | Average Weekly Learner Workload
Lecture | Delivers the concepts and theories underpinning the learning outcomes. | 2.0 | Every Week | 2.00
Lab | Application of learning to case studies and project work. | 2.0 | Every Week | 2.00
Independent Learning | Student undertakes independent study: reads recommended papers and practices implementation. | 3.0 | Every Week | 3.00
Total Hours | 7.00
Total Weekly Learner Workload | 7.00
Total Weekly Contact Hours | 4.00
 

Module Resources

Recommended Book Resources
  • I. Goodfellow, Y. Bengio, A. Courville 2017, Deep Learning (Adaptive Computation and Machine Learning series), 1st Ed., MIT Press [ISBN: 9780262035613]
Supplementary Book Resources
  • T. Laville 2017, Deep Learning for Beginners: Concepts, Techniques and Tools, 1st Ed., CreateSpace Independent Publishing [ISBN: 9781979311182]
  • F. Chollet 2017, Deep Learning with Python, 1st Ed., Manning Publications [ISBN: 9781617294433]
Recommended Article/Paper Resources
  • S. Ioffe, C. Szegedy 2015, Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, International Conference on Machine Learning
  • K. He, X. Zhang, S. Ren, J. Sun 2016, Deep Residual Learning for Image Recognition, IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Other Resources
  • Website: TensorFlow, https://www.tensorflow.org/
  • Website: Theano, http://deeplearning.net/software/theano/
  • Website: Caffe, http://caffe.berkeleyvision.org/
 

Cork Institute of Technology
Rossa Avenue, Bishopstown, Cork

Tel: 021-4326100     Fax: 021-4545343
Email: help@cit.edu.ie