PTCCS355 Syllabus - Neural Networks And Deep Learning - 2023 Regulation Anna University

PTCCS355 NEURAL NETWORKS AND DEEP LEARNING

L T P C
2 0 2 3

COURSE OBJECTIVES:
• To understand the basics of deep neural networks.
• To understand the basics of associative memory and unsupervised learning networks.
• To apply CNN architectures of deep neural networks.
• To analyze the key computations underlying deep learning, and use them to build and train deep neural networks for various tasks.
• To apply autoencoders and generative models for suitable applications.

UNIT I INTRODUCTION 6

Neural Networks-Application Scope of Neural Networks-Artificial Neural Network: An Introduction- Evolution of Neural Networks-Basic Models of Artificial Neural Network- Important Terminologies of ANNs-Supervised Learning Network.
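As a concrete illustration of the supervised learning networks introduced in this unit, the classic perceptron learning rule can be sketched in a few lines of plain Python. All function names and the training data below are invented for illustration and are not part of the syllabus:

```python
# Illustrative sketch of a supervised learning network from Unit I:
# a single perceptron trained with the perceptron learning rule.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a perceptron on (inputs, target) pairs with targets in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n          # weights
    b = 0.0                # bias
    for _ in range(epochs):
        for x, t in samples:
            # Step activation: fire if the weighted sum exceeds 0
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y
            # Perceptron rule: nudge weights toward the target
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Learn the AND function (linearly separable, so the rule converges)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating weight vector.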

UNIT II ASSOCIATIVE MEMORY AND UNSUPERVISED LEARNING NETWORKS 6

Training Algorithms for Pattern Association-Autoassociative Memory Network-Heteroassociative Memory Network-Bidirectional Associative Memory (BAM)-Hopfield Networks-Iterative Autoassociative Memory Networks-Temporal Associative Memory Network-Fixed Weight Competitive Nets-Kohonen Self-Organizing Feature Maps-Learning Vector Quantization-Counterpropagation Networks-Adaptive Resonance Theory Network.
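The Hopfield network listed above admits a minimal sketch: Hebbian outer-product weights store a bipolar pattern, and repeated sign updates recall it from a corrupted cue. This is a framework-free illustration; the names and patterns are invented for the example:

```python
# Discrete Hopfield network: store one bipolar pattern with Hebbian
# weights, then recall it from a cue with one flipped bit.

def hopfield_weights(patterns):
    """Hebbian outer-product weights, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=5):
    """Synchronous updates: each unit takes the sign of its net input."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

stored = [1, -1, 1, -1, 1, -1]
w = hopfield_weights([stored])
noisy = [1, -1, 1, -1, -1, -1]   # one bit flipped
```

With a single stored pattern, one update step already drives the noisy cue back to the stored attractor.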

UNIT III THIRD-GENERATION NEURAL NETWORKS 6

Spiking Neural Networks-Convolutional Neural Networks-Deep Learning Neural Networks-Extreme Learning Machine Model-Convolutional Neural Networks: The Convolution Operation – Motivation – Pooling – Variants of the basic Convolution Function – Structured Outputs – Data Types – Efficient Convolution Algorithms – Neuroscientific Basis – Applications: Computer Vision, Image Generation, Image Compression.
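The convolution operation and pooling named in this unit can be sketched without any framework. The plain-Python illustration below computes a "valid" 2-D convolution (strictly, cross-correlation, which is what deep learning libraries implement) followed by 2x2 max pooling; the image and kernel values are made up:

```python
# "Valid" 2-D cross-correlation followed by 2x2 max pooling.

def conv2d_valid(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    # Slide the kernel over every valid position and sum the products
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(ow)]
            for i in range(oh)]

def max_pool2x2(fmap):
    # Keep the maximum of each non-overlapping 2x2 window
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

image = [[1, 2, 0, 1],
         [3, 1, 1, 0],
         [0, 2, 2, 3],
         [1, 0, 1, 2]]
edge = [[1, 0], [0, -1]]           # tiny diagonal-difference kernel
fmap = conv2d_valid(image, edge)   # 3x3 feature map
pooled = max_pool2x2(fmap)
```

Pooling shrinks the feature map while keeping the strongest responses, which is the motivation for pooling given in the unit.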

UNIT IV DEEP FEEDFORWARD NETWORKS 6

History of Deep Learning - A Probabilistic Theory of Deep Learning - Gradient Learning - Chain Rule and Backpropagation - Regularization: Dataset Augmentation - Noise Robustness - Early Stopping, Bagging and Dropout - Batch Normalization - VC Dimension and Neural Nets.
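The chain rule and backpropagation topics above admit a compact worked example: for a tiny two-weight network, the analytic gradients obtained by the chain rule can be checked against central finite differences. This is a plain-Python sketch with invented values, not a prescribed exercise:

```python
import math

# One-input, one-hidden-unit network y = sigmoid(w2 * sigmoid(w1 * x))
# with squared loss; backpropagation is just the chain rule applied
# layer by layer, verified here against numerical gradients.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w1, w2, x, t):
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    return 0.5 * (y - t) ** 2

def grads(w1, w2, x, t):
    # Forward pass
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    # Backward pass: chain rule, output layer first
    dy = (y - t) * y * (1 - y)        # dL/d(output pre-activation)
    dw2 = dy * h
    dh = dy * w2
    dw1 = dh * h * (1 - h) * x        # propagate through hidden unit
    return dw1, dw2

w1, w2, x, t = 0.5, -0.3, 1.2, 1.0
g1, g2 = grads(w1, w2, x, t)

# Central finite differences as an independent check
eps = 1e-6
n1 = (loss(w1 + eps, w2, x, t) - loss(w1 - eps, w2, x, t)) / (2 * eps)
n2 = (loss(w1, w2 + eps, x, t) - loss(w1, w2 - eps, x, t)) / (2 * eps)
```

Gradient checking of this kind is the standard way to validate a hand-written backpropagation pass.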

UNIT V RECURRENT NEURAL NETWORKS 6

Recurrent Neural Networks: Introduction – Recursive Neural Networks – Bidirectional RNNs – Deep Recurrent Networks – Applications: Image Generation, Image Compression, Natural Language Processing. Complete Autoencoder, Regularized Autoencoder, Stochastic Encoders and Decoders, Contractive Encoders.
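The recurrence that defines the networks in this unit can be sketched with a scalar Elman-style cell, h_t = tanh(wx*x_t + wh*h_{t-1} + b), where the hidden state carries information forward across time steps. The weights below are arbitrary illustrative values:

```python
import math

# Minimal Elman-style recurrent cell with scalar weights: each hidden
# state mixes the current input with the previous hidden state.

def rnn_forward(xs, wx=0.8, wh=0.5, b=0.1):
    h = 0.0
    states = []
    for x in xs:
        h = math.tanh(wx * x + wh * h + b)   # recurrence: h depends on past h
        states.append(h)
    return states

states = rnn_forward([1.0, 0.0, -1.0])
```

The same weights are reused at every step; that weight sharing across time is what distinguishes an RNN from the feedforward networks of Unit IV.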

30 PERIODS

LAB EXPERIMENTS: 30 PERIODS
1. Implement simple vector addition in TensorFlow.
2. Implement a regression model in Keras.
3. Implement a perceptron in the TensorFlow/Keras environment.
4. Implement a Feed-Forward Network in TensorFlow/Keras.
5. Implement an Image Classifier using CNN in TensorFlow/Keras.
6. Improve the deep learning model by fine-tuning hyperparameters.
7. Implement the Transfer Learning concept in Image Classification.
8. Use a pre-trained model in Keras for Transfer Learning.
9. Perform Sentiment Analysis using an RNN.
10. Implement an LSTM-based Autoencoder in TensorFlow/Keras.
11. Generate images using a GAN.

Additional Experiments:
12. Train a deep learning model to classify a given image using a pre-trained model.
13. Build a recommendation system from sales data using Deep Learning.
14. Implement Object Detection using CNN.
15. Implement a simple Reinforcement Learning algorithm for an NLP problem.
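The experiments build up from tensor operations to trained models in TensorFlow/Keras. As a framework-free reference for the idea behind Experiment 2, the sketch below fits y = w*x + b by batch gradient descent on mean squared error, which is what `model.fit` automates in Keras. The data and hyperparameters are invented for illustration:

```python
# Linear regression by batch gradient descent on mean squared error,
# the hand-rolled counterpart of a one-unit Keras regression model.

def fit_linear(xs, ys, lr=0.05, epochs=500):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = mean((w*x + b - y)^2), via the chain rule
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated from y = 2x + 1
w, b = fit_linear(xs, ys)
```

In the lab itself, the loop above is replaced by a `Dense(1)` layer compiled with an MSE loss and an optimizer, but the underlying computation is the same.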

TOTAL: 60 PERIODS

COURSE OUTCOMES: At the end of this course, the students will be able to:
CO1: Apply Convolutional Neural Networks for image processing.
CO2: Understand the basics of associative memory and unsupervised learning networks.
CO3: Apply CNN and its variants for suitable applications.
CO4: Analyze the key computations underlying deep learning and use them to build and train deep neural networks for various tasks.
CO5: Apply autoencoders and generative models for suitable applications.

TEXT BOOKS:
1. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
2. Francois Chollet, “Deep Learning with Python”, Second Edition, Manning Publications, 2021.

REFERENCES:
1. Aurélien Géron, “Hands-On Machine Learning with Scikit-Learn and TensorFlow”, O’Reilly, 2018.
2. Josh Patterson, Adam Gibson, “Deep Learning: A Practitioner’s Approach”, O’Reilly Media, 2017.
3. Charu C. Aggarwal, “Neural Networks and Deep Learning: A Textbook”, Springer International Publishing, 1st Edition, 2018.
4. Jojo Moolayil, “Learn Keras for Deep Neural Networks”, Apress, 2018.
5. Vinita Silaparasetty, “Deep Learning Projects Using TensorFlow 2”, Apress, 2020.
6. François Chollet, “Deep Learning with Python”, Manning Shelter Island, 2017.
7. S. Rajasekaran, G. A. Vijayalakshmi Pai, “Neural Networks, Fuzzy Logic and Genetic Algorithms: Synthesis and Applications”, PHI Learning, 2017.
8. Santanu Pattanayak, “Pro Deep Learning with TensorFlow”, Apress, 2017.
9. James A. Freeman, David M. Skapura, “Neural Networks: Algorithms, Applications, and Programming Techniques”, Addison-Wesley, 2003.
