Module 2: Train

Module Overview

This module covers how neural networks are trained with the gradient descent and backpropagation algorithms. You'll learn the fundamental principles behind how neural networks learn from data, including how the optimization process adjusts weights and biases to minimize a loss function. The module also explores critical hyperparameters, batch size and learning rate, that strongly influence model performance and convergence.
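The core update the overview describes can be shown in a few lines. This is a minimal sketch (plain NumPy, a made-up one-parameter quadratic loss, not course code): gradient descent repeatedly steps the weight against the gradient, scaled by the learning rate.

```python
import numpy as np

# Minimal gradient descent on the toy loss L(w) = (w - 3)^2,
# whose minimum is at w = 3. The gradient dL/dw = 2 * (w - 3)
# points uphill, so stepping against it decreases the loss.
def gradient_descent(w0, learning_rate, steps):
    w = w0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)     # analytic gradient of the loss
        w -= learning_rate * grad  # update rule: w <- w - lr * dL/dw
    return w

w_final = gradient_descent(w0=0.0, learning_rate=0.1, steps=100)
print(w_final)  # converges close to 3.0
```

The same loop with a learning rate above 1.0 would overshoot and diverge on this loss, which previews why the learning rate is treated as a critical hyperparameter later in the module.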

Learning Objectives

  • Explain the intuition behind backpropagation and gradient descent
  • Understand the role and importance of batch size
  • Understand the role and importance of learning rate
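To ground the first objective, here is a hedged sketch of backpropagation on a tiny hand-rolled network (my own example, not from the lecture notebook): the chain rule carries the loss gradient backward through each operation, and a finite-difference check confirms the analytic gradient.

```python
import numpy as np

# Tiny network: y_hat = w2 * tanh(w1 * x), squared-error loss.
# The backward pass applies the chain rule one operation at a time.
def forward_backward(w1, w2, x, y):
    h = np.tanh(w1 * x)              # hidden activation
    y_hat = w2 * h                   # prediction
    loss = 0.5 * (y_hat - y) ** 2
    # Backward pass (chain rule), from the loss toward the inputs:
    dL_dyhat = y_hat - y
    dL_dw2 = dL_dyhat * h
    dL_dh = dL_dyhat * w2
    dL_dw1 = dL_dh * (1.0 - h ** 2) * x  # tanh'(z) = 1 - tanh(z)^2
    return loss, dL_dw1, dL_dw2

# Sanity-check the backpropagated gradient with a finite difference.
w1, w2, x, y = 0.5, -1.2, 0.8, 1.0
loss, g1, g2 = forward_backward(w1, w2, x, y)
eps = 1e-6
num_g1 = (forward_backward(w1 + eps, w2, x, y)[0] -
          forward_backward(w1 - eps, w2, x, y)[0]) / (2 * eps)
print(abs(g1 - num_g1))  # tiny: analytic and numeric gradients agree
```

Frameworks like TensorFlow automate exactly this bookkeeping; writing it out once for a two-weight network is a useful way to build the intuition the objective asks for.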

Guided Project

Open DS_422_Train_Lecture.ipynb in the GitHub repository to follow along with the guided project.

Module Assignment

Continue building a sketch classification model using the Quickdraw dataset. Compare normalized vs. non-normalized input data, experiment with different batch sizes and learning rates, and use TensorBoard to analyze how various optimizers affect model convergence.
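Before running the Keras experiments, the normalization comparison can be previewed in isolation. This is a standalone NumPy illustration (a toy one-weight regression, not the Quickdraw model): because gradients scale with the square of the feature values, a learning rate that converges on standardized data can diverge on the raw scale.

```python
import numpy as np

# Fit y = w * x by gradient descent on mean squared error.
def fit(x, y, learning_rate, steps):
    w = 0.0
    n = len(x)
    for _ in range(steps):
        grad = (2.0 / n) * np.sum((w * x - y) * x)  # dMSE/dw
        w -= learning_rate * grad
    return w

rng = np.random.default_rng(0)
x_raw = rng.uniform(100.0, 200.0, size=64)        # un-normalized feature
y_raw = 2.0 * x_raw                               # true weight is 2
x_norm = (x_raw - x_raw.mean()) / x_raw.std()     # zero-mean, unit-variance

# Same learning rate on both scales:
w_norm = fit(x_norm, 2.0 * x_norm, learning_rate=0.1, steps=200)
w_raw = fit(x_raw, y_raw, learning_rate=0.1, steps=200)
print(w_norm)                 # close to the true weight, 2.0
print(np.isfinite(w_raw))     # raw-scale run blows up
```

The same effect shows up in the assignment's loss curves: normalized pixel values let you use larger learning rates and bigger batch sizes without the loss oscillating or diverging, which is easy to confirm side by side in TensorBoard.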

Assignment Solution Video