Module 3: Permutation and Boosting
Module Overview
In this module, you'll learn about ensemble methods, focusing on bagging and boosting. You'll see how gradient boosting models work and how to interpret feature importances using both the default (impurity-based) method and permutation importance. These tools will help you build more powerful models and gain deeper insight into what drives your predictions.
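To make the bagging/boosting distinction concrete, here is a minimal sketch (not the guided project code) that fits a bagging ensemble (random forest) and a boosting ensemble (gradient boosting) side by side with scikit-learn; the built-in breast cancer dataset and the hyperparameters are illustrative assumptions only.

```python
# Hedged sketch: bagging vs. boosting on an example dataset.
# Dataset choice and hyperparameters are placeholders, not the module's.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Bagging: many deep trees trained independently on bootstrap samples,
# then averaged, which mainly reduces variance.
bagged = RandomForestClassifier(n_estimators=200, random_state=42)
bagged.fit(X_train, y_train)

# Boosting: shallow trees trained sequentially, each correcting the
# errors of the ensemble so far, which mainly reduces bias.
boosted = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.1, max_depth=3, random_state=42
)
boosted.fit(X_train, y_train)

print("Bagging  (random forest)    :", bagged.score(X_val, y_val))
print("Boosting (gradient boosting):", boosted.score(X_val, y_val))
```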
Learning Objectives
- Distinguish between bagging and boosting ensembles
- Build and evaluate a gradient boosting model
- Interpret feature importances, both default and permutation (illustrated in the sketch after this list)
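The sketch below contrasts the two kinds of importances named in the objectives: a gradient boosting model's default `feature_importances_` (how much each feature reduced impurity during training) versus scikit-learn's `permutation_importance` (the drop in held-out score when a feature's values are shuffled). It is an illustrative example with an assumed dataset, not the guided project code.

```python
# Hedged sketch: default (impurity-based) vs. permutation importances.
import pandas as pd
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42).fit(X_train, y_train)

# Default importances: computed from the training process itself, so they
# can overstate features the model overfit to.
default_imp = pd.Series(model.feature_importances_, index=X.columns)

# Permutation importances: measured on held-out data by shuffling one
# feature at a time and recording the drop in validation accuracy.
perm = permutation_importance(
    model, X_val, y_val, n_repeats=10, random_state=42
)
perm_imp = pd.Series(perm.importances_mean, index=X.columns)

print(default_imp.sort_values(ascending=False).head())
print(perm_imp.sort_values(ascending=False).head())
```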
Guided Project
Open DS_233_guided_project.ipynb in the GitHub repository below to follow along with the guided project:
Guided Project Video - Part One
Guided Project Video - Part Two
Module Assignment
For this assignment, you'll continue working with your portfolio dataset from previous modules. You'll apply what you've learned in this module to build and evaluate a gradient boosting model and to interpret its feature importances for your specific problem.
Note: There is no video for this assignment as you will be working with your own dataset and defining your own machine learning problem.