Decision trees and random forests
Decision-tree-based classifiers are very popular in machine learning and data science applications. The most popular methods are random forests and gradient boosted classifiers, both of which are ensembles of decision trees. In this module, we will cover the basic decision tree algorithm and then turn to random forests and gradient boosting.
One reason decision trees are popular is that they are inherently human-readable, at least as a single tree; a whole forest is harder to interpret. A single decision tree is really a flow chart, a type of diagram that humans have been drawing and reading for many years! The difference is that the splits in the chart are all chosen automatically by a learning algorithm rather than by hand.
Topic 1: decision trees
For the first part of this module, we will focus on the basic decision tree algorithm, covering both classification trees and regression trees. For our reading, we will jump back to the remaining section in Chapter 19 that we skipped in the previous module.
- (30 min) Decision trees
- What types of trees exist? Why do we want to study trees?
- How do you grow a decision tree?
- Example of choosing the best attribute
- Complete the exercise on decision trees
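To make the "choosing the best attribute" step concrete, here is a minimal sketch of greedy attribute selection by information gain (entropy reduction). The toy weather-style dataset, the column layout, and all function names here are made up for illustration; a real tree learner would apply this choice recursively at each node.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the data on one attribute (column index)."""
    n = len(labels)
    remainder = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        remainder += (len(subset) / n) * entropy(subset)
    return entropy(labels) - remainder

# Toy data: columns are (outlook, windy); labels are "play tennis?" answers.
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"),
        ("rain", "yes"), ("overcast", "no"), ("overcast", "yes")]
labels = ["no", "no", "yes", "no", "yes", "yes"]

# Greedy step: score every attribute and split on the highest-gain one.
gains = {attr: information_gain(rows, labels, attr) for attr in range(2)}
best_attr = max(gains, key=gains.get)  # attribute 0 (outlook) wins here
```

Splitting on outlook yields two pure children (all-"no" sunny, all-"yes" overcast), which is why its gain dominates the windy attribute.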
Topic 2: ensemble methods
For the second topic in this module, we will look into ensemble methods. This will let us learn about random forests and gradient-boosted methods as well!
- (30 min) Reading
- Read Section 19.8 (Ensemble Methods)
- (30 min) Ensemble methods
    - Motivation for ensemble methods: brittleness and the bias-variance tradeoff
- Random Forests
- Boosting and Gradient Boosted Forests
- Complete the exercise on ensemble tree methods