CS 461: Machine Learning
Instructor: Kiri Wagstaff

Reading Questions for Lecture 2

Decision Trees (Ch. 9.1-9.4)
  1. Why do we say that decision tree learning is greedy?
  2. Why is entropy useful for calculating node purity?
  3. How do you calculate the "rule support" for an IF-THEN rule obtained from a decision tree?
  4. (Lewis) How is pre-pruning done?
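As a study aid for question 2, here is a minimal sketch (not from the reading; pure Python, names are my own) of computing the entropy of a node's class distribution — a pure node scores 0 bits, a maximally impure two-class node scores 1 bit:

```python
from math import log2

def entropy(counts):
    """Entropy (in bits) of a class-count distribution at a tree node.

    Lower entropy means a purer node; entropy is 0 when all
    instances belong to one class.
    """
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * log2(p) for p in probs)

print(entropy([10, 0]))  # pure node: 0.0 bits
print(entropy([5, 5]))   # 50/50 split: 1.0 bit
```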
Evaluation (Ch. 14.1-14.3)
  1. In K-fold cross-validation, how many folds will any two training sets share?
  2. What are the advantages and drawbacks of using a large value of K for K-fold cross-validation?
  3. (TK) What is stratification?
  4. (TK) What is leave-one-out cross-validation (LOOCV)?
  5. (TK) Does LOOCV permit you to do stratification?
  6. (Natalia) How can you get different classifiers from the same training data, and why would you want to do that?
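For question 1 above, a quick way to convince yourself of the answer is to build the folds by hand. This sketch (my own illustration, not from the textbook) partitions n examples into K folds and compares two training sets, each formed by holding out one fold:

```python
def kfold_indices(n, k):
    """Partition indices 0..n-1 into k contiguous folds (no shuffling)."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

folds = kfold_indices(10, 5)
# Training set i = everything except fold i, so any two training
# sets share the other k - 2 folds.
train0 = set(range(10)) - set(folds[0])
train1 = set(range(10)) - set(folds[1])
shared = train0 & train1
print(len(shared))  # indices drawn from the k - 2 = 3 shared folds
```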