AdaBoost Summary

  1. Initialize equal weights $\alpha_{i} = 1/N$ for all $N$ samples

  2. Repeat t = 1,…,T

    • learn $f_{t}(x)$ with data weights $\alpha_{i}$
    • compute weighted error: $\text{weighted\_error}(f_{t}) = \frac{\sum_{i} \alpha_{i}\,\mathbb{1}[f_{t}(x_{i}) \neq y_{i}]}{\sum_{i} \alpha_{i}}$
    • compute coefficient: $\hat{w}_{t} = \frac{1}{2}\ln\left(\frac{1 - \text{weighted\_error}(f_{t})}{\text{weighted\_error}(f_{t})}\right)$
      • $\hat{w}_{t}$ is higher when weighted_error is smaller, so more accurate classifiers get more say in the final vote
    • Recompute weights: $\alpha_{i} \leftarrow \alpha_{i}\,e^{-\hat{w}_{t}}$ if $f_{t}(x_{i}) = y_{i}$, else $\alpha_{i}\,e^{\hat{w}_{t}}$
    • Normalize weights: $\alpha_{i} \leftarrow \frac{\alpha_{i}}{\sum_{j} \alpha_{j}}$
      • if $x_{i}$ is often mistaken, weight $\alpha_{i}$ gets very large
      • if $x_{i}$ is often correct, weight $\alpha_{i}$ gets very small

  3. Final prediction: $\hat{y} = \text{sign}\left(\sum_{t=1}^{T} \hat{w}_{t} f_{t}(x)\right)$
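The loop above can be sketched in NumPy. This is a minimal illustration, not a reference implementation: it assumes labels in $\{-1, +1\}$ and uses a brute-force decision stump as the weak learner (the summary does not fix a weak learner, so the stump, the helper names `adaboost`/`predict`, and the error clipping are all choices made here).

```python
import numpy as np

def adaboost(X, y, T=10):
    """Sketch of the AdaBoost loop; y must be in {-1, +1}.
    Weak learner (an assumption): best threshold stump over all features."""
    n, d = X.shape
    alpha = np.full(n, 1.0 / n)                # step 1: equal weights
    stumps, w_hat = [], []
    for t in range(T):
        # learn f_t: the stump (feature j, threshold thr, sign) with
        # the smallest weighted error under the current weights alpha
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] > thr, 1, -1)
                    err = alpha[pred != y].sum() / alpha.sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = np.clip(err, 1e-10, 1 - 1e-10)   # avoid log(0) for perfect stumps
        w = 0.5 * np.log((1 - err) / err)      # coefficient: larger when err is smaller
        pred = sign * np.where(X[:, j] > thr, 1, -1)
        alpha = alpha * np.exp(-w * y * pred)  # e^{-w} if correct, e^{+w} if mistake
        alpha = alpha / alpha.sum()            # normalize
        stumps.append((j, thr, sign))
        w_hat.append(w)
    return stumps, np.array(w_hat)

def predict(stumps, w_hat, X):
    """Final prediction: sign of the weighted vote of all stumps."""
    score = sum(w * s * np.where(X[:, j] > thr, 1, -1)
                for (j, thr, s), w in zip(stumps, w_hat))
    return np.sign(score)
```

A usage sketch: `stumps, w_hat = adaboost(X, y, T=10)` then `predict(stumps, w_hat, X_new)`; in practice a library implementation such as scikit-learn's `AdaBoostClassifier` handles multi-class labels and arbitrary weak learners.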

