- Random forest model
A random forest model has two main parameters: tree depth and tree count.
My current thoughts:
Increasing depth will decrease bias and increase variance, since deeper trees fit the training data more closely.
Increasing tree count will decrease variance and leave bias roughly unchanged, since averaging more independent trees smooths out noise.
So basically you can use a small tree count (e.g. 100) to tune depth first: increase depth to get low bias (accepting some extra variance). Then increase the tree count to average that variance away.
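The two-stage recipe above can be sketched in scikit-learn. This is a minimal illustration, assuming a synthetic dataset and an arbitrary depth grid (none of these values come from the post):

```python
# Sketch of the two-stage recipe: tune depth with a small forest,
# then grow the forest at the chosen depth to reduce variance.
# Dataset and parameter grids are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Stage 1: fix a small tree count (100) and tune depth.
best_depth, best_score = None, -1.0
for depth in [2, 4, 8, 16]:
    rf = RandomForestClassifier(n_estimators=100, max_depth=depth,
                                random_state=0)
    score = cross_val_score(rf, X, y, cv=3).mean()
    if score > best_score:
        best_depth, best_score = depth, score

# Stage 2: keep the chosen depth and grow the forest,
# letting averaging smooth out the remaining variance.
final = RandomForestClassifier(n_estimators=500, max_depth=best_depth,
                               random_state=0).fit(X, y)
print(best_depth, round(best_score, 3))
```

The point of stage 1 is only to rank depths, so a modest forest keeps it cheap; the extra trees in stage 2 cannot hurt bias, they only tighten the average.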
- Gradient boosting tree
Similar to random forest, GBT mainly has three parameters: tree depth, iteration count (number of boosting rounds), and learning rate.
My current thoughts:
Increasing depth makes each boosting step a stronger learner, so training loss drops faster (fewer iterations to converge), but the larger steps may jump around when close to convergence; a smaller learning rate damps this.
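The depth effect can be seen by comparing training-loss curves. A small sketch, again on an assumed synthetic dataset with illustrative parameter values:

```python
# Illustrative comparison: deeper trees drive the training loss down
# in fewer boosting iterations. Dataset and settings are assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

shallow = GradientBoostingClassifier(max_depth=1, n_estimators=200,
                                     learning_rate=0.1,
                                     random_state=0).fit(X, y)
deep = GradientBoostingClassifier(max_depth=4, n_estimators=200,
                                  learning_rate=0.1,
                                  random_state=0).fit(X, y)

# train_score_[i] is the training loss after iteration i (lower is better);
# the deep model should reach a lower loss by iteration 20.
print(deep.train_score_[20] < shallow.train_score_[20])
```

Note that faster training loss is not the same as better generalization; in practice the depth, iteration count, and learning rate would be chosen by validation, not by training loss alone.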