HN Gopher Feed (2017-07-23) - page 1 of 10 ___________________________________________________________________
A Practical Guide to Tree-Based Learning Algorithms
74 points by sadanand4singh
https://sadanand-singh.github.io/posts/treebasedmodels/#.WXT8Kli...
___________________________________________________________________
thearn4 - 1 hour ago
As interesting as I find the current state of deep learning to be,
there is something about random forests that I can't help but find
much cooler. Probably the amazing out-of-the-box performance.
petters - 1 hour ago
Yes, they have very few knobs to turn, which is very attractive.
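The "very few knobs" point is easy to demonstrate. A minimal sketch, assuming scikit-learn (my choice of library, not one named in the thread): a random forest left entirely at its defaults already cross-validates well on a standard dataset.

```python
# Sketch of random forests' out-of-the-box performance using
# scikit-learn (an assumed library; others behave similarly).
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# No tuning at all: every hyperparameter is left at its default.
clf = RandomForestClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())  # typically well above 0.9 with zero tuning
```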
6502nerdface - 1 hour ago
Nice write-up, thanks for sharing. One possible typo I noticed:
> Maximum depth of tree (vertical depth) The maximum depth of trees.
> It is used to control over-fitting, higher values prevent a model
> from learning relations which might be highly specific to the
> particular sample.
Shouldn't it be lower values, i.e., shallower trees, that control
over-fitting?
sadanand4singh - 1 hour ago
Thanks for pointing that out. Yes, it should be lower values to
prevent over-fitting.
iamnafets - 1 hour ago
I've found Adele Cutler's presentation on random forests to be an
outstanding resource for building intuition about tree-based
algorithms: http://www.math.usu.edu/adele/RandomForests/UofU2013.pdf
Thinking about trees as a supervised recursive partitioning
algorithm or a clustering algorithm is useful for problems that may
not appear to be simple classification or regression problems.
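The recursive-partitioning view mentioned above can be sketched in a few lines of pure Python: repeatedly pick the threshold that best separates the labels, then recurse into each half. All names here are illustrative, not from any library.

```python
# Tiny sketch of a decision tree as supervised recursive partitioning
# on 1-D data: greedily choose the split threshold with the fewest
# misclassifications, then recurse. For intuition only.
def partition(points, labels, depth=0, max_depth=2):
    if depth == max_depth or len(set(labels)) <= 1:
        # Leaf: predict the majority label in this region.
        return max(set(labels), key=labels.count)
    best = None
    for t in sorted(set(points))[1:]:          # candidate thresholds
        left = [l for p, l in zip(points, labels) if p < t]
        right = [l for p, l in zip(points, labels) if p >= t]
        # Errors if each side predicts its own majority label.
        err = sum(l != max(set(left), key=left.count) for l in left) \
            + sum(l != max(set(right), key=right.count) for l in right)
        if best is None or err < best[0]:
            best = (err, t)
    _, t = best
    lp = [(p, l) for p, l in zip(points, labels) if p < t]
    rp = [(p, l) for p, l in zip(points, labels) if p >= t]
    return {"threshold": t,
            "left": partition([p for p, _ in lp], [l for _, l in lp],
                              depth + 1, max_depth),
            "right": partition([p for p, _ in rp], [l for _, l in rp],
                               depth + 1, max_depth)}

tree = partition([1.0, 2.0, 3.0, 10.0, 11.0, 12.0],
                 ["a", "a", "a", "b", "b", "b"])
print(tree)  # a single split at 10.0 separates the two groups exactly
```

Each recursive call carves the input space into smaller regions, which is also why the same machinery can be read as a clustering procedure.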
claytonjy - 29 minutes ago
On the topic of complementary resources, I really like Ben
Gorman's explanation:
https://gormanalysis.com/random-forest-from-top-to-bottom/
His related posts on single decision trees and GBMs are just as
good, too.