
Decision trees make predictions by learning a hierarchy of if-then rules from data — one of the most intuitive ML algorithms.
The algorithm learns which feature splits best separate the classes at each node, building a tree of decisions that ends in a predicted class or value.
Rarely used alone in practice — single trees overfit easily, so they usually serve as base learners in Random Forest or Gradient Boosting ensembles.
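To make the split-selection idea concrete, here is a minimal pure-Python sketch of a classification tree: at each node it searches every feature/threshold pair for the split with the lowest weighted Gini impurity, recurses until no split helps or a depth limit is hit, and predicts the majority class at the leaf. All names (`gini`, `best_split`, `build_tree`, `predict`) are illustrative, not from any particular library.

```python
from collections import Counter

def gini(labels):
    # Gini impurity: 1 - sum of squared class proportions (0 = pure node)
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y):
    # Try every feature/threshold; keep the split with lowest weighted impurity.
    best, best_score = None, gini(y)
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best_score:
                best, best_score = (f, t), score
    return best  # None if no split improves on the parent node

def build_tree(X, y, depth=0, max_depth=3):
    split = best_split(X, y) if depth < max_depth else None
    if split is None:
        return Counter(y).most_common(1)[0][0]  # leaf: majority class
    f, t = split
    left = [(row, yi) for row, yi in zip(X, y) if row[f] <= t]
    right = [(row, yi) for row, yi in zip(X, y) if row[f] > t]
    return (f, t,
            build_tree([r for r, _ in left], [yi for _, yi in left], depth + 1, max_depth),
            build_tree([r for r, _ in right], [yi for _, yi in right], depth + 1, max_depth))

def predict(node, row):
    # Follow if-then rules from the root until a leaf (a plain label) is reached.
    while isinstance(node, tuple):
        f, t, left, right = node
        node = left if row[f] <= t else right
    return node

# Toy dataset: one feature, two clearly separated classes.
X = [[1], [2], [3], [10], [11], [12]]
y = ["a", "a", "a", "b", "b", "b"]
tree = build_tree(X, y)
print(predict(tree, [2]), predict(tree, [11]))  # → a b
```

On this toy data the root node learns the single rule "feature 0 ≤ 3 → class a, else class b", which is exactly the if-then hierarchy described above, just one level deep.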
Reference:
TaskLoco™ — The Sticky Note GOAT