boosting combines many very simple (weak) models into one stronger model. One well-known algorithm is AdaBoost.
additional information
example
for instance, we can ensemble four decision trees together. Each tree's prediction gets a weight; we sum the weighted predictions and then apply a decision boundary to the sum (e.g. if the output is boolean, encode each tree's prediction as \(\pm 1\), multiply it by the tree's weight, add everything up, and take the sign).
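A minimal sketch of that weighted vote, with four decision stumps standing in for the trees (the thresholds and weights here are made up for illustration, not learned):

```python
import numpy as np

def weak_learner(threshold):
    # a "decision stump": predicts +1 if x > threshold, else -1
    return lambda x: np.where(x > threshold, 1, -1)

# four weak learners and their (assumed) weights
learners = [weak_learner(t) for t in (0.2, 0.4, 0.6, 0.8)]
weights = np.array([0.5, 1.0, 1.5, 2.0])

def ensemble_predict(x):
    votes = np.stack([h(x) for h in learners])  # shape (4, n), entries in {+1, -1}
    score = weights @ votes                     # weighted sum of the votes
    return np.sign(score)                       # decision boundary: take the sign

x = np.array([0.1, 0.5, 0.9])
print(ensemble_predict(x))  # → [-1. -1.  1.]
```

In a real boosting algorithm the weights would come from each learner's training error rather than being fixed by hand.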
high level ideas
- add more features (i.e. extract more features, increasing model complexity)
- add more weak learners together
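The second idea is what AdaBoost does: it adds weak learners one at a time, reweighting the training points so each new learner focuses on the points the previous ones got wrong. A from-scratch sketch with decision stumps on a made-up toy dataset (the data and round count are assumptions for illustration):

```python
import numpy as np

def fit_adaboost(X, y, rounds=3):
    n = len(X)
    w = np.full(n, 1 / n)      # start with uniform sample weights
    model = []                  # list of (threshold, polarity, alpha)
    for _ in range(rounds):
        # pick the stump (threshold + polarity) with lowest weighted error
        best = None
        for thr in X:
            for pol in (1, -1):
                pred = pol * np.where(X > thr, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, pol, pred)
        err, thr, pol, pred = best
        err = max(err, 1e-10)                    # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # learner weight: lower error -> bigger vote
        w *= np.exp(-alpha * y * pred)           # upweight misclassified points
        w /= w.sum()
        model.append((thr, pol, alpha))
    return model

def predict(model, X):
    # weighted vote of all stumps, then the sign decision boundary
    score = sum(a * p * np.where(X > t, 1, -1) for t, p, a in model)
    return np.sign(score)

X = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([1, 1, -1, -1, -1, 1])
model = fit_adaboost(X, y, rounds=3)
print(predict(model, X))  # → [ 1.  1. -1. -1. -1.  1.]
```

No single stump can fit this labeling, but three reweighted stumps combined do, which is the whole point of adding weak learners together.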
