What is a "multiple-tree, committee-of-expert method," or "bootstrap aggregation"?
Using multiple trees as a committee of experts is a relatively recent technique, known as bootstrap aggregation or "bagging," and was developed by Leo Breiman, one of CART's creators. Rather than growing a single tree, CART is directed to draw 50 or more random samples (with replacement) from the training data, grow a separate tree on each sample, and then let the trees "vote" on the best classification; this can reduce prediction error by as much as 50 percent. When appropriate, combining trees in this way can give CART a substantial performance edge over other data mining procedures. For more information, see Committee of Experts.
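The voting procedure described above can be sketched in a few lines. The following is a minimal illustration, not CART itself: each "tree" is simplified to a one-dimensional decision stump, and the `fit_stump`, `bagged_vote`, and toy dataset are illustrative names, not part of any CART implementation.

```python
import random
from collections import Counter

def fit_stump(data):
    """Fit a depth-1 'tree' (stump) on (x, label) pairs: find the
    threshold and polarity that minimize training error."""
    best = (float("inf"), 0.0, 1)
    for thresh in {x for x, _ in data}:
        for pol in (1, -1):
            err = sum((pol * (x - thresh) >= 0) != bool(y) for x, y in data)
            if err < best[0]:
                best = (err, thresh, pol)
    _, thresh, pol = best
    return lambda x: int(pol * (x - thresh) >= 0)

def bagged_vote(data, n_trees=51, seed=0):
    """Bootstrap aggregation: grow one stump per bootstrap sample
    (drawn with replacement), then classify by majority vote."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        sample = [rng.choice(data) for _ in data]  # resample with replacement
        stumps.append(fit_stump(sample))
    return lambda x: Counter(s(x) for s in stumps).most_common(1)[0][0]

# Toy training set: label is 1 when x >= 0.5.
data = [(i / 10, int(i >= 5)) for i in range(10)]
predict = bagged_vote(data)
```

Because each stump sees a different bootstrap sample, the individual classifiers disagree slightly, and the majority vote smooths out their individual errors, which is the source of the error reduction described above.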