TreeNet® Gradient Boosting is Salford Predictive Modeler's most flexible and powerful data mining tool, capable of consistently generating extremely accurate models. This level of accuracy is usually not attainable by single models or by ensembles such as bagging or conventional boosting, and the TreeNet engine demonstrates remarkable performance for both regression and classification. The algorithm typically builds thousands of small decision trees in a sequential error-correcting process that converges to an accurate model. The TreeNet modeling engine has been responsible for the majority of Minitab's modeling competition awards.
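The sequential error-correcting process described above can be illustrated with a minimal sketch of generic gradient boosting for squared-error loss, using depth-one trees (stumps). This is an illustration of the general technique only, not SPM's implementation; the function names and parameters (`fit_stump`, `boost`, `lr`) are hypothetical.

```python
import numpy as np

def fit_stump(x, residual):
    """Least-squares decision stump: best single split over all features."""
    best = None
    for j in range(x.shape[1]):
        for thr in np.unique(x[:, j])[:-1]:  # every split point except the max
            mask = x[:, j] <= thr
            left, right = residual[mask].mean(), residual[~mask].mean()
            err = np.mean((residual - np.where(mask, left, right)) ** 2)
            if best is None or err < best[0]:
                best = (err, j, thr, left, right)
    return best[1:]

def boost(x, y, n_trees=50, lr=0.1):
    """Sequential error correction: each small tree is fit to the current
    residuals (the negative gradient of squared-error loss), and its
    contribution is shrunk by the learning rate lr."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_trees):
        j, thr, left, right = fit_stump(x, y - pred)
        pred += lr * np.where(x[:, j] <= thr, left, right)
        stumps.append((j, thr, left, right))
    return y.mean(), stumps, pred

# Example: a step function of one feature is recovered almost exactly
# after a few dozen shrunken error-correcting rounds.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, (200, 2))
y = np.where(x[:, 0] > 0.0, 2.0, -1.0)
base, stumps, pred = boost(x, y)
```

The small learning rate is the key design choice: each tree corrects only a fraction of the remaining error, which is why accurate models typically require many trees.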
As opposed to neural networks, the TreeNet methodology is not sensitive to data errors and needs no time-consuming data preparation, pre-processing, or imputation of missing values. Data errors of this kind can be very challenging for conventional data mining methods and catastrophic for conventional boosting. In contrast, the TreeNet model is generally immune to such errors because it dynamically rejects training data points too much at variance with the existing model. This robustness extends to data contaminated with erroneous target labels.
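One standard mechanism behind this kind of outlier rejection in gradient boosting is Huber-style clipping of residuals: points far from the current model contribute only a bounded gradient to the next tree. The sketch below illustrates that general mechanism, not TreeNet's specific rule; the function name and the `alpha` quantile parameter are hypothetical.

```python
import numpy as np

def huber_pseudo_residuals(y, pred, alpha=0.5):
    # Residuals beyond the alpha-quantile of |y - pred| are clipped,
    # so gross outliers exert only a bounded pull on the next tree
    # instead of dominating the fit.
    r = y - pred
    delta = np.quantile(np.abs(r), alpha)
    return np.clip(r, -delta, delta)

# One clean cluster plus a wild outlier: the outlier's residual is
# truncated to the clipping threshold rather than passed through.
y = np.array([1.0, 1.1, 0.9, 100.0])
g = huber_pseudo_residuals(y, np.zeros_like(y))
```

Because each boosting round re-computes the residuals, the clipping threshold adapts as the model improves, which is one way a boosted ensemble can "dynamically reject" points at variance with the current fit.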
Interaction detection establishes whether interactions of any kind are needed in a predictive model, and then searches for the specific interactions that are required. The interaction detection system not only helps improve model performance (sometimes dramatically) but also assists in the discovery of valuable new segments and previously unrecognized patterns.
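A simple way to see the underlying idea (this is a generic diagnostic, not SPM's interaction detection machinery) is that a boosted model built only from depth-one trees is purely additive and structurally unable to represent interactions. A large lack-of-fit on a target driven by an interaction, compared with a near-perfect fit on an additive target, signals that interactions are needed. The function name `additive_boost_mse` and all parameters below are hypothetical.

```python
import numpy as np

def additive_boost_mse(x, y, n_trees=200, lr=0.1):
    """Training MSE of boosted depth-one trees: a purely additive model."""
    pred = np.full(len(y), y.mean())
    for _ in range(n_trees):
        r = y - pred
        best = None
        for j in range(x.shape[1]):
            # candidate thresholds at interior quantiles of feature j
            for thr in np.quantile(x[:, j], np.linspace(0.05, 0.95, 19)):
                mask = x[:, j] <= thr
                p = np.where(mask, r[mask].mean(), r[~mask].mean())
                err = np.mean((r - p) ** 2)
                if best is None or err < best[0]:
                    best = (err, p)
        pred += lr * best[1]
    return np.mean((y - pred) ** 2)

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, (300, 2))
additive_gap = additive_boost_mse(x, x[:, 0] + x[:, 1])  # no interaction
interact_gap = additive_boost_mse(x, x[:, 0] * x[:, 1])  # pure interaction
# A much larger residual on the interaction target indicates that an
# additive model is insufficient, i.e. interactions are required.
```

In practice, comparing models with depth-one trees against models allowed deeper trees serves the same purpose: any accuracy gained by the deeper trees must come from interactions.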
Technical Articles by Jerome Friedman are also available for download:
- "Greedy Function Approximation: A Gradient Boosting Machine" introduces the methodology.
- "Stochastic Gradient Boosting" discusses several improvements to the original idea.