Bayesian Decision Trees are known for their probabilistic interpretability, but their construction can be computationally costly. In this article we present a general Bayesian Decision Tree algorithm applicable to both regression and classification problems. The algorithm does not rely on Markov chain Monte Carlo sampling and does not require a pruning step. While it is possible to construct a weighted probabilistic tree space, we find that one particular tree, the greedy-modal tree (GMT), explains most of the information contained in the numerical examples. In our experiments this approach performs comparably to Random Forests.
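To give a flavor of a greedy Bayesian split choice of the kind the abstract alludes to, the sketch below scores candidate thresholds by a Beta-Bernoulli marginal likelihood and keeps the modal (highest-evidence) option, staying a leaf if no split improves on it. This is an illustrative assumption of ours, not the paper's exact GMT construction; the prior parameters `a`, `b` and the helper names are hypothetical.

```python
import math


def log_beta(a, b):
    # log of the Beta function, via log-gamma
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)


def log_marginal(labels, a=1.0, b=1.0):
    # Beta-Bernoulli log marginal likelihood of binary labels in a node,
    # with a Beta(a, b) prior on the success probability
    k = sum(labels)
    n = len(labels)
    return log_beta(a + k, b + n - k) - log_beta(a, b)


def best_split(xs, ys):
    # Greedy modal choice for one feature: return the threshold whose two
    # children maximize the combined evidence, or None to stay a leaf.
    # No MCMC and no pruning step: the decision is made once, locally.
    best_score, best_t = log_marginal(ys), None
    for t in sorted(set(xs))[1:]:
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        score = log_marginal(left) + log_marginal(right)
        if score > best_score:
            best_score, best_t = score, t
    return best_t
```

Applied recursively to each child, this yields a single tree whose splits are the pointwise most probable choices, which is the sense in which one tree can summarize most of the weighted tree space.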