Authors: Öngelen, Gözde; İnkaya, Tülin
Date accessioned: 2024-09-10
Date available: 2024-09-10
Date issued: 2023-06-27
Title: A novel LOF-based ensemble regression tree methodology
Type: Article
Language: en
Access rights: info:eu-repo/semantics/closedAccess
ISSN: 0941-0643
eISSN: 1433-3058
Volume: 35
Issue: 26
Pages: 19453-19463
DOI: https://doi.org/10.1007/s00521-023-08773-w
Publisher URL: https://link.springer.com/article/10.1007/s00521-023-08773-w
Handle: https://hdl.handle.net/11452/44464
Identifier: 001017560800001

Keywords: Classification; Prediction; Selection; Forest; Model; Regression tree; Ensemble learning; Local outlier factor; Outlier removal
Subject categories: Science & technology; Technology; Computer science, artificial intelligence; Computer science

Abstract: With the emergence of digitalization, numeric prediction has become a prominent problem in various fields, including finance, engineering, industry, and medicine. Among machine learning methods, the regression tree is widely preferred due to its simplicity, interpretability, and robustness. Motivated by this, we introduce a novel ensemble regression tree-based methodology, namely LOF-BRT+OR. The proposed methodology is an integrated solution approach combining outlier removal, regression trees, and ensemble learning. First, irregular data points are removed using the local outlier factor (LOF), which measures the degree to which each point is an outlier. Next, a novel regression tree with an LOF-weighted node model is introduced. In the proposed node model, the weights of the points in each node are determined according to their surrounding neighborhood, as a function of LOF values and neighbor ranks. Finally, to increase prediction performance, ensemble learning is adopted: bootstrap aggregation is used to generate multiple regression trees with the LOF-weighted node model. The experimental study shows that the proposed methodology yields the best root mean squared error (RMSE) values in five out of nine data sets. Non-parametric tests also demonstrate the statistical significance of the improvement over the benchmark methods. The proposed methodology is applicable to a wide range of prediction problems.
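
Note: The abstract outlines a three-stage pipeline: LOF-based removal of irregular points, a regression tree with an LOF-weighted node model, and bootstrap aggregation of such trees. The sketch below illustrates only the generic parts of that pipeline with scikit-learn (LocalOutlierFactor for the removal step, BaggingRegressor over DecisionTreeRegressor for the ensemble); the LOF-weighted node model is the article's own contribution and is not reproduced here. The function name lof_filter_bagging and all parameter values are illustrative assumptions, and the code assumes a recent scikit-learn release.

```python
# Minimal sketch of the generic pipeline described in the abstract:
# (1) remove likely outliers with the local outlier factor (LOF),
# (2) fit a bagged ensemble of regression trees on the cleaned data.
# The article's LOF-weighted node model is NOT implemented here; a plain
# DecisionTreeRegressor stands in for it. Parameter values are assumptions.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor


def lof_filter_bagging(X, y, n_neighbors=20, contamination=0.05, n_estimators=100):
    """Drop LOF-flagged points, then train bagged regression trees."""
    # LOF labels points whose local density is much lower than that of
    # their neighbors as -1; those points are treated as outliers here.
    lof = LocalOutlierFactor(n_neighbors=n_neighbors, contamination=contamination)
    inlier_mask = lof.fit_predict(X) == 1
    X_clean, y_clean = X[inlier_mask], y[inlier_mask]

    # Bootstrap aggregation over regression trees (standard bagging,
    # not the LOF-weighted node model proposed in the article).
    model = BaggingRegressor(
        estimator=DecisionTreeRegressor(),
        n_estimators=n_estimators,
        bootstrap=True,
        random_state=0,
    )
    model.fit(X_clean, y_clean)
    return model, inlier_mask


if __name__ == "__main__":
    # Synthetic data for demonstration only.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=500)
    model, mask = lof_filter_bagging(X, y)
    rmse = np.sqrt(np.mean((model.predict(X[mask]) - y[mask]) ** 2))
    print(f"kept {mask.sum()} of {len(mask)} points, training RMSE = {rmse:.3f}")
```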