{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,5]],"date-time":"2026-03-05T07:20:22Z","timestamp":1772695222152,"version":"3.50.1"},"reference-count":31,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2020,3,5]],"date-time":"2020-03-05T00:00:00Z","timestamp":1583366400000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2020,3,5]],"date-time":"2020-03-05T00:00:00Z","timestamp":1583366400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["J Big Data"],"published-print":{"date-parts":[[2020,12]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>One of the ways of increasing recognition ability in classification problem is removing outlier entries as well as redundant and unnecessary features from training set. Filtering and feature selection can have large impact on classifier accuracy and area under the curve (AUC), as noisy data can confuse classifier and lead it to catch wrong patterns in training data. The common approach in data filtering is using proximity graphs. However, the problem of the optimal filtering parameters selection is still insufficiently researched. In this paper filtering procedure based on k-nearest neighbours proximity graph was used. Filtering parameters selection was adopted as the solution of outlier minimization problem: <jats:italic>k<\/jats:italic>-NN proximity graph, power of distance and threshold parameters are selected in order to minimize outlier percentage in training data. 
Then the performance of six commonly used classifiers (Logistic Regression, Na\u00efve Bayes, Neural Network, Random Forest, Support Vector Machine and Decision Tree) and one heterogeneous classifier combiner (DES-LA) is compared with and without filtering. Dynamic ensemble selection (DES) systems work by estimating the level of competence of each classifier in a pool of classifiers; only the most competent ones are selected to classify a given test sample. This is achieved by defining a criterion to measure the competence of the base classifiers, such as their accuracy in local regions of the feature space around the query instance. In our case, the combiner is based on the local accuracy of single classifiers, and its output is a linear combination of single-classifier rankings. As a result of filtering, the accuracy of the DES-LA combiner increases substantially on low-accuracy datasets; however, filtering does not have a significant impact on DES-LA performance on high-accuracy datasets. The results are discussed, and the classifiers whose performance was most affected by the pre-processing filtering step are identified. The main contributions of the paper are modifications to the DES-LA combiner, as well as a comparative analysis of the impact of filtering on classifiers of various types. 
Testing the filtering algorithm on a real-world dataset (the Taiwan default credit card dataset) confirmed the efficiency of the automatic filtering approach.<\/jats:p>","DOI":"10.1186\/s40537-020-00297-7","type":"journal-article","created":{"date-parts":[[2020,3,5]],"date-time":"2020-03-05T16:03:56Z","timestamp":1583424236000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":23,"title":["Improving binary classification using filtering based on k-NN proximity graphs"],"prefix":"10.1186","volume":"7","author":[{"given":"Maher","family":"Ala\u2019raj","sequence":"first","affiliation":[]},{"given":"Munir","family":"Majdalawieh","sequence":"additional","affiliation":[]},{"given":"Maysam F.","family":"Abbod","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2020,3,5]]},"reference":[{"key":"297_CR1","doi-asserted-by":"publisher","first-page":"36","DOI":"10.1016\/j.eswa.2016.07.017","volume":"104","author":"M Ala\u2019raj","year":"2016","unstructured":"Ala\u2019raj M, Abbod MF. A new hybrid ensemble credit scoring model based on classifiers consensus system approach. Expert Syst Appl. 2016;104:36\u201355.","journal-title":"Expert Syst Appl"},{"issue":"1","key":"297_CR2","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1175\/1520-0493(1950)078<0001:VOFEIT>2.0.CO;2","volume":"78","author":"GW Brier","year":"1950","unstructured":"Brier GW. Verification of forecasts expressed in terms of probability. Mon Weather Rev. 1950;78(1):1\u20133.","journal-title":"Mon Weather Rev"},{"issue":"1","key":"297_CR3","doi-asserted-by":"publisher","first-page":"131","DOI":"10.1613\/jair.606","volume":"11","author":"CE Brodley","year":"1999","unstructured":"Brodley CE, Friedl MA. Identifying mislabeled training data. J Artif Intell Res. 1999;11(1):131\u201367.","journal-title":"J Artif Intell Res"},{"key":"297_CR4","unstructured":"Chen S. (2017). 
K-nearest neighbor algorithm optimization in text categorization. IOP Conference Series: Earth and Environmental Science."},{"key":"297_CR5","doi-asserted-by":"publisher","first-page":"104824","DOI":"10.1016\/j.knosys.2019.06.032","volume":"187","author":"Y Chen","year":"2020","unstructured":"Chen Y, Hu X, Fan W, Shen L, Zhang Z, Liu X, Li H. Fast density peak clustering for large scale data based on kNN. Knowledge-Based Syst. 2020;187:104824.","journal-title":"Knowledge-Based Syst."},{"key":"297_CR6","doi-asserted-by":"crossref","unstructured":"Chen Y, Zhou L, Bouguila N, Zhong B, Wu F, Lei Z, Du J, Li H (2018). Semi convex hull tree: fast nearest neighbor queries for large scale data on GPUs. IEEE International Conference on Data Mining, ICDM, IEEE, p. 911\u2013916.","DOI":"10.1109\/ICDM.2018.00110"},{"issue":"2018","key":"297_CR7","first-page":"293","volume":"127","author":"W Cherif","year":"2018","unstructured":"Cherif W. Optimization of K-NN algorithm by clustering and reliability coefficients: application to breast-cancer diagnosis. The First International Conference On Intelligent Computing in Data Sciences, Procedia Computer Science. 2018;127(2018):293\u20139.","journal-title":"The First International Conference On Intelligent Computing in Data Sciences, Procedia Computer Science"},{"issue":"5","key":"297_CR8","doi-asserted-by":"publisher","first-page":"845","DOI":"10.1109\/TNNLS.2013.2292894","volume":"25","author":"B Fr\u00e9nay","year":"2014","unstructured":"Fr\u00e9nay B, Verleysen M. Classification in the presence of label noise: a survey. IEEE Transactions on Neural Networks and Learning Systems. 2014;25(5):845\u201369.","journal-title":"IEEE Transactions on Neural Networks and Learning Systems"},{"key":"297_CR9","doi-asserted-by":"publisher","first-page":"13267","DOI":"10.1016\/j.eswa.2012.05.075","volume":"39","author":"V Garcia","year":"2012","unstructured":"Garcia V, Marqu\u00e9s A, S\u00e1nchez JS. 
On the use of data filtering techniques for credit risk prediction with instance-based models. Expert Syst Appl. 2012;39:13267\u201376.","journal-title":"Expert Syst Appl"},{"key":"297_CR10","first-page":"172","volume":"2014","author":"F Gieseke","year":"2014","unstructured":"Gieseke F, Heinermann J, Oancea CE, Igel C. Buffer kd trees: processing massive nearest neighbor queries on GPUs. ICML. 2014;2014:172\u201380.","journal-title":"ICML"},{"key":"297_CR11","first-page":"1157","volume":"3","author":"I Guyon","year":"2003","unstructured":"Guyon I, Elisseeff A. An introduction to variable and feature selection. J Mach Learn Res. 2003;3:1157\u201382.","journal-title":"J Mach Learn Res"},{"key":"297_CR12","unstructured":"Haberman, S. J. (1976). Generalized Residuals for Log-Linear Models, Proceedings of the 9th International Biometrics Conference, Boston, p. 104\u2013122."},{"key":"297_CR13","doi-asserted-by":"publisher","first-page":"103","DOI":"10.1007\/s10994-009-5119-5","volume":"77","author":"DJ Hand","year":"2009","unstructured":"Hand DJ. Measuring classifier performance: a coherent alternative to the area under the ROC curve. Mach Learn. 2009;77:103\u201323.","journal-title":"Mach Learn"},{"issue":"5","key":"297_CR14","doi-asserted-by":"publisher","first-page":"1718","DOI":"10.1016\/j.patcog.2007.10.015","volume":"41","author":"A Ko","year":"2008","unstructured":"Ko A, Sabourin R, Britto A Jr. From dynamic classifier selection to dynamic ensemble selection. Pattern Recognit. 2008;41(5):1718\u201331.","journal-title":"Pattern Recognit."},{"key":"297_CR15","unstructured":"Kubica J, Moore A. (2003). Probabilistic noise identification and data cleaning. In: Proceedings of the third IEEE International Conference on Data Mining, pages 131\u2013138, 2003."},{"key":"297_CR16","doi-asserted-by":"publisher","first-page":"124","DOI":"10.1016\/j.ejor.2015.05.030","volume":"247","author":"S Lessmann","year":"2015","unstructured":"Lessmann S, Baesens B, Seow H, Thomas LC. 
Benchmarking state-of-the-art classification algorithms for credit scoring: an update of research. Eur J Oper Res. 2015;247:124\u201336.","journal-title":"Eur J Oper Res"},{"key":"297_CR17","unstructured":"Mansourifar H, Shi W (2018) Toward efficient breast cancer diagnosis and survival prediction using L-perceptron. arXiv preprint arXiv:1811.03016."},{"key":"297_CR18","unstructured":"Narassiguin A, Elghaze H, Aussem A (2017). Dynamic ensemble selection with probabilistic classifier chains. Joint European Conference on Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2017: Machine Learning and Knowledge Discovery in Databases, p. 169\u2013186."},{"issue":"4","key":"297_CR19","first-page":"75","volume":"3","author":"K Netti","year":"2016","unstructured":"Netti K, Radhika Y. Minimizing loss of accuracy for seismic hazard prediction using Naive Bayes Classifier. IRJET. 2016;3(4):75\u20137.","journal-title":"IRJET."},{"key":"297_CR20","doi-asserted-by":"crossref","unstructured":"Pereira M, Britto A, Oliveira L, Sabourin R (2018). Dynamic ensemble selection by K-nearest local Oracles with Discrimination Index. 2018 IEEE 30th International conference on tools with artificial intelligence (ICTAI), volume: 1, p. 765\u2013771.","DOI":"10.1109\/ICTAI.2018.00120"},{"key":"297_CR21","unstructured":"Peterson AH, Martinez TR (2005). Estimating the potential for combining learning models. In: Proceedings of the ICML workshop on meta-learning, p. 68\u201375"},{"issue":"1","key":"297_CR22","doi-asserted-by":"publisher","first-page":"355","DOI":"10.1016\/j.patcog.2012.07.009","volume":"46","author":"JA Saez","year":"2013","unstructured":"Saez JA, Luengo J, Herrera F. Predicting noise filtering efficacy with data complexity measures for nearest neighbor classification. Pattern Recognit. 
2013;46(1):355\u201364.","journal-title":"Pattern Recognit"},{"key":"297_CR23","doi-asserted-by":"publisher","first-page":"37","DOI":"10.1016\/j.patrec.2018.01.020","volume":"104","author":"Bing Shi","year":"2018","unstructured":"Shi Bing, Han Lixin, Yan Hong. Adaptive clustering algorithm based on kNN and density. Pattern Recognit Lett. 2018;104:37\u201344.","journal-title":"Pattern Recognit Lett"},{"key":"297_CR24","first-page":"262","volume":"10","author":"VG Sigillito","year":"1989","unstructured":"Sigillito VG, Wing SP, Hutton LV, Baker KB. Classification of radar returns from the ionosphere using neural networks. Johns Hopkins APL Tech Dig. 1989;10:262\u20136.","journal-title":"Johns Hopkins APL Tech Dig"},{"key":"297_CR25","unstructured":"Smith MR, Martinez T, Giraud-Carrier C. (2015) The Potential benefits of data set filtering and learning algorithm hyperparameter optimization. MetaSel\u201915 In: Proceedings of the 2015 international conference on meta-learning and algorithm selection, volume 1455, p. 3\u201314."},{"key":"297_CR26","unstructured":"Tejasvi Malladi, A. Nayeemulla Khan, A.Shahina (2019). Perfecting counterfeit banknote Detection-a classification Strategy. International Journal of Innovative Technology and Exploring Engineering (IJITEE), p. 434\u2013440."},{"key":"297_CR27","doi-asserted-by":"crossref","unstructured":"Vriesmann LM, Britto AS, Luiz SO, Koerich AL, Sabourin R (2015). Combining overall and local class accuracies in an oracle-based method for dynamic ensemble selection. 2015 International Joint Conference on Neural Networks (IJCNN).","DOI":"10.1109\/IJCNN.2015.7280340"},{"issue":"4","key":"297_CR28","doi-asserted-by":"publisher","first-page":"405","DOI":"10.1109\/34.588027","volume":"19","author":"K Woods","year":"1997","unstructured":"Woods K, Kegelmeyer WP, Bowyer K. Combination of multiple classifiers using local accuracy estimates. IEEE Trans Pattern Anal Mach Intell. 
1997;19(4):405\u201310.","journal-title":"IEEE Trans Pattern Anal Mach Intell"},{"issue":"2012","key":"297_CR29","doi-asserted-by":"publisher","first-page":"3668","DOI":"10.1016\/j.eswa.2011.09.059","volume":"39","author":"J Xiao","year":"2012","unstructured":"Xiao J, Xie L, He Changzheng, Xiaoyi J. Dynamic classifier ensemble model for customer classification with imbalanced class distribution. Expert Syst Appl. 2012;39(2012):3668\u201375.","journal-title":"Expert Syst Appl"},{"key":"297_CR30","doi-asserted-by":"publisher","first-page":"731","DOI":"10.1109\/CSO.2009.276","volume-title":"Proceeding of the second international joint conference on computational sciences and optimization","author":"J Xiao","year":"2009","unstructured":"Xiao J, He CZ. Dynamic classifier ensemble selection based on GMDH. Proceeding of the second international joint conference on computational sciences and optimization. Washington: IEEE; 2009. p. 731\u20134."},{"key":"297_CR31","doi-asserted-by":"crossref","unstructured":"Zhu Y, Zhang Y, Pan Y (2015). Dynamic ensemble selection with local expertise consistency. 
2015 IEEE Conference on computational intelligence in bioinformatics and computational biology (CIBCB).","DOI":"10.1109\/CIBCB.2015.7300336"}],"container-title":["Journal of Big Data"],"original-title":[],"language":"en","link":[{"URL":"http:\/\/link.springer.com\/content\/pdf\/10.1186\/s40537-020-00297-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/link.springer.com\/article\/10.1186\/s40537-020-00297-7\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/link.springer.com\/content\/pdf\/10.1186\/s40537-020-00297-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2021,3,5]],"date-time":"2021-03-05T00:12:21Z","timestamp":1614903141000},"score":1,"resource":{"primary":{"URL":"https:\/\/journalofbigdata.springeropen.com\/articles\/10.1186\/s40537-020-00297-7"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,3,5]]},"references-count":31,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2020,12]]}},"alternative-id":["297"],"URL":"https:\/\/doi.org\/10.1186\/s40537-020-00297-7","relation":{},"ISSN":["2196-1115"],"issn-type":[{"value":"2196-1115","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,3,5]]},"assertion":[{"value":"26 November 2019","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"21 February 2020","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"5 March 2020","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"Not applicable.","order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval and consent to participate"}},{"value":"Not 
applicable.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for publication"}},{"value":"The authors declare that they have no competing interests","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"15"}}