A split-then-join lightweight hybrid majority vote classifier

Date

2022

Authors

Gadebe, Moses L.
Ojo, Sunday O.
Kogeda, Okuthe P.

Publisher

Springer International Publishing

Abstract

Classification of human activities on small datasets is achievable with tree-oriented algorithms (C4.5, Random Forest, Bagging). However, KNN and Gaussian Naïve Bayes (GNB) achieve high accuracy only on large datasets. Of particular interest, KNN suffers from the minor-feature problem, where two similar instances are predicted far apart because of the limited number of classification features. In this paper, a split-then-join combiner strategy is employed to split the classification features between a first and a secondary classifier (KNN and GNB) based on an integral conditionality function; the top-K prediction voting lists of both classifiers are then joined for the final vote. We simulated our combined algorithm and compared it with other classification algorithms (Support Vector Machine, C4.5, KNN, Naïve Bayes, and Random Forest) in the R programming language using the Caret, RWeka, and e1071 libraries, on 3 selected datasets covering 27 combined human activities. The results of the study indicate that our combined classifier is more effective and reliable than its predecessors, Naïve Bayes and KNN, and that the proposed algorithm is comparable to C4.5, Boosted Trees, Random Forest, and other ensemble algorithms, with accuracy and precision reaching 100% for most of the 27 human activities.
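
For illustration only, the following is a minimal sketch of the split-then-join idea, written in Python with scikit-learn rather than the R/Caret setup used in the paper. The feature split below is a simple column split standing in for the integral conditionality function (which the abstract does not specify), the join is an averaged class-probability vote standing in for the paper's top-K voting-list join, and the Iris dataset is only a placeholder for the human-activity data.

# Hypothetical sketch of a split-then-join hybrid KNN + GNB vote.
# The column split and probability-sum join are placeholders, not the
# paper's integral conditionality function or top-K list join.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

split = X.shape[1] // 2  # split features between the two base classifiers
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr[:, :split], y_tr)
gnb = GaussianNB().fit(X_tr[:, split:], y_tr)

# Join: combine both classifiers' class-probability "voting lists" and
# take the class with the highest combined vote for each test sample.
votes = knn.predict_proba(X_te[:, :split]) + gnb.predict_proba(X_te[:, split:])
y_pred = knn.classes_[np.argmax(votes, axis=1)]
print("hybrid accuracy:", float(np.mean(y_pred == y_te)))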

Keywords

Split-then-join, Ensemble, KNN, Gaussian Naïve Bayes, Lightweight algorithm

Citation

Gadebe, M.L., Ojo, S.O. and Kogeda, O.P. 2022. A split-then-join lightweight hybrid majority vote classifier. Communications in Computer and Information Science. 1572 CCIS: 167-180. doi:10.1007/978-3-031-05767-0_14

DOI

10.1007/978-3-031-05767-0_14
