KOMPARASI METODE DECISION TREE, NAIVE BAYES DAN K-NEAREST NEIGHBOR PADA KLASIFIKASI KINERJA SISWA

  • Tyas Setiyorini (1*), STMIK Nusa Mandiri
  • Rizky Tri Asmono (2), STMIK Swadharma

  • (*) Corresponding Author
Keywords: Decision Tree, Naive Bayes, K-Nearest Neighbor, Student Performance

Abstract

In education, student performance is an important element. Achieving good, high-quality student performance requires analyzing and evaluating the factors that influence it. Evaluation is still commonly based only on the educator's assessment of information about students' learning progress. This approach is not effective, because information on learning progress alone is not enough to form indicators for evaluating student performance and for helping students and educators improve learning and teaching. Previous studies have been conducted, but it is not yet known which method classifies student performance best. In this study, the Decision Tree, Naive Bayes, and K-Nearest Neighbor methods were compared on student performance datasets. The Decision Tree method achieved an accuracy of 78.85%, the Naive Bayes method 77.69%, and the K-Nearest Neighbor method 79.31%. The comparison shows that the K-Nearest Neighbor method obtained the highest accuracy, so it is concluded that the K-Nearest Neighbor method performs better than the Decision Tree and Naive Bayes methods.
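
The article page does not include the experimental code, but the kind of comparison described in the abstract can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the file name student-por.csv (the UCI student performance data associated with Cortez & Silva, 2008), the pass/fail label derived from the final grade G3, the one-hot encoding, k = 5, and 10-fold cross-validation are all choices made here for the example.

```python
# Sketch: cross-validated accuracy of Decision Tree, Naive Bayes, and
# K-Nearest Neighbor on a student performance dataset. All preprocessing
# and hyperparameter choices below are assumptions for illustration,
# not details taken from the paper.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Load the (assumed) UCI student performance file and build a binary target:
# "pass" if the final grade G3 is at least 10.
data = pd.read_csv("student-por.csv", sep=";")
y = (data["G3"] >= 10).astype(int)
X = pd.get_dummies(data.drop(columns=["G1", "G2", "G3"]))  # encode categorical attributes

# The three classifiers compared in the study (hyperparameters assumed here).
models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "K-Nearest Neighbor": KNeighborsClassifier(n_neighbors=5),
}

# Report mean 10-fold cross-validated accuracy for each method.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name}: {scores.mean() * 100:.2f}% accuracy")
```

With this kind of setup, the relative ranking of the three methods can shift with preprocessing and hyperparameters, so the numbers printed should not be expected to match the accuracies reported in the abstract.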

References

Adeniyi, D. A., Wei, Z., & Yongquan, Y. (2016). Automated web usage data mining and recommendation system using K-Nearest Neighbor (KNN) classification method. Applied Computing and Informatics, 12(1), 90–108. https://doi.org/10.1016/j.aci.2014.10.001

Al-Shehri, H., Al-Qarni, A., Al-Saati, L., Batoaq, A., Badukhen, H., Alrashed, S., … Olatunji, S. O. (2017). Student performance prediction using Support Vector Machine and K-Nearest Neighbor. Canadian Conference on Electrical and Computer Engineering, 17–20. https://doi.org/10.1109/CCECE.2017.7946847

Alkhasawneh, R., & Hobson, R. (2011). Modeling student retention in science and engineering disciplines using neural networks. In 2011 IEEE Global Engineering Education Conference, EDUCON 2011 (pp. 660–663). https://doi.org/10.1109/EDUCON.2011.5773209

Bin Mat, U., Buniyamin, N., Arsad, P. M., & Kassim, R. A. (2014). An overview of using academic analytics to predict and improve students’ achievement: A proposed proactive intelligent intervention. 2013 IEEE 5th International Conference on Engineering Education: Aligning Engineering Education with Industrial Needs for Nation Development, ICEED 2013, 126–130. https://doi.org/10.1109/ICEED.2013.6908316

Breiman, L. (2001). Classification and regression tree.

Chen, S. W., Lin, S. C., & Chang, K. E. (2001). Attributed concept maps: Fuzzy integration and fuzzy matching. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 31(5), 842–852. https://doi.org/10.1109/3477.956047

Conijn, R., Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student performance from LMS data: A comparison of 17 blended courses using moodle LMS. IEEE Transactions on Learning Technologies, 10(1), 17–29. https://doi.org/10.1109/TLT.2016.2616312

Cortez, P., & Silva, A. (2008). Using Data Mining to Predict Secondary School Student Performance. In A. Brito & J. Teixeira (Eds.), Proceedings of the 5th FUture BUsiness TEChnology Conference (FUBUTEC 2008) (pp. 5–12).

Cover, T., & Hart, P. E. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21–27.

Angeline, D. M. D. (2013). Association Rule Generation for Student Performance Analysis using Apriori Algorithm. The SIJ Transactions on Computer Science Engineering & Its Applications (CSEA), 1(1), 12–16.

Deverapalli, P. S. D. (2016). A Critical Study of Classification Algorithms Using Diabetes Diagnosis. 2016 IEEE 6th International Conference on Advanced Computing (IACC).

Gou, J., Zhan, Y., Rao, Y., Shen, X., Wang, X., & He, W. (2014). Improved pseudo nearest neighbor classification. Knowledge-Based Systems, 70, 361–375. https://doi.org/10.1016/j.knosys.2014.07.020

Gries, D., & Schneider, F. B. (2010). Texts in Computer Science. Media (Vol. 42). https://doi.org/10.1007/978-1-84882-256-6

Hamsa, H., Indiradevi, S., & Kizhakkethottam, J. J. (2016). Student Academic Performance Prediction Model Using Decision Tree and Fuzzy Genetic Algorithm. Procedia Technology, 25, 326–332. https://doi.org/10.1016/j.protcy.2016.08.114

Ibrahim, Z., & Rusli, D. (2007). Predicting Students’ Academic Performance: Comparing Artificial Neural Network, Decision tree And Linear Regression. Proceedings of the 21st Annual SAS Malaysia Forum, (September), 1–6. Retrieved from https://www.researchgate.net/profile/Daliela_Rusli/publication/228894873_Predicting_Students’_Academic_Performance_Comparing_Artificial_Neural_Network_Decision_Tree_and_Linear_Regression/links/0deec51bb04e76ed93000000.pdf

Jiang, L., Cai, Z., & Wang, D. (2010). Improving Naive Bayes for Classification. International Journal of Computers and Applications, 32(3). https://doi.org/10.2316/Journal.202.2010.3.202-2747

Jiang, L., Wang, D., Cai, Z., & Yan, X. (2007). Survey of Improving Naive Bayes for Classification. Proceedings of the Third International Conference of Advanced Data Mining and Applications, 4632, 134–145. https://doi.org/10.1007/978-3-540-73871-8_14

Kumar, S., & Sahoo, G. (2015). Classification of heart disease using Naïve Bayes and genetic algorithm. Smart Innovation, Systems and Technologies. https://doi.org/10.1007/978-81-322-2208-8_25

Lakshmi, B. N., Indumathi, T. S., & Ravi, N. (2016). A Study on C.5 Decision Tree Classification Algorithm for Risk Predictions During Pregnancy. Procedia Technology, 24, 1542–1549. https://doi.org/10.1016/j.protcy.2016.05.128

Larose, D. T., & Larose, C. D. (2014). Discovering Knowledge in Data. https://doi.org/10.1002/9781118874059

Lin, Y., Li, J., Lin, M., & Chen, J. (2014). A new nearest neighbor classifier via fusing neighborhood information. Neurocomputing, 143, 164–169. https://doi.org/10.1016/j.neucom.2014.06.009

Liu, H., & Zhang, S. (2012). Noisy data elimination using mutual k-nearest neighbor for classification mining. Journal of Systems and Software, 85(5), 1067–1074. https://doi.org/10.1016/j.jss.2011.12.019

Lolli, F., Ishizaka, A., Gamberini, R., Balugani, E., & Rimini, B. (2017). Decision Trees for Supervised Multi-criteria Inventory Classification. Procedia Manufacturing, 11(June), 1871–1881. https://doi.org/10.1016/j.promfg.2017.07.326

Lopez Guarin, C. E., Guzman, E. L., & Gonzalez, F. A. (2015). A Model to Predict Low Academic Performance at a Specific Enrollment Using Data Mining. Revista Iberoamericana de Tecnologias Del Aprendizaje, 10(3), 119–125. https://doi.org/10.1109/RITA.2015.2452632

Setiyorini, T., & Asmono, R. T. (2018). Laporan Akhir Penelitian Mandiri [Independent research final report]. Jakarta: STMIK Nusa Mandiri.

Shahiri, A. M., Husain, W., & Rashid, N. A. (2015). A Review on Predicting Student’s Performance Using Data Mining Techniques. Procedia Computer Science, 72, 414–422. https://doi.org/10.1016/j.procs.2015.12.157

Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27(3), 379–423.

Stecker, P. M., Fuchs, L. S., & Fuchs, D. (2005). Using Curriculum-Based Measurement to Improve Student Achievement: Review of Research. Psychology in the Schools, 42(8), 795–819. https://doi.org/10.1002/pits.20113

Turhan, B., & Bener, A. (2009). Analysis of Naive Bayes’ assumptions on software fault data: An empirical study. Data and Knowledge Engineering, 68(2), 278–290. https://doi.org/10.1016/j.datak.2008.10.005

Yoon, J. W., & Friel, N. (2015). Efficient model selection for probabilistic K nearest neighbour classification. Neurocomputing, 149(PB), 1098–1108. https://doi.org/10.1016/j.neucom.2014.07.023

Wu, J., & Cai, Z. (2011). Attribute Weighting via Differential Evolution Algorithm for Attribute Weighted Naive Bayes (WNB). Journal of Computational Information Systems, 5(5), 1672–1679.

Wu, X., & Kumar, V. (Eds.). (2009). The Top Ten Algorithms in Data Mining. Boca Raton, FL: Chapman & Hall/CRC.

Yang, F., & Li, F. W. B. (2018). Study on student performance estimation, student progress analysis, and student potential prediction based on data mining. Computers & Education, 123, 97–108. https://doi.org/10.1016/j.compedu.2018.04.006

Zhang, H., & Sheng, S. (2004). Learning weighted naive bayes with accurate ranking. In Proceedings - Fourth IEEE International Conference on Data Mining, ICDM 2004 (pp. 567–570). https://doi.org/10.1109/ICDM.2004.10030
Published
2018-09-15
How to Cite
Setiyorini, T., & Asmono, R. (2018). KOMPARASI METODE DECISION TREE, NAIVE BAYES DAN K-NEAREST NEIGHBOR PADA KLASIFIKASI KINERJA SISWA. Jurnal Techno Nusa Mandiri, 15(2), 85-92. https://doi.org/10.33480/techno.v15i2.16