• Arif Mudi Priyatno (1*) Universitas Negeri Malang
  • Triyanna Widiyaningtyas (2) Universitas Negeri Malang

  • (*) Corresponding Author
Keywords: hybrid, modified, recursive feature elimination, strategy, systematic literature review


Recursive feature elimination (RFE) is a feature selection algorithm that works by gradually eliminating unimportant features. RFE has become a popular feature selection method in a variety of machine learning applications, such as classification and prediction. However, no systematic literature review (SLR) has yet discussed recursive feature elimination algorithms. This article conducts an SLR on RFE algorithms, with the goal of providing an overview of the current state of RFE research. The SLR draws on the IEEE Xplore, ScienceDirect, Springer, and Scopus (via Publish or Perish) databases from 2018 to 2023, and identified 76 relevant papers: 49% standard RFE, 43% strategy RFE, and 8% modified RFE. Research using RFE increased every year from 2018 to 2023. The feature selection methods most often used alongside RFE, or compared against it, are Pearson correlation (a filter approach) and random forest (an embedded approach). The most widely used machine learning algorithms are support vector machines and random forests, at 19.5% and 16.7%, respectively. Strategy RFE and modified RFE can collectively be referred to as hybrid RFE. Based on the relevant papers, RFE strategies fall broadly into two categories: applying RFE after another feature selection method, and applying RFE simultaneously with other methods. Modified RFE alters the RFE workflow itself, in two ways: modifying the process before the smallest-weight criterion is computed, or after it is computed. Computing the smallest-weight criterion remains the central open challenge in RFE modification for obtaining optimal results.
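The standard RFE loop summarized above — repeatedly refitting an estimator and eliminating the feature with the smallest weight criterion — can be sketched with scikit-learn's `RFE` class (an illustrative sketch on synthetic data, not the implementation of any reviewed paper):

```python
# Sketch of standard RFE: a linear SVM is refit repeatedly, and at each
# step the feature with the smallest absolute weight (the "smallest
# weight criterion") is eliminated, until the target count remains.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=4, random_state=0)

# step=1: eliminate one feature per iteration until 4 survive
selector = RFE(estimator=SVC(kernel="linear"),
               n_features_to_select=4, step=1)
selector.fit(X, y)

print(selector.support_)   # boolean mask of the 4 surviving features
print(selector.ranking_)   # 1 = selected; higher = eliminated earlier
```

A "strategy RFE" in the review's terminology would simply prepend another selector (e.g. a Pearson-correlation filter) before this loop, reducing the pool of features that RFE then ranks.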





How to Cite
A. Priyatno and T. Widiyaningtyas, "A Systematic Literature Review: Recursive Feature Elimination Algorithms", jitk, vol. 9, no. 2, pp. 196–207, Feb. 2024.