Open Access

Feature selection for high-dimensional data based on scaled cross operator threshold filtering specific memory algorithm

26 March 2025



Language:
English
Publication timeframe:
1 issue per year
Journal subjects:
Biology; Biology, other; Mathematics; Applied Mathematics; Mathematics, General; Physics; Physics, other