Discretization and Feature Selection Based on Bias Corrected Mutual Information Considering High-Order Dependencies
Mutual Information (MI) based feature selection methods are popular due to their ability to capture nonlinear relationships among variables. However, existing works rarely address the error (bias) that occurs due to the use of finite samples during the estimation of MI. To the best of our knowled...
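The finite-sample bias mentioned in the abstract is easy to demonstrate: the plug-in MI estimate between two independent discrete variables is positive on average, even though their true MI is zero. The following Python sketch is illustrative only; it applies a generic Miller-Madow-style entropy correction, which is not necessarily the bias correction proposed in this paper, and all function names are hypothetical.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum likelihood) entropy estimate in nats."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Plug-in entropy plus the Miller-Madow finite-sample bias term (K-1)/(2N)."""
    k = np.count_nonzero(counts)   # number of occupied bins
    n = counts.sum()               # sample size
    return plugin_entropy(counts) + (k - 1) / (2.0 * n)

def mutual_information(x, y, corrected=False):
    """MI of two discrete samples as H(X) + H(Y) - H(X,Y), optionally bias-corrected."""
    joint, _, _ = np.histogram2d(x, y, bins=(len(np.unique(x)), len(np.unique(y))))
    h = miller_madow_entropy if corrected else plugin_entropy
    return h(joint.sum(axis=1)) + h(joint.sum(axis=0)) - h(joint.ravel())

# Two independent variables: the true MI is 0, yet the plug-in estimate is positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 8, size=200)
y = rng.integers(0, 8, size=200)
print("plug-in MI:  ", mutual_information(x, y))
print("corrected MI:", mutual_information(x, y, corrected=True))
```

The plug-in estimate should come out noticeably above zero, while the corrected value is pulled back toward zero. The effect grows with the size of the joint alphabet, which is why high-order (multivariate) MI terms, the focus of this paper, are affected more severely.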
Main authors: Roy, Puloma; Sharmin, Sadia; Ali, Amin Ahsan; Shoyaib, Mohammad
Format: Online Article Text
Language: English
Published: 2020
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206174/
http://dx.doi.org/10.1007/978-3-030-47426-3_64
Similar items

- Use of relevancy and complementary information for discriminatory gene selection from high-dimensional gene expression data
  by: Haque, Md Nazmul, et al.
  Published: (2021)
- A Proximity Weighted Evidential k Nearest Neighbor Classifier for Imbalanced Data
  by: Kadir, Md. Eusha, et al.
  Published: (2020)
- Mutual Information between Discrete and Continuous Data Sets
  by: Ross, Brian C.
  Published: (2014)
- Mutual Information between Order Book Layers
  by: Libman, Daniel, et al.
  Published: (2022)
- Multi-Label Feature Selection with Conditional Mutual Information
  by: Wang, Xiujuan, et al.
  Published: (2022)