The fuzzy Kullback–Leibler divergence for estimating parameters of the probability distribution in fuzzy data: an application to classifying Vietnamese Herb Leaves
Main Authors: | , , , |
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2023 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10477302/ https://www.ncbi.nlm.nih.gov/pubmed/37666854 http://dx.doi.org/10.1038/s41598-023-40992-y |
Summary: | In this paper, we address the challenge of estimating probability distributions, which are typically represented by parameter-based values. However, this estimation is prone to errors and does not comprehensively capture the nature of real-world data. Additionally, real-world data often follows a mixture of probability distributions, where sub-datasets may contain incomplete information. To enhance flexibility, especially in classification problems, we propose a new method for describing parameters estimated through Bayesian statistics. Our method introduces fuzzy parameters and assesses the similarity between probability distributions using the fuzzy extended Kullback–Leibler divergence. We demonstrate the practical application of our approach to the classification of Vietnamese Herb Leaves. By incorporating fuzzy parameters and leveraging Bayesian statistics, our method provides more robust estimations of probability distributions and enables improved flexibility in classification tasks. |
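The summary above describes comparing probability distributions through a fuzzy extension of the Kullback–Leibler divergence, with class parameters treated as fuzzy numbers estimated via Bayesian statistics. The sketch below is only an illustration of that general idea, not the paper's exact formulation: it assumes, hypothetically, that each class is a univariate Gaussian whose mean and standard deviation are triangular fuzzy numbers, scores similarity by averaging the crisp Gaussian KL divergence over alpha-cut endpoints, and assigns a sample to the class with the smallest score.

```python
# Illustrative sketch only: NOT the paper's exact "fuzzy extended Kullback-Leibler
# divergence". Assumes (hypothetically) Gaussian class models whose mean and standard
# deviation are triangular fuzzy numbers (low, mode, high), compared by averaging the
# crisp KL divergence over the endpoints of a few alpha-cuts.

import numpy as np

def gaussian_kl(mu_p, sigma_p, mu_q, sigma_q):
    """Crisp KL divergence D(P || Q) between two univariate Gaussians."""
    return (np.log(sigma_q / sigma_p)
            + (sigma_p**2 + (mu_p - mu_q)**2) / (2.0 * sigma_q**2) - 0.5)

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def fuzzy_gaussian_kl(p_mu, p_sigma, q_mu, q_sigma, alphas=(0.0, 0.5, 1.0)):
    """Average the crisp KL over all alpha-cut endpoint combinations of the fuzzy parameters."""
    vals = []
    for a in alphas:
        for mu_p in alpha_cut(p_mu, a):
            for s_p in alpha_cut(p_sigma, a):
                for mu_q in alpha_cut(q_mu, a):
                    for s_q in alpha_cut(q_sigma, a):
                        vals.append(gaussian_kl(mu_p, s_p, mu_q, s_q))
    return float(np.mean(vals))

# Hypothetical class models (fuzzy mean, fuzzy sigma) for two herb-leaf classes,
# e.g. built from Bayesian posterior summaries of a single leaf-shape feature.
classes = {
    "class_A": ((4.8, 5.0, 5.2), (0.9, 1.0, 1.1)),
    "class_B": ((7.7, 8.0, 8.3), (1.4, 1.5, 1.6)),
}

# A new sample's fitted fuzzy parameters; assign it to the class with the lowest divergence.
sample = ((4.9, 5.1, 5.3), (0.8, 1.0, 1.2))
scores = {name: fuzzy_gaussian_kl(sample[0], sample[1], mu, sg)
          for name, (mu, sg) in classes.items()}
print(min(scores, key=scores.get))  # expected: "class_A"
```

Averaging over alpha-cut endpoints is just one simple way to reduce fuzzy parameters to a single score; the paper defines its own fuzzy extension of the divergence, which should be consulted for the actual method.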