Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications
Main Authors: Luo, Zhiwen; Amayri, Manar; Fan, Wentao; Bouguila, Nizar
Format: Online Article Text
Language: English
Published: Springer US, 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9838479/ https://www.ncbi.nlm.nih.gov/pubmed/36685642 http://dx.doi.org/10.1007/s10489-022-04378-3
_version_ | 1784869296168501248 |
author | Luo, Zhiwen; Amayri, Manar; Fan, Wentao; Bouguila, Nizar
author_facet | Luo, Zhiwen; Amayri, Manar; Fan, Wentao; Bouguila, Nizar
author_sort | Luo, Zhiwen |
collection | PubMed |
description | Cross-collection topic models extend single-collection topic models, such as Latent Dirichlet Allocation (LDA), to multiple document collections. The goal of cross-collection topic modeling is to learn document-topic representations and, for each topic, reveal both the similarities shared across collections and the differences among them. However, the restrictive Dirichlet prior and significant privacy risks have limited these models' performance and utility; in particular, training a cross-collection topic model may leak sensitive information from the training dataset. To address these two issues, we propose a novel model, cross-collection latent Beta-Liouville allocation (ccLBLA), which employs a more flexible prior, the Beta-Liouville distribution, whose more general covariance structure enhances topic correlation analysis. To protect privacy in the ccLBLA model, we leverage the inherent differential privacy guarantee of the Collapsed Gibbs Sampling (CGS) inference scheme and propose a hybrid privacy protection algorithm (HPP-ccLBLA) that prevents data from being inferred from the intermediate statistics of CGS training without sacrificing utility. More importantly, our technique is the first attempt to apply a cross-collection topic model to image classification and to investigate its capabilities beyond text analysis. Experimental results on comparative text mining and image classification demonstrate the merits of the proposed approach.
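For context, the Beta-Liouville prior mentioned in the abstract is commonly written as follows in the latent Beta-Liouville allocation literature; the exact parameterization used by ccLBLA may differ, so treat this as a reference form rather than the paper's own definition. For a topic-proportion vector θ = (θ_1, …, θ_D) with θ_d > 0 and Σ_d θ_d < 1, the density with parameters (α_1, …, α_D, α, β) is:

```latex
\mathrm{BL}(\theta \mid \alpha_1,\dots,\alpha_D,\alpha,\beta)
  = \frac{\Gamma\!\left(\sum_{d=1}^{D}\alpha_d\right)\,\Gamma(\alpha+\beta)}
         {\Gamma(\alpha)\,\Gamma(\beta)}
    \left(\sum_{d=1}^{D}\theta_d\right)^{\alpha-\sum_{d=1}^{D}\alpha_d}
    \left(1-\sum_{d=1}^{D}\theta_d\right)^{\beta-1}
    \prod_{d=1}^{D}\frac{\theta_d^{\alpha_d-1}}{\Gamma(\alpha_d)}
```

Unlike the Dirichlet, whose pairwise covariances are always negative, this family can also express positive correlation between topic proportions, which is presumably the "more general covariance structure" the abstract refers to.

The privacy-protection idea of perturbing intermediate count statistics exposed during Collapsed Gibbs Sampling can be illustrated with a minimal sketch. This is not the HPP-ccLBLA algorithm itself; the function name, the per-release budget `epsilon`, and the unit-sensitivity assumption are illustrative assumptions.

```python
import numpy as np

def noisy_topic_word_counts(counts, epsilon, rng=None):
    """Perturb topic-word count statistics with Laplace noise before they are
    released or reused, limiting what the intermediate statistics of collapsed
    Gibbs sampling reveal about individual documents.

    Illustrative sketch only, not the paper's HPP-ccLBLA algorithm.
    """
    rng = rng if rng is not None else np.random.default_rng()
    counts = np.asarray(counts, dtype=float)
    # Assuming one token changes a count by at most 1 (L1 sensitivity 1),
    # Laplace noise with scale 1/epsilon gives epsilon-differential privacy
    # for a single release of these statistics.
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon, size=counts.shape)
    noisy = counts + noise
    # Clip at zero so the perturbed counts can still be normalised into
    # valid topic-word probability estimates.
    return np.clip(noisy, 0.0, None)

# Example: a small topics x vocabulary count matrix with budget epsilon = 1.0
example_counts = np.array([[4, 0, 2], [1, 3, 5]])
print(noisy_topic_word_counts(example_counts, epsilon=1.0))
```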
format | Online Article Text |
id | pubmed-9838479 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Springer US |
record_format | MEDLINE/PubMed |
spelling | pubmed-9838479 2023-01-17. Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications. Luo, Zhiwen; Amayri, Manar; Fan, Wentao; Bouguila, Nizar. Appl Intell (Dordr), Article. Springer US, 2023-01-13. /pmc/articles/PMC9838479/ /pubmed/36685642 http://dx.doi.org/10.1007/s10489-022-04378-3. Text, en. © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version is governed solely by the terms of that agreement and applicable law. This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means, with acknowledgement of the original source, for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.
spellingShingle | Article; Luo, Zhiwen; Amayri, Manar; Fan, Wentao; Bouguila, Nizar; Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications
title | Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications |
title_full | Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications |
title_fullStr | Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications |
title_full_unstemmed | Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications |
title_short | Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications |
title_sort | cross-collection latent beta-liouville allocation model training with privacy protection and applications |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9838479/ https://www.ncbi.nlm.nih.gov/pubmed/36685642 http://dx.doi.org/10.1007/s10489-022-04378-3 |
work_keys_str_mv | AT luozhiwen crosscollectionlatentbetaliouvilleallocationmodeltrainingwithprivacyprotectionandapplications AT amayrimanar crosscollectionlatentbetaliouvilleallocationmodeltrainingwithprivacyprotectionandapplications AT fanwentao crosscollectionlatentbetaliouvilleallocationmodeltrainingwithprivacyprotectionandapplications AT bouguilanizar crosscollectionlatentbetaliouvilleallocationmodeltrainingwithprivacyprotectionandapplications |