Multisource Deep Transfer Learning Based on Balanced Distribution Adaptation
Main authors:
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9038385/
https://www.ncbi.nlm.nih.gov/pubmed/35479600
http://dx.doi.org/10.1155/2022/6915216
Summary: Traditional unsupervised transfer learning assumes that samples are collected from a single domain. In practical applications, samples from a single source domain are often insufficient; in most cases, labeled data are collected from multiple domains. In recent years, multisource unsupervised transfer learning with deep learning has focused on aligning the domains in a common feature space and then minimizing the distribution difference between the source and target domains, whether the marginal distribution, the conditional distribution, or both. Moreover, the conditional and marginal distributions are often treated as equally important, which can lead to poor performance in practical applications. Existing algorithms that consider a balanced distribution are typically based on a single source domain. To address these problems, we propose a multisource transfer learning algorithm based on balanced distribution adaptation (MTLBDA). The algorithm adjusts the weights of the two distributions to solve the distribution adaptation problem in multisource transfer learning. Extensive experiments show that MTLBDA achieves significant results on popular image classification datasets such as Office-31.
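The summary's central idea is to weight the marginal and class-conditional distribution discrepancies rather than treating them equally. The sketch below illustrates that weighting with a linear-kernel MMD-style discrepancy; it is a minimal illustration only, and the function names, the `mu` parameter, the pseudo-label handling, and the 31-class default are assumptions for illustration, not the paper's MTLBDA implementation.

```python
import torch

def mmd_linear(x, y):
    # Linear-kernel MMD between two batches of d-dimensional features:
    # squared distance between the batch means.
    delta = x.mean(dim=0) - y.mean(dim=0)
    return delta.dot(delta)

def balanced_adaptation_loss(src_feat, src_labels, tgt_feat, tgt_pseudo,
                             mu=0.5, num_classes=31):
    # mu weights the marginal term; (1 - mu) weights the class-conditional
    # term computed with target pseudo-labels (e.g., current classifier output).
    marginal = mmd_linear(src_feat, tgt_feat)
    conditional, counted = torch.zeros(()), 0
    for c in range(num_classes):
        s_mask, t_mask = src_labels == c, tgt_pseudo == c
        if s_mask.any() and t_mask.any():
            conditional = conditional + mmd_linear(src_feat[s_mask], tgt_feat[t_mask])
            counted += 1
    if counted:
        conditional = conditional / counted
    return mu * marginal + (1.0 - mu) * conditional

# Toy usage with random features (31 classes assumed, as in Office-31):
src_f, tgt_f = torch.randn(64, 256), torch.randn(64, 256)
src_y = torch.randint(0, 31, (64,))
tgt_y_hat = torch.randint(0, 31, (64,))
loss = balanced_adaptation_loss(src_f, src_y, tgt_f, tgt_y_hat, mu=0.5)
```

In a multisource setting, one plausible extension of this sketch is to average such a loss over each (source domain, target) pair and add it to the classification loss on the labeled source data; how MTLBDA combines the domains in detail is described in the paper itself.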