
Learning Domain-Independent Deep Representations by Mutual Information Minimization


Bibliographic Details
Main Authors: Wang, Ke, Liu, Jiayong, Wang, Jing-Yan
Format: Online Article Text
Language: English
Published: Hindawi 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6604496/
https://www.ncbi.nlm.nih.gov/pubmed/31316558
http://dx.doi.org/10.1155/2019/9414539
_version_ 1783431726592163840
author Wang, Ke
Liu, Jiayong
Wang, Jing-Yan
author_facet Wang, Ke
Liu, Jiayong
Wang, Jing-Yan
author_sort Wang, Ke
collection PubMed
description Domain transfer learning aims to learn common data representations from a source domain and a target domain so that the source domain data can help the classification of the target domain. Conventional transfer representation learning imposes similarity between the distributions of the source and target domain representations, which relies heavily on the characterization of the domain distributions and on the distribution-matching criteria. In this paper, we propose a novel framework for domain transfer representation learning. Our motivation is to make the learned representations of data points independent of the domains to which they belong. In other words, given an optimal cross-domain representation of a data point, it is difficult to tell which domain the point comes from. In this way, the learned representations can be generalized to different domains. To measure the dependency between the representations and the domains to which the data points belong, we propose to use the mutual information between the representations and the domain-membership indicators. By minimizing this mutual information, we learn representations that are independent of domains. We build a classwise deep convolutional network as the representation model and maximize the margin of each data point for its class, where the margin is defined over the intraclass and interclass neighborhoods. To learn the parameters of the model, we construct a unified minimization problem in which the margins are maximized while the representation-domain mutual information is minimized. In this way, we learn representations that are not only discriminative but also independent of domains. An iterative algorithm based on the Adam optimization method is proposed to solve the minimization and to learn the classwise deep model parameters and the cross-domain representations simultaneously.
Extensive experiments on benchmark datasets show the method's effectiveness and its advantage over existing domain transfer learning methods.
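The record gives only the abstract, not the authors' implementation. As an illustration of the central quantity being minimized, the following sketch computes the empirical mutual information I(Z; D) between a discretized representation Z and a domain indicator D; all variable names and the toy data are assumptions made for this example, not taken from the paper:

```python
import math
from collections import Counter

def empirical_mi(z, d):
    """Empirical mutual information I(Z; D) between a discretized
    representation z and a domain indicator d (equal-length sequences
    of hashable labels). A value near 0 means the representation
    reveals essentially nothing about the domain."""
    n = len(z)
    pz = Counter(z)          # marginal counts of representation values
    pd = Counter(d)          # marginal counts of domain labels
    pzd = Counter(zip(z, d)) # joint counts
    mi = 0.0
    for (zi, di), c in pzd.items():
        p_joint = c / n
        # p(z,d) * log( p(z,d) / (p(z) * p(d)) ), with counts folded in
        mi += p_joint * math.log(p_joint * n * n / (pz[zi] * pd[di]))
    return mi

# Toy domain indicator: 50 source points (0) followed by 50 target points (1).
d = [0] * 50 + [1] * 50

# Domain-dependent representation: z is fully determined by d, so
# I(Z; D) = H(D) = log 2 (in nats) -- the worst case the paper penalizes.
z_dependent = [0] * 50 + [1] * 50

# Domain-independent representation: z is distributed identically in both
# domains, so I(Z; D) = 0 -- the ideal the minimization drives toward.
z_independent = ([0, 1] * 25) * 2
```

In the paper's framework this term would be evaluated on the learned deep representations and added to the margin objective, so that gradient steps (via Adam) trade classification margin against domain dependence.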
format Online
Article
Text
id pubmed-6604496
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-6604496 2019-07-17 Learning Domain-Independent Deep Representations by Mutual Information Minimization Wang, Ke Liu, Jiayong Wang, Jing-Yan Comput Intell Neurosci Research Article Hindawi 2019-06-16 /pmc/articles/PMC6604496/ /pubmed/31316558 http://dx.doi.org/10.1155/2019/9414539 Text en Copyright © 2019 Ke Wang et al. http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Wang, Ke
Liu, Jiayong
Wang, Jing-Yan
Learning Domain-Independent Deep Representations by Mutual Information Minimization
title Learning Domain-Independent Deep Representations by Mutual Information Minimization
title_full Learning Domain-Independent Deep Representations by Mutual Information Minimization
title_fullStr Learning Domain-Independent Deep Representations by Mutual Information Minimization
title_full_unstemmed Learning Domain-Independent Deep Representations by Mutual Information Minimization
title_short Learning Domain-Independent Deep Representations by Mutual Information Minimization
title_sort learning domain-independent deep representations by mutual information minimization
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6604496/
https://www.ncbi.nlm.nih.gov/pubmed/31316558
http://dx.doi.org/10.1155/2019/9414539
work_keys_str_mv AT wangke learningdomainindependentdeeprepresentationsbymutualinformationminimization
AT liujiayong learningdomainindependentdeeprepresentationsbymutualinformationminimization
AT wangjingyan learningdomainindependentdeeprepresentationsbymutualinformationminimization