
Soft Tensor Regression


Bibliographic Details
Main Authors: Papadogeorgou, Georgia; Zhang, Zhengwu; Dunson, David B.
Format: Online Article Text
Language: English
Published: 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9222480/
https://www.ncbi.nlm.nih.gov/pubmed/35754924
author Papadogeorgou, Georgia
Zhang, Zhengwu
Dunson, David B.
collection PubMed
description Statistical methods relating tensor predictors to scalar outcomes in a regression model generally vectorize the tensor predictor and estimate the coefficients of its entries employing some form of regularization, use summaries of the tensor covariate, or use a low dimensional approximation of the coefficient tensor. However, low rank approximations of the coefficient tensor can suffer if the true rank is not small. We propose a tensor regression framework which assumes a soft version of the parallel factors (PARAFAC) approximation. In contrast to classic PARAFAC where each entry of the coefficient tensor is the sum of products of row-specific contributions across the tensor modes, the soft tensor regression (Softer) framework allows the row-specific contributions to vary around an overall mean. We follow a Bayesian approach to inference, and show that softening the PARAFAC increases model flexibility, leads to improved estimation of coefficient tensors, more accurate identification of important predictor entries, and more precise predictions, even for a low approximation rank. From a theoretical perspective, we show that employing Softer leads to a weakly consistent posterior distribution of the coefficient tensor, irrespective of the true or approximation tensor rank, a result that is not true when employing the classic PARAFAC for tensor regression. In the context of our motivating application, we adapt Softer to symmetric and semi-symmetric tensor predictors and analyze the relationship between brain network characteristics and human traits.
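To make the structural contrast described in the abstract concrete, the following is a minimal numerical sketch (not the authors' implementation) for a two-mode (matrix) coefficient tensor: first the classic PARAFAC construction, where entry (j, k) is a sum over rank components of products of row-specific contributions, then a "softened" construction in which each contribution is drawn around that shared mean before the products are formed. The variable names, the Gaussian form of the perturbation, and the scale sigma are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    p1, p2, R = 10, 12, 3   # mode dimensions and approximation rank
    sigma = 0.1             # illustrative scale of the softening around the mean

    # Row-specific mean contributions: beta1[j, r] for mode 1, beta2[k, r] for mode 2.
    beta1 = rng.normal(size=(p1, R))
    beta2 = rng.normal(size=(p2, R))

    # Classic PARAFAC: B[j, k] = sum_r beta1[j, r] * beta2[k, r].
    B_parafac = np.einsum("jr,kr->jk", beta1, beta2)

    # Softened construction: the mode-1 contribution to entry (j, k) is allowed to
    # vary around its mean, gamma1[j, k, r] ~ Normal(beta1[j, r], sigma^2), and
    # similarly for mode 2, before the rank components are multiplied and summed.
    gamma1 = beta1[:, None, :] + sigma * rng.normal(size=(p1, p2, R))
    gamma2 = beta2[None, :, :] + sigma * rng.normal(size=(p1, p2, R))
    B_soft = np.sum(gamma1 * gamma2, axis=-1)

    # As sigma -> 0 the soft construction collapses back to the classic PARAFAC tensor.
    print(np.abs(B_soft - B_parafac).max())

In the paper's Bayesian formulation the deviations from the PARAFAC mean are presumably governed by a prior rather than a fixed sigma, and the resulting coefficient tensor enters a regression of the scalar outcome on the tensor predictor; the sketch only shows how softening relaxes the hard rank-R factorization.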
format Online
Article
Text
id pubmed-9222480
institution National Center for Biotechnology Information
language English
publishDate 2021
record_format MEDLINE/PubMed
spelling pubmed-9222480 2022-06-23 Soft Tensor Regression Papadogeorgou, Georgia; Zhang, Zhengwu; Dunson, David B. J Mach Learn Res Article Statistical methods relating tensor predictors to scalar outcomes in a regression model generally vectorize the tensor predictor and estimate the coefficients of its entries employing some form of regularization, use summaries of the tensor covariate, or use a low dimensional approximation of the coefficient tensor. However, low rank approximations of the coefficient tensor can suffer if the true rank is not small. We propose a tensor regression framework which assumes a soft version of the parallel factors (PARAFAC) approximation. In contrast to classic PARAFAC where each entry of the coefficient tensor is the sum of products of row-specific contributions across the tensor modes, the soft tensor regression (Softer) framework allows the row-specific contributions to vary around an overall mean. We follow a Bayesian approach to inference, and show that softening the PARAFAC increases model flexibility, leads to improved estimation of coefficient tensors, more accurate identification of important predictor entries, and more precise predictions, even for a low approximation rank. From a theoretical perspective, we show that employing Softer leads to a weakly consistent posterior distribution of the coefficient tensor, irrespective of the true or approximation tensor rank, a result that is not true when employing the classic PARAFAC for tensor regression. In the context of our motivating application, we adapt Softer to symmetric and semi-symmetric tensor predictors and analyze the relationship between brain network characteristics and human traits. 2021 /pmc/articles/PMC9222480/ /pubmed/35754924 Text en License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided at http://jmlr.org/papers/v22/20-476.html.
title Soft Tensor Regression
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9222480/
https://www.ncbi.nlm.nih.gov/pubmed/35754924