
Independent component analysis: recent advances


Bibliographic Details
Main Author: Hyvärinen, Aapo
Format: Online Article Text
Language: English
Published: The Royal Society Publishing 2013
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3538438/
https://www.ncbi.nlm.nih.gov/pubmed/23277597
http://dx.doi.org/10.1098/rsta.2011.0534
Description
Summary: Independent component analysis is a probabilistic method for learning a linear transform of a random vector. The goal is to find components that are maximally independent and non-Gaussian (non-normal). Its fundamental difference from classical multivariate statistical methods is the assumption of non-Gaussianity, which enables the identification of the original, underlying components, in contrast to classical methods. The basic theory of independent component analysis was mainly developed in the 1990s and summarized, for example, in our monograph in 2001. Here, we provide an overview of some recent developments in the theory since the year 2000. The main topics are: analysis of causal relations, testing independent components, analysing multiple datasets (three-way data), modelling dependencies between the components, and improved methods for estimating the basic model.
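The basic ICA model described in the summary, where observed data are a linear mixture of independent non-Gaussian sources, can be illustrated with a short sketch. This is not from the article itself; it assumes scikit-learn's FastICA estimator and synthetic uniform and Laplace sources chosen for illustration:

```python
# Minimal ICA sketch: mix two independent non-Gaussian sources
# with an (assumed, arbitrary) mixing matrix, then recover them.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 5000
# Independent, non-Gaussian sources: one uniform, one Laplace.
s = np.column_stack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.5], [0.3, 1.0]])   # hypothetical mixing matrix
x = s @ A.T                              # observed linear mixtures

ica = FastICA(n_components=2, random_state=0)
s_hat = ica.fit_transform(x)             # estimated components

# Each estimated component should correlate strongly with one true
# source, up to permutation and sign (which ICA cannot identify).
corr = np.abs(np.corrcoef(s.T, s_hat.T)[:2, 2:])
print(corr.max(axis=1))
```

Because the sources are non-Gaussian, the mixing is identifiable up to permutation and sign, which is exactly the fundamental difference from Gaussian-based methods such as PCA noted in the summary.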