Deciphering spatial domains from spatially resolved transcriptomics with an adaptive graph attention auto-encoder
| Main Authors: | Dong, Kangning; Zhang, Shihua |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group UK, 2022 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8976049/ https://www.ncbi.nlm.nih.gov/pubmed/35365632 http://dx.doi.org/10.1038/s41467-022-29439-6 |
| Summary: | Recent advances in spatially resolved transcriptomics have enabled comprehensive measurement of gene expression patterns while retaining the spatial context of the tissue microenvironment. Deciphering the spatial context of spots within a tissue requires careful use of their spatial information. To this end, we develop STAGATE, a graph attention auto-encoder framework that accurately identifies spatial domains by learning low-dimensional latent embeddings that integrate spatial information and gene expression profiles. To better characterize spatial similarity at the boundaries of spatial domains, STAGATE adopts an attention mechanism to adaptively learn the similarity of neighboring spots, together with an optional cell type-aware module that integrates a pre-clustering of gene expression. We validate STAGATE on diverse spatial transcriptomics datasets generated by different platforms at different spatial resolutions. STAGATE substantially improves the identification accuracy of spatial domains and denoises the data while preserving spatial expression patterns. Importantly, STAGATE can be extended to multiple consecutive sections to reduce batch effects between sections and to effectively extract three-dimensional (3D) expression domains from the reconstructed 3D tissue. |
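
The summary above describes the core architectural idea: spots are linked into a spatial neighbor graph, and an attention mechanism adaptively weights how much each spot's low-dimensional embedding borrows from its neighbors before a decoder reconstructs (denoises) expression. The sketch below illustrates that idea only; it is not the authors' STAGATE implementation. The toy data, the k-nearest-neighbor graph construction, the single untrained attention layer, and the tied-weight decoder are all simplifying assumptions.

```python
# Minimal sketch of a graph attention auto-encoder over a spatial neighbor
# graph (illustrative only; not the STAGATE code). Training is omitted.
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs: 50 spots, 100 genes, 2D spatial coordinates (all assumed).
n_spots, n_genes, n_latent = 50, 100, 8
expr = rng.poisson(2.0, size=(n_spots, n_genes)).astype(float)
coords = rng.uniform(0, 10, size=(n_spots, 2))

# Spatial neighbor graph: connect each spot to its k nearest neighbors.
k = 6
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
neighbors = np.argsort(d, axis=1)[:, :k]

# Encoder parameters (randomly initialized; a trained model would learn these).
W_enc = rng.normal(scale=0.1, size=(n_genes, n_latent))  # shared projection
a_src = rng.normal(scale=0.1, size=n_latent)             # attention, source
a_dst = rng.normal(scale=0.1, size=n_latent)             # attention, neighbor

h = expr @ W_enc  # project expression into the latent space

# One graph attention layer: e_ij = LeakyReLU(a_src.h_i + a_dst.h_j),
# softmax-normalized over each spot's neighborhood (self-loop included),
# so similarity to neighbors is learned adaptively rather than fixed.
z = np.empty_like(h)
for i in range(n_spots):
    js = np.append(neighbors[i], i)              # neighbors plus self-loop
    e = h[i] @ a_src + h[js] @ a_dst             # unnormalized attention
    e = np.where(e > 0, e, 0.2 * e)              # LeakyReLU
    att = np.exp(e - e.max())
    att /= att.sum()                             # softmax over neighborhood
    z[i] = att @ h[js]                           # attention-weighted mixing

# Decoder: reconstruct (denoise) expression from the latent embedding.
W_dec = W_enc.T                                  # tied weights, an assumption
recon = z @ W_dec

print("latent embeddings:", z.shape, "reconstruction:", recon.shape)
```

In this simplified view, `z` plays the role of the low-dimensional latent embedding used for clustering spots into spatial domains, and `recon` corresponds to the denoised expression; the optional cell type-aware module mentioned in the summary (which restricts or reweights edges using a pre-clustering of gene expression) is not modeled here.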