A Registration Method Based on Contour Point Cloud for 3D Whole-Body PET and CT Images

Bibliographic Details
Main Authors: Song, Zhiying; Jiang, Huiyan; Yang, Qiyao; Wang, Zhiguo; Zhang, Guoxu
Format: Online Article Text
Language: English
Published: Hindawi Publishing Corporation, 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5339628/
https://www.ncbi.nlm.nih.gov/pubmed/28316979
http://dx.doi.org/10.1155/2017/5380742
Description
Summary: The fusion of PET and CT images, which combines anatomical and functional information, is of important clinical value. An effective registration of PET and CT images is the basis of image fusion. This paper presents a multithreaded registration method based on contour point clouds for 3D whole-body PET and CT images. First, a geometric feature-based segmentation (GFS) method and a dynamic threshold denoising (DTD) method are proposed to preprocess the CT and PET images, respectively. Next, a new automated trunk slice extraction method is presented for extracting feature point clouds. Finally, a multithreaded Iterative Closest Point (ICP) algorithm is adopted to drive an affine transform. We compare our method with a multiresolution registration method based on Mattes Mutual Information on 13 pairs (246~286 slices per pair) of 3D whole-body PET and CT data. Experimental results demonstrate the effectiveness of our method, which achieves a lower negative normalized correlation (NC = −0.933) on feature images and a smaller Euclidean distance error (ED = 2.826) on landmark points, outperforming both the source data (NC = −0.496, ED = 25.847) and the compared method (NC = −0.614, ED = 16.085). Moreover, our method is about ten times faster than the compared one.
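
To illustrate the core idea of the final step described in the abstract (ICP driving an affine transform between contour point clouds), the following is a minimal sketch, not the authors' implementation: it assumes the CT and PET contour points have already been extracted as (N, 3) NumPy arrays, uses a single-threaded nearest-neighbour ICP loop with a least-squares affine update, and omits the paper's segmentation, denoising, trunk-slice extraction, and multithreading. The function name icp_affine and the synthetic demo data are hypothetical.

```python
# Sketch of ICP-style registration that estimates an affine transform mapping
# a source contour point cloud (e.g., PET) onto a target cloud (e.g., CT).
import numpy as np
from scipy.spatial import cKDTree


def icp_affine(source, target, n_iter=50, tol=1e-6):
    """Iteratively align `source` to `target` with an affine transform.

    source, target : (N, 3) and (M, 3) arrays of contour points.
    Returns (A, t) such that source @ A.T + t approximately matches target.
    """
    tree = cKDTree(target)          # nearest-neighbour search structure on the target cloud
    A = np.eye(3)
    t = np.zeros(3)
    moved = source.copy()
    prev_err = np.inf

    for _ in range(n_iter):
        # 1. Correspondences: closest target point for every (current) source point.
        dist, idx = tree.query(moved)
        matched = target[idx]

        # 2. Least-squares affine fit: [moved | 1] @ X ~ matched.
        X, *_ = np.linalg.lstsq(
            np.hstack([moved, np.ones((len(moved), 1))]), matched, rcond=None
        )
        A_step, t_step = X[:3].T, X[3]

        # 3. Apply the incremental transform and fold it into the accumulated one.
        moved = moved @ A_step.T + t_step
        A = A_step @ A
        t = A_step @ t + t_step

        # 4. Stop when the mean correspondence distance no longer improves.
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err

    return A, t


if __name__ == "__main__":
    # Synthetic stand-in data: a "PET" cloud generated from a "CT" cloud
    # by a known affine transform, which ICP should approximately invert.
    rng = np.random.default_rng(0)
    ct_contour = rng.normal(size=(500, 3))
    pet_contour = ct_contour @ np.diag([1.05, 0.95, 1.0]).T + np.array([2.0, -1.0, 0.5])
    A, t = icp_affine(pet_contour, ct_contour)
    print("recovered translation:", np.round(t, 2))
```

In practice the affine update would be computed only on the extracted trunk-slice feature points, and the correspondence search is the part that benefits most from the multithreading the paper describes.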