Please use this identifier to cite or link to this item: http://repositorio.unicamp.br/jspui/handle/REPOSIP/195948
Type: Journal article
Title: Interactive Volume Segmentation With Differential Image Foresting Transforms.
Author: Falcão, Alexandre X
Bergo, Felipe P G
Abstract: The absence of object information often requires considerable human assistance in medical image segmentation. Many interactive two-dimensional and three-dimensional (3-D) segmentation methods have been proposed, but their response times to user actions must be reduced considerably to make them practical. We address this problem within the framework of the image foresting transform (IFT), a general tool for designing image operators based on connectivity, by introducing a new algorithm (DIFT) that computes sequences of IFTs in a differential way. We instantiate the DIFT algorithm for watershed-based and fuzzy-connected segmentation under two paradigms (single-object and multiple-object) and evaluate the efficiency gains of both approaches relative to their linear-time implementations based on the nondifferential IFT. We show that the DIFT algorithm provides speedup factors of 10 to 17, reducing the user's waiting time for segmentation with 3-D visualization on a common PC from 19-36 s to 2-3 s. We also show that the multiple-object approach is more efficient than the single-object paradigm for both segmentation methods.
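The abstract's watershed instantiation of the IFT can be illustrated with a minimal sketch. This is not the authors' code: it is a generic Dijkstra-style optimum-path forest over a hypothetical adjacency-list graph, using the max-arc-weight path cost commonly associated with IFT watershed, and it omits the differential (DIFT) update step entirely.

```python
import heapq

def ift_watershed(graph, seeds):
    """Sketch of an IFT with the max-arc path cost (watershed flavor).

    graph: dict mapping node -> list of (neighbor, arc_weight) pairs
           (hypothetical stand-in for pixel adjacency with gradient weights).
    seeds: dict mapping seed node -> object label.
    Returns a dict mapping every reachable node -> propagated label.
    """
    INF = float("inf")
    cost = {v: INF for v in graph}
    label = {v: None for v in graph}
    heap = []
    for s, lab in seeds.items():
        cost[s] = 0                      # seeds start with zero path cost
        label[s] = lab
        heapq.heappush(heap, (0, s))
    while heap:
        c, u = heapq.heappop(heap)
        if c > cost[u]:                  # stale queue entry, skip
            continue
        for v, w in graph[u]:
            new_cost = max(c, w)         # path cost = maximum arc weight seen
            if new_cost < cost[v]:       # found a cheaper path: reassign tree
                cost[v] = new_cost
                label[v] = label[u]
                heapq.heappush(heap, (new_cost, v))
    return label

# Tiny 1-D "image": a path graph with a strong edge (weight 5) between
# nodes 1 and 2, so seeds 'A' and 'B' compete and split there.
graph = {0: [(1, 1)], 1: [(0, 1), (2, 5)], 2: [(1, 5), (3, 1)],
         3: [(2, 1), (4, 1)], 4: [(3, 1)]}
print(ift_watershed(graph, {0: "A", 4: "B"}))
```

In the paper's setting each user interaction changes the seed set and the forest is recomputed; the DIFT contribution is to update only the affected subtrees instead of rerunning this whole propagation, which is where the reported 10 to 17 times speedups come from.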
Subject: Algorithms
Brain
Computer Graphics
Humans
Image Interpretation, Computer-assisted
Imaging, Three-dimensional
Magnetic Resonance Imaging
Online Systems
Pattern Recognition, Automated
Reproducibility Of Results
Sensitivity And Specificity
Software
User-computer Interface
Rights: closed access
Identifier DOI: 10.1109/TMI.2004.829335
Address: http://www.ncbi.nlm.nih.gov/pubmed/15377119
Issue Date: 2004
Appears in Collections: Unicamp - Artigos e Outros Documentos

Files in This Item:
File | Size | Format
pmed_15377119.pdf | 1.16 MB | Adobe PDF

