Fig 1.

Imaging modalities in cardiovascular interventions.

A: The heart appears transparent in real-time X-ray fluoroscopic images. B: Pre-operative CT images provide high contrast for understanding cardiac anatomy, with (C) the corresponding 3D reconstruction.

Fig 2.

Overall procedure of the proposed AR-assisted guidance system.

Fig 3.

Segmentation of anatomical structures.

A: A CT slice shows the segmented region after thresholding. B: 3D reconstruction of the heart and spine by selecting the largest connected object and filling inner holes. C: The 3D models are cleaned by deleting peripheral blood vessels and bones. D: The 3D models are further modified and fabricated using a 3D printer. E & F: The 3D-printed model is tested under the X-ray fluoroscopy machine.
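
The thresholding, largest-component, and hole-filling steps can be sketched in a few lines with `scipy.ndimage` (a minimal illustration, not the authors' pipeline; the `segment_bone` name and the HU threshold of 300 are assumptions):

```python
import numpy as np
from scipy import ndimage

def segment_bone(ct_slice, hu_threshold=300):
    """Threshold a CT slice, keep the largest connected component, fill holes."""
    mask = ct_slice >= hu_threshold                 # A: thresholding
    labels, n = ndimage.label(mask)                 # connected components
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    largest = labels == (np.argmax(sizes) + 1)      # B: largest object
    return ndimage.binary_fill_holes(largest)       # B: fill inner holes
```

In practice this runs per slice (or in 3D) before the manual cleanup of peripheral vessels and bones described in panel C.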

Fig 4.

The Fourier-based registration method.

A: A projectional view of the 3D model reconstructed from CT images. B: The original fluoroscopic image is taken at RAO30. C: A spine-only image is created from the 3D model. D: The spine image is detected from the fluoroscopic image. E & F: The log-polar-transformed Fourier images corresponding to C and D; rotation and scale factors are converted to translations along the X and Y axes. G: The phase-correlation plot shows that the maximum is located at the position corresponding to the rotation and scale shifts. H: The overlaid image shows the final registration result.
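
The phase-correlation step behind panels E–G can be sketched with NumPy (a minimal illustration, not the authors' implementation): the inverse FFT of the normalized cross-power spectrum of two images peaks at their relative translation, and applying the same operation to log-polar-resampled Fourier magnitudes turns rotation and scale into recoverable translations.

```python
import numpy as np

def phase_correlation(a, b):
    """Return the circular (row, col) shift that maps image b back onto a."""
    cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    cross /= np.abs(cross) + 1e-12             # keep phase only
    corr = np.fft.ifft2(cross).real            # peaks at the relative shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # convert wrapped peak coordinates into signed shifts
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))
```

Run once on log-polar magnitude spectra to recover rotation and scale, then again on the de-rotated, de-scaled images to recover translation.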

Fig 5.

Detection of the catheter in the fluoroscopic image.

A: Original image. B: The ROI is first extracted at the bottom of the image. C: The gradient of the ROI reveals the edges of the catheter. D: The derivative of pixel intensity along the X-axis shows that the two edge points are located at the two global peaks. E: An updated ROI is created on top of the previous ROI along the center line of the catheter. F: The final detection results are overlaid on the original image.
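
The edge-finding step in panel D can be sketched as follows (a minimal illustration with a hypothetical `catheter_edges` helper; it assumes a dark catheter on a brighter background within one ROI row):

```python
import numpy as np

def catheter_edges(row):
    """Locate the two catheter edges in one ROI row from intensity derivatives.

    For a dark catheter on a brighter background, the left edge is the
    steepest intensity drop and the right edge the steepest rise; the
    catheter center line passes through their midpoint.
    """
    d = np.diff(row.astype(float))   # derivative along the X-axis
    left = int(np.argmin(d))         # global minimum: falling edge
    right = int(np.argmax(d))        # global maximum: rising edge
    return left, right, (left + right) / 2.0
```

Repeating this per row and re-centering the ROI on the midpoints traces the catheter upward, as in panel E.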

Fig 6.

Localization of the catheter in 3D space.

A, B & C: The catheter is detected in fluoroscopic images captured at LAO30, AP, and RAO30 angles. The centerline of the catheter is displayed in red. D & E: The 3D locations of the catheter are determined using three pairs of images (i.e., AP+LAO, AP+RAO, and LAO+RAO).
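
Recovering a 3D point from a pair of views amounts to a small least-squares triangulation. The sketch below uses a simplified orthographic C-arm model rotating about the patient's long axis (the angle convention and the `rotation_y`/`triangulate` names are assumptions for illustration, not the authors' geometry):

```python
import numpy as np

def rotation_y(deg):
    """Rotation of the view about the patient's head-foot axis."""
    t = np.radians(deg)
    return np.array([[np.cos(t), 0.0, np.sin(t)],
                     [0.0,       1.0, 0.0],
                     [-np.sin(t), 0.0, np.cos(t)]])

def triangulate(angles_deg, points_2d):
    """Least-squares 3D point from 2D detections in several orthographic views."""
    rows, rhs = [], []
    for ang, (u, v) in zip(angles_deg, points_2d):
        rows.append(rotation_y(ang)[:2])   # orthographic: drop the depth row
        rhs.extend([u, v])
    x, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
    return x
```

With detections at AP (0°), LAO30, and RAO30, any two views determine each catheter point, and the redundant third view supports the consistency evaluation reported later.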

Fig 7.

The enhanced visualization on the augmented reality device.

A: The 3D renderings of the heart and spine are displayed as holograms on the HoloLens. B: The preprocedural plan is shown as red lines crossing the ideal transseptal puncture site (shown as target rings). C: The catheter is rendered in 3D space, and its position is determined by processing the fluoroscopic images. D–F: A virtual camera is attached to the endpoint of the catheter to provide a first-person view as the catheter is inserted through the inferior vena cava (D), enters the right atrium (E), and approaches the transseptal puncture target (F).

Fig 8.

Performance of image registration.

A: The misalignment error is defined as the distance between the centroids of the paired vertebrae detected from the fluoroscopic image (gray) and the projectional CT image (yellow). B: The misalignment errors are consistently below 1 mm for the five experimental groups when the fluoroscopic images are captured at different angles.
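
The centroid-distance metric in panel A can be computed directly from two binary vertebra masks (a minimal sketch; the `centroid_misalignment` name and `mm_per_pixel` scale are hypothetical):

```python
import numpy as np

def centroid_misalignment(mask_fluoro, mask_ct, mm_per_pixel=1.0):
    """Distance between the centroids of two binary vertebra masks, in mm."""
    c1 = np.array(np.nonzero(mask_fluoro)).mean(axis=1)  # (row, col) centroid
    c2 = np.array(np.nonzero(mask_ct)).mean(axis=1)
    return float(np.linalg.norm(c1 - c2) * mm_per_pixel)
```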

Table 1.

Comparison of registration performance.

Fig 9.

Evaluation results for the determined 3D position of the catheter.

A: The mean standard errors for 1000 points on the catheter are all below 2 pixels (i.e., 0.5 mm). B: A histogram indicates that the majority of mean standard errors are below 0.5 pixels.

Table 2.

Mean standard errors of the 3D position of the catheter.
