Fig 1.

Slices from three different viewpoints.

The pancreas, highlighted in red, is a small organ with a vague boundary and an irregular shape.

Fig 2.

Illustration of the segmentation results of two different input regions.

A smaller input region improves the segmentation performance.
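
The benefit of a smaller input region is usually realized by cropping the fine-stage input to the bounding box of the coarse prediction plus a margin. Below is a minimal NumPy sketch of that cropping step; the function name, the variables, and the 20-voxel margin are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def crop_to_coarse_mask(volume, coarse_mask, margin=20):
    """Crop `volume` to the bounding box of `coarse_mask` plus a margin.

    The fine stage then only sees the region around the coarse
    prediction, so the pancreas fills a larger fraction of the input.
    The margin value is an assumption.
    """
    coords = np.argwhere(coarse_mask > 0)            # voxel indices of coarse foreground
    lo = np.maximum(coords.min(axis=0) - margin, 0)  # clamp to the volume bounds
    hi = np.minimum(coords.max(axis=0) + margin + 1, volume.shape)
    slices = tuple(slice(int(l), int(h)) for l, h in zip(lo, hi))
    return volume[slices], slices                    # cropped region and its location
```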

Fig 3.

Illustration of the coarse-to-fine framework.

The framework segments the pancreas from the coronal, sagittal, and axial viewpoints and fuses the results into a new segmentation volume.
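
The caption does not state the fusion rule; one common way to combine per-axis predictions is a majority vote over the three binary volumes. A minimal sketch under that assumption (majority voting is illustrative, not necessarily the authors' exact fusion):

```python
import numpy as np

def fuse_three_views(mask_coronal, mask_sagittal, mask_axial):
    """Fuse binary segmentations predicted along three axes by majority vote.

    Each input is a 0/1 array already resampled into the same volume
    space; a voxel is foreground if at least two of the three views
    agree on foreground.
    """
    votes = mask_coronal.astype(np.uint8) + mask_sagittal + mask_axial
    return (votes >= 2).astype(np.uint8)
```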

Fig 4.

Illustration of the proposed multi-scale attention net.

The dense block is shown in Fig 5 and the proposed attention module is shown in Fig 6.

Fig 5.

Illustration of the dense block.

The dense block is composed of six convolution layers to obtain a large receptive field, and skip connections are used to avoid the risks of network degradation, vanishing gradients, and exploding gradients.
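
A dense block of this kind can be sketched in PyTorch: each convolution layer receives the concatenation of the block input and all earlier layers' outputs, which provides the skip connections and the growing receptive field the caption describes. The layer width (`growth`), kernel size, and 2D layout are illustrative assumptions, not the paper's exact configuration:

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Six 3x3 conv layers with dense skip connections (a sketch).

    Every layer sees the concatenation of all previous feature maps, so
    gradients have short paths back to the block input, which mitigates
    network degradation and vanishing/exploding gradients.
    """
    def __init__(self, in_channels, growth=32, num_layers=6):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels, growth, kernel_size=3, padding=1),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True),
            ))
            channels += growth  # the next layer sees all features so far

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)
```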

Fig 6.

Illustration of the proposed attention module.

It is composed of a spatial attention module and a channel attention module, and it makes the network focus on the most relevant regions and channels.
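
A compact PyTorch sketch of the two branches: channel attention re-weights feature channels from globally pooled statistics, and spatial attention re-weights locations from channel-pooled statistics. This follows the generic spatial/channel attention pattern; the authors' exact module design may differ:

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Re-weight channels using a globally average-pooled descriptor."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # (B, C, 1, 1)
            nn.Conv2d(channels, channels // reduction, 1),  # squeeze
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),  # excite
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class SpatialAttention(nn.Module):
    """Re-weight spatial locations using channel-pooled statistics."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)   # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)
        gate = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * gate
```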

Table 1.

Evaluation of the coarse segmentation along the three axes and of the fused segmentation.

Table 2.

Comparison between our approach and state-of-the-art approaches on the NIH pancreas dataset.

Fig 7.

An example of the segmentation results.

Compared with the coarse segmentation, the fine segmentation result improves from 70.45% to 84.31%. However, the boundary of the pancreas is still not predicted very well.
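
Scores like 70.45% and 84.31% are most likely Dice-Sørensen coefficients (DSC), the overlap metric reported in the paper's tables. A minimal sketch of computing DSC from two binary masks:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-8):
    """Dice-Sørensen coefficient: DSC = 2|P ∩ T| / (|P| + |T|).

    `pred` and `target` are binary arrays of the same shape; the result
    lies in [0, 1] (multiply by 100 for percentages as quoted above).
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```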

Table 3.

Ablation study of our proposed 2.5D U-net and attention module.

AG: The attention gate proposed in [24]. DA: The dual attention module proposed in [22]. HA: Our proposed attention module. Params: The number of parameters.

Table 4.

Analysis of the DSC distribution across all slices.
