Fig 1.
Automated and manual data annotation results.
A: Automated nuclei annotation captures only a fraction of the nuclei that manual annotation would, while ensuring high precision of the labelled nuclei. B: Manual nuclei annotation captures all of the nuclei within the microscopy image. In both cases, a training dataset of images and corresponding annotation masks is produced. Human mammary gland epithelial cells (primary cells) are used for demonstration purposes.
Fig 2.
Process of automated and manual nuclei annotation for segmentation with a neural network.
Automated data annotation comprises image pre-processing, binary thresholding, watershed segmentation, filtering and post-processing to provide a training dataset for nuclei segmentation with a neural network. Manual data annotation is applied to the raw image data (after pre-processing) in order to provide a training dataset of images and annotation masks.
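The automated pipeline might be sketched as follows; this is a minimal illustration using only NumPy and SciPy, where all function names, the mean-based threshold (in place of a method such as Otsu's), and the parameter values are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np
from scipy import ndimage as ndi

def annotate_nuclei(image, min_size=20):
    """Illustrative sketch: smoothing, thresholding, watershed
    splitting and size filtering to produce an annotation mask."""
    # Pre-processing: Gaussian smoothing to suppress noise
    smoothed = ndi.gaussian_filter(image.astype(float), sigma=2)
    # Binary thresholding (a simple mean-based threshold keeps this
    # sketch dependency-free; Otsu's method is a common alternative)
    binary = smoothed > smoothed.mean()
    # Distance transform; its local maxima seed the watershed
    distance = ndi.distance_transform_edt(binary)
    maxima = (distance == ndi.maximum_filter(distance, size=7)) & binary
    markers, _ = ndi.label(maxima)
    # Watershed segmentation on the inverted distance map
    elevation = (distance.max() - distance).astype(np.uint16)
    labels = ndi.watershed_ift(elevation, markers.astype(np.int16))
    labels[~binary] = 0
    # Post-processing: drop regions below the minimum size
    for lab, count in zip(*np.unique(labels, return_counts=True)):
        if lab != 0 and count < min_size:
            labels[labels == lab] = 0
    return labels
```

Each connected region in the returned label image corresponds to one automatically annotated nucleus; touching nuclei are split along watershed lines between distance-transform maxima.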
Fig 3.
Performance of automated and manual nuclei annotation.
A: Number of automatically and manually annotated nuclei for a dataset of five images (I1-I5). B: Regions of interest provided by automatic and manual nuclei annotation. C: Annotation duration for a dataset containing 6409 fluorescently labelled nuclei with automatic (~5 minutes) and manual (~15 hours) data annotation.
Table 1.
Performance of automatically and manually annotated nuclei.
Fig 4.
Nuclei segmentation performance of neural networks trained on automatically and manually annotated training datasets.
A: Average F1 score as a function of the intersection over union (IoU) threshold for a microscopy dataset segmented by neural networks trained on automatically and manually annotated nuclei. The average F1 score measures the proportion of correctly segmented objects, considering True Positives, False Positives and False Negatives. The IoU threshold determines how strictly the segmentation of the neural network (trained on automatically and manually annotated data) must match the ground truth; high thresholds indicate strict boundary matching. Average F1 scores remain nearly constant up to IoU = 0.90. At even higher thresholds, accuracy decreases sharply. B: Regions of interest of nuclei segmentation. Ground truth nuclei data are compared to the segmentation of a neural network trained on an automatically and a manually annotated dataset. Segmentation differences are indicated with green and red arrows. A neural network trained on automatically annotated nuclei segments small nuclei with higher precision (green arrows). A neural network trained on manually annotated nuclei segments touching nuclei with higher accuracy (red arrows).
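The F1-at-IoU metric described above can be illustrated in a few lines; this is a hedged sketch, not the evaluation code used in the study, and the greedy one-to-one matching strategy is an assumption.

```python
import numpy as np

def f1_at_iou(gt_masks, pred_masks, threshold):
    """Match predicted objects to ground-truth objects by IoU and
    report F1 = 2*TP / (2*TP + FP + FN) at the given threshold.
    gt_masks / pred_masks: lists of boolean arrays, one per object."""
    matched_gt, matched_pred = set(), set()
    for i, g in enumerate(gt_masks):
        for j, p in enumerate(pred_masks):
            if j in matched_pred:
                continue  # each prediction may match at most one object
            inter = np.logical_and(g, p).sum()
            union = np.logical_or(g, p).sum()
            if union and inter / union >= threshold:
                matched_gt.add(i)
                matched_pred.add(j)
                break
    tp = len(matched_gt)
    fp = len(pred_masks) - tp  # predictions without a matching object
    fn = len(gt_masks) - tp    # ground-truth objects left unmatched
    return 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0
```

Sweeping `threshold` from 0.5 towards 1.0 produces the kind of F1-vs-IoU curve shown in panel A: scores stay flat while boundary tolerance is loose, then drop once the threshold exceeds the boundary accuracy of the network.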
Table 2.
F1 scores for nuclei segmentation by neural networks trained on an automatically and a manually annotated dataset.
Fig 5.
Adaptability of an automatically and manually annotated dataset for nuclei segmentation.
Applying automated annotation to an individual dataset and training a neural network on these data provides accurate segmentation results. Nuclei segmentation performs poorly when relying on a training dataset acquired at lower magnification. Achieving accurate segmentation in that case would require a time-intensive manual annotation process.
Fig 6.
Effect of random noise added to automatically annotated training dataset.
Comparison of the learning curves of a neural network with and without random noise added to the training dataset. Because the filtered training images contain no background noise, adding random noise to the automatically annotated training dataset makes the validation and training loss curves converge.
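Such a noise augmentation step might look as follows; the Gaussian noise model, the `sigma` value and the normalised [0, 1] intensity range are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def add_noise(images, sigma=0.05):
    """Add zero-mean Gaussian noise to normalised [0, 1] images so that
    noise-free filtered training data better resemble raw microscopy
    images (illustrative augmentation, assumed noise model)."""
    noisy = images + rng.normal(0.0, sigma, images.shape)
    return np.clip(noisy, 0.0, 1.0)  # keep intensities in valid range
```

Applying this to the automatically annotated training images only (not the masks) exposes the network to background noise during training, which is what closes the gap between training and validation loss described above.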
Fig 7.
Nuclei tracking results based on an automatically annotated dataset.
A: Tracks of fluorescently labelled nuclei. The colour of each track indicates the speed of the individual nucleus/cell (blue: low speed; red: high speed). B: Mean moving direction of nuclei within 8 hours 20 minutes. Movement predominantly occurs in directions between 45°-75° and 315°-345° relative to each nucleus's starting position.
Fig 8.
Large-scale nuclei segmentation of a widefield microscopy image using a neural network trained on an automatically annotated dataset.
The image has a field of view of 4.2 mm x 3 mm and contains more than 60000 individual nuclei. Segmentation results are obtained within one hour, including (automatic) annotation and training of a neural network.