
Figure 1.

Illustration of data set.

(a) Example of registered nodes. (b) Distances between coordinate pairs excluding symmetries. Numbers 1 to 48 correspond to landmarks; red: pairwise edges, excluding symmetries; black: Delaunay triangulation. Example of symmetric distances: (25, 24) and (23, 24).
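The edge sets in Figure 1(b) can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the landmark coordinates are a random placeholder, and only the two constructions named in the caption (unique pairwise distances and a Delaunay triangulation) are shown.

```python
# Sketch: the two edge sets of Figure 1(b) from a hypothetical (48, 2)
# array of registered landmark coordinates.
from itertools import combinations

import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
landmarks = rng.random((48, 2))  # placeholder for the registered nodes

# All pairwise edges excluding symmetric duplicates: keep (i, j) with i < j,
# so only one of the symmetric pair (25, 24) / (24, 25) is retained.
pairs = list(combinations(range(len(landmarks)), 2))
distances = {
    (i, j): np.linalg.norm(landmarks[i] - landmarks[j]) for i, j in pairs
}

# Delaunay triangulation over the same landmarks (black edges in the figure).
tri = Delaunay(landmarks)

print(len(pairs))  # 48 * 47 / 2 = 1128 unique distances
```

With 48 landmarks this yields 1128 unique distances; `tri.simplices` holds one row of three landmark indices per triangle.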


Table 1.

Description of data set with number of patients per class.


Figure 2.

Importance weighting.

Illustration of the procedure to compute importance for point δ. Contributions of point p1, area of triangle t1, distance d1, and angle a1 (blue) are weighted according to their distance to δ (red). Distances to p1, centroid c1, midpoint m1, and vertex v1 are used for p1, t1, d1, and a1, respectively.
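The weighting procedure of Figure 2 can be sketched as below. The choice of reference location per feature type follows the caption (the point itself, a triangle centroid, a segment midpoint, an angle vertex), but the inverse-distance kernel is an illustrative assumption; the paper's exact weighting function is described in its text.

```python
# Sketch of distance-based importance weighting (Figure 2).
# The inverse-distance kernel is an assumption, not the paper's definition.
import numpy as np

def importance(delta, features, eps=1e-9):
    """Sum absolute coefficients of features, weighted by distance to delta.

    features: list of (reference_point, abs_coefficient) pairs. The reference
    point is the landmark itself, a triangle centroid, a distance midpoint,
    or an angle vertex, depending on the feature type.
    """
    total = 0.0
    for ref, coef in features:
        d = np.linalg.norm(np.asarray(ref) - np.asarray(delta))
        total += coef / (d + eps)  # closer features contribute more
    return total

# Toy check: a point near the features gets a larger importance score.
feats = [((0.0, 0.0), 1.0), ((1.0, 0.0), 0.5)]
print(importance((0.0, 0.0), feats) > importance((2.0, 0.0), feats))  # True
```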


Figure 3.

Average misclassification error for glmnet.

Average misclassification error with 95% confidence intervals across leave-one-out cross-validation for models with different values of the mixing parameter α. In (a), all features (red) or only points (blue) were used; in (b), all features and their squares (red) or only points and their squares (blue) were used.
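The error estimates in Figure 3 rest on leave-one-out cross-validation, which can be sketched minimally as below; the 1-nearest-neighbour classifier here is a stand-in for illustration, not the paper's glmnet model.

```python
# Minimal leave-one-out cross-validation sketch (stand-in classifier).
import numpy as np

def loo_error(X, y, classify):
    """Average misclassification error over leave-one-out splits."""
    errors = []
    for i in range(len(X)):
        mask = np.arange(len(X)) != i  # hold out sample i
        pred = classify(X[mask], y[mask], X[i])
        errors.append(pred != y[i])
    return float(np.mean(errors))

def nearest_neighbour(X_train, y_train, x):
    d = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(d)]

X = np.array([[0.0], [0.1], [1.0], [1.1]])
y = np.array([0, 0, 1, 1])
print(loo_error(X, y, nearest_neighbour))  # 0.0 on this separable toy set
```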


Table 2.

Average misclassification error (AME) with 95% confidence interval for leave-one-out cross-validation for glmnet, 20 different values of α (see text), and PCA, using only points (p), all features (a), only points and their squares (p + p²), and all features and their squares (a + a²).
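For reference, glmnet's mixing parameter α interpolates between the ridge penalty (α = 0) and the lasso (α = 1); a minimal sketch of that elastic-net penalty, assuming glmnet's standard parameterisation:

```python
# Elastic-net penalty interpolated by glmnet's mixing parameter alpha:
# lambda * (alpha * ||beta||_1 + (1 - alpha)/2 * ||beta||_2^2)
import numpy as np

def elastic_net_penalty(beta, lam, alpha):
    beta = np.asarray(beta, dtype=float)
    l1 = np.abs(beta).sum()            # lasso term, promotes sparsity
    l2 = 0.5 * (beta ** 2).sum()       # ridge term, shrinks coefficients
    return lam * (alpha * l1 + (1.0 - alpha) * l2)

beta = [1.0, -2.0, 0.0]
print(elastic_net_penalty(beta, lam=0.1, alpha=1.0))  # pure lasso: 0.1 * 3
print(elastic_net_penalty(beta, lam=0.1, alpha=0.0))  # pure ridge: 0.1 * 2.5
```

Intermediate α (such as the α = .11 selected below) mixes both terms, retaining some sparsity while stabilising correlated features.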


Figure 4.

Average misclassification error for values of tuning parameter λ when α = .11.


Table 3.

Simultaneous average misclassification error (AME) per syndrome.


Table 4.

Confusion matrix for the best glmnet model, α = .11, using all features.


Table 5.

Number of nonzero coefficients for each syndrome for the best glmnet model (α = .11, using all features).


Table 6.

Pairwise average misclassification error rate for the best glmnet model.


Figure 5.

Importance plots for glmnet.

Visualization of simultaneous classification for syndromes. For each syndrome, an importance plot (row I) and a plot visualizing classification features (row F) are provided. The importance plot assigns to each point an importance with respect to classification, as described in the text. The feature plots visualize absolute regression coefficients by the thickness of line segments (distances), the size of points (coordinates), the color of areas (areas; dark red more important than light red), and small triangles (angles; dark red more important than light red).


Figure 6.

Importance plots for PCA.

Visualizations analogous to Figure 5 for PCA-based classification.
