
Fig 1. Self-attention mechanism.

Fig 2. Multi-head self-attention mechanism.
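
Figs 1 and 2 illustrate scaled dot-product self-attention and its multi-head extension. For reference, the sketch below shows that computation in PyTorch under the standard transformer formulation; the class name, the fused qkv projection, and the sizes in the usage line are illustrative assumptions, not the paper's implementation.

# Minimal sketch of multi-head scaled dot-product self-attention.
# Layout and names are assumptions for illustration only.
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, dim, num_heads):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3)   # fused query/key/value projection
        self.proj = nn.Linear(dim, dim)      # output projection

    def forward(self, x):
        # x: (batch, tokens, dim)
        b, n, d = x.shape
        qkv = self.qkv(x).reshape(b, n, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)           # each: (b, heads, n, head_dim)
        attn = (q @ k.transpose(-2, -1)) * self.scale  # similarity scores (b, heads, n, n)
        attn = attn.softmax(dim=-1)                    # attention weights per head
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)  # weighted values, heads merged
        return self.proj(out)

# Example usage with hypothetical sizes: 8 tokens of dimension 64, 4 heads.
x = torch.randn(2, 8, 64)
print(MultiHeadSelfAttention(64, 4)(x).shape)  # torch.Size([2, 8, 64])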

Fig 3. Overall model architecture.

Fig 4. The process of regular window partition and reverse.

Fig 5. The process of shifted window partition and reverse.
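
Figs 4 and 5 illustrate the regular and shifted window partition and reverse operations. The sketch below shows one common way to realize them, assuming a Swin-Transformer-style layout with (batch, height, width, channels) feature maps and a cyclic torch.roll for the shift; the function names window_partition/window_reverse and all sizes are illustrative assumptions, not the paper's code.

# Minimal sketch of regular and shifted window partitioning with their inverses.
import torch

def window_partition(x, ws):
    # x: (batch, height, width, channels) -> (num_windows * batch, ws, ws, channels)
    b, h, w, c = x.shape
    x = x.reshape(b, h // ws, ws, w // ws, ws, c)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws, ws, c)

def window_reverse(windows, ws, h, w):
    # Inverse of window_partition: stitch windows back into the feature map.
    b = windows.shape[0] // ((h // ws) * (w // ws))
    x = windows.reshape(b, h // ws, w // ws, ws, ws, -1)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(b, h, w, -1)

# Regular partition and reverse (Fig 4): the round trip recovers the input.
x = torch.randn(2, 8, 8, 32)
assert torch.equal(window_reverse(window_partition(x, 4), 4, 8, 8), x)

# Shifted partition and reverse (Fig 5): cyclically shift the map before
# partitioning so windows straddle the previous window boundaries, then
# undo the shift after the reverse step.
shifted = torch.roll(x, shifts=(-2, -2), dims=(1, 2))
windows = window_partition(shifted, 4)
restored = torch.roll(window_reverse(windows, 4, 8, 8), shifts=(2, 2), dims=(1, 2))
assert torch.equal(restored, x)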

Table 1. Performance comparison of our method with baselines on the Market1501, DukeMTMC-reID and MSMT17 datasets.

Fig 6. Loss and top-1 error curves on the Market1501 dataset.

Fig 7. Loss and top-1 error curves on the DukeMTMC-reID dataset.

Fig 8. Loss and top-1 error curves on the MSMT17 dataset.

Fig 9. ROC curve on the Market1501 dataset.

Fig 10. ROC curve on the DukeMTMC-reID dataset.

Fig 11. ROC curve on the MSMT17 dataset.

Fig 12. Example 1 of ranking results.

Fig 13. Example 2 of ranking results.

Fig 14. Example 3 of ranking results.

Fig 15. Examples of feature visualization for different methods.

Table 2. Ablation experiments of our method on the Market1501, DukeMTMC-reID and MSMT17 datasets.

Table 3. Comparison of computational efficiency among different methods.