
SensiMix: Sensitivity-Aware 8-bit index & 1-bit value mixed precision quantization for BERT compression

Table 5

Comparison of the sensitivity of the Self-Attention layer and the FFN in BERT.

The result indicates that the Self-Attention (SA) layer is more sensitive to quantization than the FFN.
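The sensitivity comparison motivating the mixed-precision scheme can be illustrated with a minimal sketch (this is not the SensiMix implementation; the quantizers, matrix size, and distribution below are illustrative assumptions): a weight matrix is quantized with uniform 8-bit quantization and with 1-bit sign binarization, and the reconstruction errors are compared. A layer whose weights (and hence outputs) degrade more under aggressive low-bit quantization is considered more sensitive and is a candidate for the higher-precision 8-bit format.

```python
import numpy as np

# Illustrative sketch, not the paper's method: compare the reconstruction
# error of 8-bit uniform quantization against 1-bit sign binarization on a
# toy weight matrix. The gap in error hints at why sensitive layers (such
# as Self-Attention, per Table 5) are kept at higher precision.

rng = np.random.default_rng(0)
W = rng.normal(0, 0.02, size=(64, 64))  # toy weight matrix (assumed scale)

def quantize_8bit(w):
    # Symmetric uniform quantization to the int8 range [-127, 127].
    scale = np.abs(w).max() / 127.0
    return np.round(w / scale) * scale

def binarize_1bit(w):
    # Sign binarization scaled by the mean absolute weight.
    alpha = np.abs(w).mean()
    return np.sign(w) * alpha

err8 = np.abs(W - quantize_8bit(W)).mean()
err1 = np.abs(W - binarize_1bit(W)).mean()
print(f"mean abs reconstruction error, 8-bit: {err8:.6f}")
print(f"mean abs reconstruction error, 1-bit: {err1:.6f}")
```

Running this shows a much smaller reconstruction error for the 8-bit quantizer, consistent with reserving 8-bit precision for the more sensitive layers.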
