
SensiMix: Sensitivity-Aware 8-bit index & 1-bit value mixed precision quantization for BERT compression

Table 4

Effectiveness of the three proposed 1-bit quantization-aware training methods.

ABWR, PT, and ILF improve the performance of SensiMix (average score on the GLUE benchmark) by 1.1%, 1.4%, and 1.4%, respectively.
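For background on what "1-bit quantization-aware training" operates on: the sketch below shows plain sign binarization with a mean-absolute-value scale and a straight-through estimator (STE) gradient, the standard mechanism that binary-weight QAT methods build on. This is a generic illustration in NumPy, not the paper's specific ABWR, PT, or ILF procedures, and all function names here are hypothetical.

```python
import numpy as np

def binarize_forward(w):
    # 1-bit quantization: each weight becomes +alpha or -alpha,
    # where alpha is the per-tensor mean absolute value (scale).
    alpha = np.abs(w).mean()
    return alpha * np.sign(w), alpha

def binarize_backward(grad_out, w):
    # Straight-through estimator: pass the upstream gradient through
    # unchanged for weights inside [-1, 1], zero it elsewhere.
    return grad_out * (np.abs(w) <= 1.0)

w = np.array([0.4, -0.2, 1.5, -0.7])
w_bin, alpha = binarize_forward(w)   # alpha = 0.7, w_bin = [0.7, -0.7, 0.7, -0.7]
grad = binarize_backward(np.ones_like(w), w)  # [1., 1., 0., 1.]
```

At training time the full-precision weights `w` are kept and updated with the STE gradient, while the binarized copy `w_bin` is what the forward pass (and the deployed 1-bit model) actually uses.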
