Counterexample Guided Neural Network Quantization Refinement

Joao Batista P. Matos Jr, Eddie B. de Lima Filho, Iury Bessa, Edoardo Manino, Xidan Song, Lucas C. Cordeiro

Research output: Contribution to journal › Article › peer-review



Deploying neural networks (NNs) in low-resource domains is challenging because of their high computing, memory, and power requirements. For this reason, NNs are often quantized before deployment, but quantization degrades their accuracy. We therefore propose the counterexample-guided neural network quantization refinement (CEG4N) framework, which combines search-based quantization with equivalence checking. The former minimizes computational requirements, while the latter guarantees that the behavior of an NN does not change after quantization. We evaluate CEG4N on a diverse set of benchmarks, including both large and small NNs. Our technique successfully quantizes every network in the chosen evaluation set, producing models with up to 163% better accuracy than state-of-the-art techniques.
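The abstract describes a counterexample-guided refinement loop: quantize aggressively, check equivalence against the original network, and refine the quantization whenever the checker returns a counterexample. The following is a minimal conceptual sketch of that loop, not the authors' CEG4N implementation: the toy single-neuron network, the sample-based equivalence check (a real system would use a formal verifier), and all function names and tolerances are illustrative assumptions.

```python
def quantize(weights, bits):
    """Uniform symmetric quantization of a weight vector to `bits` bits."""
    scale = max(abs(w) for w in weights) / (2 ** (bits - 1) - 1)
    return [round(w / scale) * scale for w in weights]

def forward(weights, x):
    """Toy single-neuron 'network': dot product followed by a ReLU."""
    return max(0.0, sum(w * xi for w, xi in zip(weights, x)))

def find_counterexample(weights, q_weights, inputs, eps=1e-2):
    """Stand-in for a formal equivalence check: search a finite input set
    for a point where original and quantized outputs diverge beyond eps."""
    for x in inputs:
        if abs(forward(weights, x) - forward(q_weights, x)) > eps:
            return x
    return None

def ceg_quantize(weights, inputs, start_bits=2, max_bits=16):
    """Counterexample-guided refinement: start with aggressive quantization
    and increase precision until no counterexample remains."""
    bits = start_bits
    while bits <= max_bits:
        q = quantize(weights, bits)
        if find_counterexample(weights, q, inputs) is None:
            return q, bits  # quantized model is (empirically) equivalent
        bits += 1  # a counterexample exists: refine with more precision
    raise RuntimeError("no equivalent quantization within the bit budget")
```

In the actual framework the refinement step is driven by the specific counterexample returned by the equivalence checker rather than a blanket bit-width increase; this sketch only illustrates the overall quantize/check/refine structure.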


  • cs.LG
  • cs.AI
  • cs.SE


