Activation Sensitivity as a Unifying Principle for Post-Training Quantization

This research establishes activation sensitivity as the rigorous theoretical foundation for post-training quantization in large language models, linking grad...
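To make the notion of activation sensitivity concrete, here is a minimal, illustrative sketch (not the paper's method): it scores per-channel sensitivity from a small calibration batch of activations and uses those scores to weight the round-to-nearest quantization error of a weight matrix. All names here (sensitivity_scores, quantize_rtn, weighted_quant_error, n_bits) are assumptions introduced only for this example.

```python
import numpy as np

def sensitivity_scores(acts: np.ndarray) -> np.ndarray:
    """Per-input-channel sensitivity proxy: mean squared activation magnitude.

    acts: calibration activations of shape (n_tokens, in_features).
    """
    return np.mean(acts ** 2, axis=0)  # shape: (in_features,)

def quantize_rtn(w: np.ndarray, n_bits: int = 4) -> np.ndarray:
    """Symmetric round-to-nearest quantization with a per-output-channel scale."""
    qmax = 2 ** (n_bits - 1) - 1
    scale = np.max(np.abs(w), axis=1, keepdims=True) / qmax
    scale = np.where(scale == 0, 1.0, scale)  # guard against all-zero rows
    return np.round(w / scale) * scale

def weighted_quant_error(w: np.ndarray, acts: np.ndarray, n_bits: int = 4) -> float:
    """Activation-sensitivity-weighted quantization error.

    Weighting the squared weight perturbation by the per-channel activation
    second moment approximates the induced error in the layer's output:
    ||(W - Wq) x||^2 is roughly sum_j s_j * ||W[:, j] - Wq[:, j]||^2 under a
    diagonal (per-channel) approximation.
    """
    s = sensitivity_scores(acts)                      # (in_features,)
    dw = w - quantize_rtn(w, n_bits)                  # (out_features, in_features)
    return float(np.sum(s * np.sum(dw ** 2, axis=0)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(128, 256))                                 # toy weight matrix
    X = rng.normal(size=(1024, 256)) * np.linspace(0.1, 3.0, 256)   # channels with skewed scales
    print("sensitivity-weighted 4-bit error:", weighted_quant_error(W, X, n_bits=4))
```

In this toy setup, channels with larger activation second moments contribute more to the weighted error, which is the basic intuition behind treating activation sensitivity as the quantity that quantization should preserve.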

Level: expert

By Bruce Changlong Xu

Category: research