PocketLLM: Ultimate Compression of Large Language Models via Meta Networks

Explore PocketLLM, a novel meta-network architecture achieving 10x compression of Llama 2-7B through latent-space projection, enabling efficient edge deployment.

Level: advanced

By Ye Tian, Chengcheng Wang, Jing Han, Yehui Tang, Kai Han

Category: research