arxiv:2308.07209

Unified Data-Free Compression: Pruning and Quantization without Fine-Tuning

Published on Aug 14, 2023
Abstract

Structured pruning and quantization are promising approaches for reducing the inference time and memory footprint of neural networks. However, most existing methods require the original training dataset to fine-tune the model, which not only incurs heavy resource consumption but is also infeasible for applications with sensitive or proprietary data due to privacy and security concerns. A few data-free methods have been proposed to address this problem, but they perform data-free pruning and quantization separately and therefore do not exploit the complementarity of the two. In this paper, we propose a novel framework named Unified Data-Free Compression (UDFC), which performs pruning and quantization simultaneously without any data or fine-tuning. Specifically, UDFC starts from the assumption that the partial information of a damaged (e.g., pruned or quantized) channel can be preserved by a linear combination of other channels, and derives a reconstruction form from this assumption to restore the information lost due to compression. Finally, we formulate the reconstruction error between the original network and its compressed counterpart and theoretically derive its closed-form solution. We evaluate UDFC on the large-scale image classification task and obtain significant improvements over various network architectures and compression methods. For example, with a 30% pruning ratio and 6-bit quantization on ResNet-34, we achieve a 20.54% accuracy improvement on the ImageNet dataset compared to the SOTA method.
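
To make the core assumption concrete, below is a minimal, illustrative NumPy sketch of the general idea: approximating a pruned channel as a linear combination of the surviving channels and folding that combination into the next layer's weights, with no data required. It is not the paper's exact derivation or closed-form solution; the function and variable names (e.g., compensate_pruned_channel) are hypothetical.

import numpy as np

def compensate_pruned_channel(w_layer, w_next, pruned_idx):
    """Data-free compensation for pruning one output channel of `w_layer`.

    w_layer: (C_out, D) filters of the current layer, flattened per channel.
    w_next:  (C_next, C_out) weights of the following layer (linear / 1x1-conv view).
    pruned_idx: index of the channel removed from w_layer.
    """
    keep = [i for i in range(w_layer.shape[0]) if i != pruned_idx]

    # Assumption (illustrative): w_layer[pruned_idx] ~= sum_j alpha_j * w_layer[j]
    # over the kept channels. Solve the least-squares problem on the weights
    # themselves, so no training data is needed.
    A = w_layer[keep].T                      # (D, C_out - 1)
    b = w_layer[pruned_idx]                  # (D,)
    alpha, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Fold the pruned channel's contribution into the next layer: whatever
    # consumed the pruned channel now consumes the kept channels, scaled by
    # the reconstruction coefficients alpha.
    w_next_new = w_next[:, keep] + np.outer(w_next[:, pruned_idx], alpha)
    return w_layer[keep], w_next_new

# Usage: prune channel 3 of a toy layer and compensate the following layer.
rng = np.random.default_rng(0)
w1 = rng.standard_normal((8, 27))   # e.g. 8 filters of a 3x3x3 conv, flattened
w2 = rng.standard_normal((4, 8))    # next layer consuming those 8 channels
w1_pruned, w2_comp = compensate_pruned_channel(w1, w2, pruned_idx=3)

The same reconstruction-by-linear-combination view is what lets UDFC treat pruning and quantization uniformly: in both cases a channel's information is "damaged," and the paper derives a closed-form compensation for the resulting reconstruction error rather than solving it numerically per layer as in this sketch.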
