• Jan 04, 2024 News! IJKE will adopt an article-by-article workflow. As a semiannual journal, each issue will be released at the end of the issue month.
  • Nov 28, 2023 News! Vol. 9, No. 2 has been published online.
  • Jun 06, 2023 News! Vol. 9, No. 1 has been published online.
General Information
    • ISSN: 2382-6185
    • Abbreviated Title: Int. J. Knowl. Eng.
    • Frequency: Semiannual
    • DOI: 10.18178/IJKE
    • Editor-in-Chief: Prof. Chen-Huei Chou
    • Executive Editor: Ms. Shira W. Lu
    • Indexed by: Google Scholar, Crossref, ProQuest
    • E-mail: ijke@ejournal.net
Editor-in-chief
Prof. Chen-Huei Chou
College of Charleston, SC, USA
It is my honor to serve as the editor-in-chief of IJKE. I will do my best to help this journal develop further.
IJKE 2022 Vol.8(1): 1-6
doi: 10.18178/ijke.2022.8.1.136

An Evolution Approach for Pre-trained Neural Network Pruning without Original Training Dataset

Abstract—Model pruning is an important technique in real-world machine learning problems, especially in deep learning. It compresses a large model into a smaller one while retaining most of the accuracy. However, a majority of existing approaches require the full original training set, which may not be available in practice if the model was trained on a large-scale dataset or on a dataset whose release poses privacy concerns. Although the original training set is sometimes inaccessible, pre-trained models are more often available. This paper aims to solve the model pruning problem without the original training set by finding sub-networks within the pre-trained model. We propose an approach that uses genetic algorithms (GA) to find these sub-networks systematically and automatically. Experimental results show that our algorithm can find good sub-networks efficiently. In principle, given unlimited time and computing power, it could find the optimal sub-network of any pre-trained model. Our code and pre-trained models are available at: https://github.com/sun-asterisk-research/ga_pruning_research.
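To make the approach concrete, below is a minimal Python sketch of a GA searching over binary pruning masks. This is not the authors' implementation (see the linked repository for that): the toy model, the random-probe fitness proxy (agreement with the unpruned model's outputs on random inputs), and all hyperparameters here are illustrative assumptions.

import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pre-trained network: a small MLP.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
model.eval()

# Data-free proxy: random probe inputs instead of the original training set.
probe = torch.randn(256, 32)
with torch.no_grad():
    reference = model(probe)  # outputs of the unpruned model

def apply_mask(mask):
    """Return a copy of the model with hidden units zeroed where mask == 0."""
    pruned = copy.deepcopy(model)
    with torch.no_grad():
        pruned[0].weight *= mask.unsqueeze(1)  # zero rows of the first layer
        pruned[0].bias *= mask
    return pruned

def fitness(mask, sparsity_weight=0.1):
    """Agreement with the reference outputs, plus a reward for sparsity."""
    with torch.no_grad():
        out = apply_mask(mask)(probe)
    agreement = -torch.nn.functional.mse_loss(out, reference).item()
    sparsity = 1.0 - mask.mean().item()
    return agreement + sparsity_weight * sparsity

def mutate(mask, rate=0.05):
    # Flip each bit with a small probability.
    flip = (torch.rand_like(mask) < rate).float()
    return (mask + flip) % 2

def crossover(a, b):
    # Single-point crossover of two parent masks.
    point = torch.randint(1, a.numel(), (1,)).item()
    return torch.cat([a[:point], b[point:]])

# Standard GA loop: selection, crossover, mutation over pruning masks.
population = [(torch.rand(64) < 0.8).float() for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(parents[i], parents[(i + 1) % 10]))
                for i in range(10)]
    population = parents + children

best = max(population, key=fitness)
print(f"kept {int(best.sum())}/64 hidden units, fitness={fitness(best):.4f}")

In this sketch the fitness trades off output agreement against sparsity via the assumed sparsity_weight term; the actual fitness criterion, mask encoding, and genetic operators used by the authors are described in the paper.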

Index Terms—Genetic algorithm (GA), model compression, data-free learning.

Toan Pham Van, Thanh Nguyen Tung, Linh Bao Doan are with Sun-Asterisk Inc., Vietnam (e-mail: pham.van.toan@sun-asterisk.com, nguyen.tung.thanh@sun-asterisk.com, doan.bao.linh@sun-asterisk.com).
Ta Minh Thanh is with the Faculty of Information Technology, Le Quy Don Technical University, Ha Noi, Vietnam (e-mail: thanhtm@lqdtu.edu.vn).

[PDF]

Cite: Toan Pham Van, Thanh Nguyen Tung, Linh Bao Doan, and Ta Minh Thanh, "An Evolution Approach for Pre-trained Neural Network Pruning without Original Training Dataset," International Journal of Knowledge Engineering, vol. 8, no. 1, pp. 1-6, 2022.

Copyright © 2022 by the authors. This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).

Copyright © 2008-2024   International Journal of Knowledge Engineering. All rights reserved.
E-mail: ijke@ejournal.net