Xin LONG, Xiangrong ZENG, Chen CHEN, Huaxin XIAO, Maojun ZHANG
National University of Defense Technology
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
How to cite
Xin LONG, Xiangrong ZENG, Chen CHEN, Huaxin XIAO, Maojun ZHANG, "Loss-Driven Channel Pruning of Convolutional Neural Networks," IEICE Transactions on Information and Systems, vol. E103-D, no. 5, pp. 1190-1194, May 2020, doi: 10.1587/transinf.2019EDL8200.
Abstract: The increase in computation cost and storage of convolutional neural networks (CNNs) has severely hindered their application on resource-limited devices in recent years. As a result, there is a pressing need to accelerate such networks. In this paper, we propose a loss-driven method to prune redundant channels of CNNs. It identifies unimportant channels by applying the Taylor expansion technique to the scaling and shifting factors, and prunes those channels with a fixed percentile threshold. By doing so, we obtain a compact network with fewer parameters and lower FLOPs consumption. In the experimental section, we evaluate the proposed method on the CIFAR datasets with several popular networks, including VGG-19, DenseNet-40 and ResNet-164; the results demonstrate that the proposed method is able to prune over 70% of channels and parameters with no performance loss. Moreover, iterative pruning can be used to obtain an even more compact network.
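To make the pruning criterion concrete, below is a minimal PyTorch sketch of the idea the abstract describes: a first-order Taylor expansion of the loss with respect to a batch-normalization layer's scaling factor (gamma, the BN weight) and shifting factor (beta, the BN bias) yields a per-channel importance score of roughly |gamma * dL/dgamma| + |beta * dL/dbeta|, and channels scoring below a fixed percentile threshold are pruned. This is an illustration under assumptions, not the authors' implementation: the function names (channel_importance, prune_by_percentile) are hypothetical, and masking the factors to zero stands in for physically removing the channels, which is what would actually reduce parameters and FLOPs.

import numpy as np
import torch
import torch.nn as nn

def channel_importance(model):
    """Score each BatchNorm2d channel by |gamma * dL/dgamma| + |beta * dL/dbeta|.
    Call after loss.backward() so the gradients are populated."""
    scores = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d) and m.weight.grad is not None:
            s = (m.weight * m.weight.grad).abs() + (m.bias * m.bias.grad).abs()
            scores[name] = s.detach()
    return scores

def prune_by_percentile(model, scores, percentile=70.0):
    """Zero out the scaling/shifting factors of channels whose score falls
    below a global percentile threshold (masking as a proxy for removal)."""
    flat = torch.cat([s.flatten() for s in scores.values()])
    threshold = np.percentile(flat.cpu().numpy(), percentile)
    modules = dict(model.named_modules())
    with torch.no_grad():
        for name, s in scores.items():
            m = modules[name]
            mask = (s > threshold).to(s.dtype)
            m.weight.mul_(mask)
            m.bias.mul_(mask)

# Usage: compute the loss on a batch, backpropagate, then score and prune.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
)
x = torch.randn(8, 3, 32, 32)          # CIFAR-sized dummy batch
y = torch.randint(0, 10, (8,))
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()                         # populates .grad on gamma and beta

scores = channel_importance(model)
prune_by_percentile(model, scores, percentile=70.0)

In the iterative variant the abstract mentions, this score-and-prune step would alternate with fine-tuning until the target compression is reached.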
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2019EDL8200/_p
BibTeX
@ARTICLE{e103-d_5_1190,
author={Xin LONG and Xiangrong ZENG and Chen CHEN and Huaxin XIAO and Maojun ZHANG},
journal={IEICE Transactions on Information and Systems},
title={Loss-Driven Channel Pruning of Convolutional Neural Networks},
year={2020},
volume={E103-D},
number={5},
pages={1190-1194},
abstract={The increase in computation cost and storage of convolutional neural networks (CNNs) has severely hindered their application on resource-limited devices in recent years. As a result, there is a pressing need to accelerate such networks. In this paper, we propose a loss-driven method to prune redundant channels of CNNs. It identifies unimportant channels by applying the Taylor expansion technique to the scaling and shifting factors, and prunes those channels with a fixed percentile threshold. By doing so, we obtain a compact network with fewer parameters and lower FLOPs consumption. In the experimental section, we evaluate the proposed method on the CIFAR datasets with several popular networks, including VGG-19, DenseNet-40 and ResNet-164; the results demonstrate that the proposed method is able to prune over 70% of channels and parameters with no performance loss. Moreover, iterative pruning can be used to obtain an even more compact network.},
keywords={},
doi={10.1587/transinf.2019EDL8200},
ISSN={1745-1361},
month={May},}
RIS
TY - JOUR
TI - Loss-Driven Channel Pruning of Convolutional Neural Networks
T2 - IEICE Transactions on Information and Systems
SP - 1190
EP - 1194
AU - Xin LONG
AU - Xiangrong ZENG
AU - Chen CHEN
AU - Huaxin XIAO
AU - Maojun ZHANG
PY - 2020
DO - 10.1587/transinf.2019EDL8200
JO - IEICE Transactions on Information and Systems
SN - 1745-1361
VL - E103-D
IS - 5
JA - IEICE Trans. Inf. & Syst.
Y1 - May 2020
AB - The increase in computation cost and storage of convolutional neural networks (CNNs) has severely hindered their application on resource-limited devices in recent years. As a result, there is a pressing need to accelerate such networks. In this paper, we propose a loss-driven method to prune redundant channels of CNNs. It identifies unimportant channels by applying the Taylor expansion technique to the scaling and shifting factors, and prunes those channels with a fixed percentile threshold. By doing so, we obtain a compact network with fewer parameters and lower FLOPs consumption. In the experimental section, we evaluate the proposed method on the CIFAR datasets with several popular networks, including VGG-19, DenseNet-40 and ResNet-164; the results demonstrate that the proposed method is able to prune over 70% of channels and parameters with no performance loss. Moreover, iterative pruning can be used to obtain an even more compact network.
ER -