TY - JOUR
T1 - DivNet
T2 - Efficient Convolutional Neural Network via Multilevel Hierarchical Architecture Design
AU - Kaddar, Bachir
AU - Fizazi, Hadria
AU - Hernandez-Cabronero, Miguel
AU - Sanchez, Victor
AU - Serra-Sagrista, Joan
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2021
Y1 - 2021
N2 - Designing small and efficient mobile neural networks is difficult because the architecture must achieve the best possible performance under a limited computational budget. Previous lightweight neural networks rely on a cell module that is repeated in all stacked layers across the network. These approaches do not permit layer diversity, which is critical for achieving strong performance. This paper presents an experimental study to develop an efficient mobile network using a hierarchical architecture. Our proposed mobile network, called Diversity Network (DivNet), outperforms the basic architecture of simply stacked layers generally employed by the best high-efficiency models, in terms of both complexity cost and performance. A set of architectural design decisions is described that reduces the proposed model size while yielding a significant performance improvement. Our image classification experiments show that, compared to MobileNetV2, SqueezeNet, and ShuffleNetV2, respectively, DivNet improves accuracy by 2.09%, 0.76%, and 0.66% on the CIFAR100 dataset, and by 0.05%, 4.96%, and 1.13% on the CIFAR10 dataset. On more complex datasets such as ImageNet, DivNet achieves 70.65% Top-1 accuracy and 90.23% Top-5 accuracy, again better than other small models such as MobileNet, SqueezeNet, and ShuffleNet.
AB - Designing small and efficient mobile neural networks is difficult because the architecture must achieve the best possible performance under a limited computational budget. Previous lightweight neural networks rely on a cell module that is repeated in all stacked layers across the network. These approaches do not permit layer diversity, which is critical for achieving strong performance. This paper presents an experimental study to develop an efficient mobile network using a hierarchical architecture. Our proposed mobile network, called Diversity Network (DivNet), outperforms the basic architecture of simply stacked layers generally employed by the best high-efficiency models, in terms of both complexity cost and performance. A set of architectural design decisions is described that reduces the proposed model size while yielding a significant performance improvement. Our image classification experiments show that, compared to MobileNetV2, SqueezeNet, and ShuffleNetV2, respectively, DivNet improves accuracy by 2.09%, 0.76%, and 0.66% on the CIFAR100 dataset, and by 0.05%, 4.96%, and 1.13% on the CIFAR10 dataset. On more complex datasets such as ImageNet, DivNet achieves 70.65% Top-1 accuracy and 90.23% Top-5 accuracy, again better than other small models such as MobileNet, SqueezeNet, and ShuffleNet.
KW - Deep neural network
KW - mobile network
KW - network compression
UR - http://www.scopus.com/inward/record.url?scp=85111604609&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2021.3099952
DO - 10.1109/ACCESS.2021.3099952
M3 - Article
AN - SCOPUS:85111604609
SN - 2169-3536
VL - 9
SP - 105892
EP - 105901
JO - IEEE Access
JF - IEEE Access
M1 - 9495788
ER -