Significant progress has been made in automated medical diagnosis with the advent of deep learning methods in recent years. However, deploying deep learning models on mobile and small-scale, low-cost devices remains a major bottleneck. Furthermore, breast cancer is increasingly prevalent, with ductal carcinoma being its most common type. Although many machine and deep learning methods have already been investigated, there is still a need for further improvement.
This paper proposes a novel deep convolutional neural network (CNN) based transfer learning approach complemented with structured filter pruning for histopathological image classification, which also reduces the run-time resource requirements of the trained deep learning models. In the proposed method, the less important filters are first pruned from the convolutional layers, and the pruned models are then trained on the histopathological image dataset.
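To make the pruning step concrete, the following is a minimal sketch (not the authors' implementation) of L1-norm structured filter pruning applied to a pre-trained VGG19 in PyTorch, followed by fine-tuning; the 30% pruning ratio, the binary benign/malignant label setup, and the `train_loader` are illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

# Pre-trained backbone (ImageNet weights) used for transfer learning.
model = models.vgg19(weights="IMAGENET1K_V1")

# Zero out the least-important filters (smallest L1 norm along dim=0)
# in every convolutional layer. Note that the masked filters still occupy
# the weight tensor; an actual FLOP reduction requires physically removing
# them and rebuilding the layers afterwards.
for module in model.features:
    if isinstance(module, nn.Conv2d):
        prune.ln_structured(module, name="weight", amount=0.3, n=1, dim=0)
        prune.remove(module, "weight")  # make the pruning mask permanent

# Replace the classifier head for a two-class (e.g., benign vs. malignant)
# task and fine-tune the pruned model on the histopathological images.
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# train_loader is assumed to yield (image, label) batches from the
# histopathological dataset.
# for images, labels in train_loader:
#     optimizer.zero_grad()
#     loss = criterion(model(images), labels)
#     loss.backward()
#     optimizer.step()
```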
We performed extensive experiments using three popular pre-trained CNNs: VGG19, ResNet34, and ResNet50. With the pruned VGG19 model, we achieved an accuracy of 91.25%, outperforming earlier methods on the same dataset and architecture while reducing FLOPs by 63.46%. With the pruned ResNet34 model, the accuracy increases to 91.80% with 40.63% fewer FLOPs. Moreover, with the pruned ResNet50 model, we achieved an accuracy of 92.07% with 30.97% fewer FLOPs.
The experimental results reveal that the performance of the pre-trained models complemented with filter pruning exceeds that of the original pre-trained models. Another important outcome of this research is that the pruned models, with their reduced resource requirements, can easily be deployed in point-of-care devices for automated diagnosis applications.
