Adapting a Trained CNN Model to a Mobile Application for Breast Thermal Image Classification
Abstract
Models for breast thermal image classification can be developed using deep learning methods, in particular convolutional neural network (CNN) architectures. This article focuses on adapting a trained CNN model to a mobile application for binary classification of breast thermal images into normal and abnormal classes. The CNN model used in this study, called BreaCNet, is based on ShuffleNet; it carries learned weights for 1028 filters obtained by training on images downloaded from the Database for Mastology Research (DMR), and its size is 22 MB. To run on a mobile platform, the trained model must first be converted into a format that a mobile application can load. Because BreaCNet was built in MATLAB, the adaptation process consists of exporting the model to ONNX file format, converting the ONNX file to a TensorFlow model, and converting the TensorFlow model to TensorFlow Lite format. However, not all nodes are fully supported by MATLAB: the shuffle node in ShuffleNet cannot be fully exported using ExportToOnnx, so it must be re-defined to replace a placeholder node named "MATLAB PLACEHOLDER". In addition to the model conversion process, this article describes user interaction with the application using UML diagrams and the design of the application's feature menus. The application was also tested on 20 breast thermal images. The testing results show that the application classifies an image on a mobile device in less than one second with an accuracy of 85%. Finally, the breast thermal image screening application was successfully built; it interprets breast thermal images directly on the mobile device, keeping user data private.
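The conversion pipeline described above (MATLAB model → ONNX → TensorFlow → TensorFlow Lite) can be sketched in Python. This is a minimal illustration under stated assumptions, not the authors' actual script: the file names are hypothetical, the `onnx_to_saved_model` helper assumes the third-party `onnx` and `onnx-tf` packages, and `channel_shuffle` shows one common way to re-implement ShuffleNet's shuffle operation (reshape, transpose, reshape) when replacing the placeholder node left by the MATLAB export.

```python
import tensorflow as tf


def onnx_to_saved_model(onnx_path: str, saved_model_dir: str) -> None:
    """Convert an ONNX file (e.g. exported from MATLAB) to a TF SavedModel."""
    import onnx                           # pip install onnx
    from onnx_tf.backend import prepare   # pip install onnx-tf

    model = onnx.load(onnx_path)
    onnx.checker.check_model(model)       # fail early on malformed/unsupported nodes
    prepare(model).export_graph(saved_model_dir)


def saved_model_to_tflite(saved_model_dir: str, tflite_path: str) -> bytes:
    """Convert a TF SavedModel to a .tflite flatbuffer for mobile deployment."""
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    # converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional size reduction
    tflite_model = converter.convert()
    with open(tflite_path, "wb") as f:
        f.write(tflite_model)
    return tflite_model


def channel_shuffle(x: tf.Tensor, groups: int) -> tf.Tensor:
    """Reference implementation of ShuffleNet's channel shuffle (NHWC layout),
    usable when re-defining the shuffle node that the ONNX export leaves as a
    placeholder: split channels into groups, transpose, and flatten back."""
    _, h, w, c = x.shape
    x = tf.reshape(x, [-1, h, w, groups, c // groups])
    x = tf.transpose(x, [0, 1, 2, 4, 3])
    return tf.reshape(x, [-1, h, w, c])
```

Usage would be, for example, `onnx_to_saved_model("breacnet.onnx", "breacnet_tf")` followed by `saved_model_to_tflite("breacnet_tf", "breacnet.tflite")` (file names hypothetical); the resulting `.tflite` file is what the mobile application bundles and runs on-device.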
DOI: https://doi.org/10.17529/jre.v18i3.8754
Jurnal Rekayasa Elektrika (JRE) is published under the Creative Commons Attribution-ShareAlike 4.0 International License.