Burning Skin Detection System in Human Body
Abstract
Early and accurate burn depth diagnosis is crucial for selecting appropriate clinical intervention strategies and assessing the prognosis of burn patients. However, current burn depth diagnosis still relies primarily on clinicians' subjective, experience-based assessment, which limits diagnostic accuracy. With the rapid development of artificial intelligence, integrating deep learning algorithms with image analysis techniques enables more accurate identification and evaluation of the information contained in medical images. The objective of this work is to detect and classify burn areas in medical images using an unsupervised deep learning algorithm. The main contribution is the development of a computational framework built around such an algorithm. To demonstrate the effectiveness of the proposed framework, experiments are performed on a benchmark dataset to evaluate system stability. The results indicate that the proposed system is simple and suitable for real-life applications, achieving an accuracy of 75% in comparison with several state-of-the-art techniques.
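The abstract does not specify which unsupervised algorithm the framework uses. As a minimal illustrative sketch only, not the authors' implementation, classical fuzzy C-means clustering, an unsupervised method often applied to burn and wound segmentation, can group pixel intensities into "burn" and "background" clusters. The function name and toy data below are hypothetical:

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy C-means for an (n_samples, n_features) array X."""
    rng = np.random.default_rng(seed)
    # Random initial membership matrix; each row sums to 1.
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # Cluster centres are membership-weighted means of the samples.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Squared distance of every sample to every centre.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        d2 = np.maximum(d2, 1e-12)  # guard against division by zero
        # Membership update: u_ik proportional to d_ik^(-1/(m-1)).
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Toy 1-D "pixel intensities": a dark background and a bright burn-like region.
pixels = np.concatenate([np.full(50, 0.1), np.full(50, 0.9)])[:, None]
centers, U = fuzzy_c_means(pixels, n_clusters=2)
mask = U.argmax(axis=1)  # hard segmentation labels from fuzzy memberships
```

Thresholding the fuzzy memberships into a hard label mask, as in the last line, is one common way to turn such a clustering into a binary burn-area segmentation.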
Copyright (c) 2022 Noor M. Abdulhadi, Noor A. Ibraheem, Mokhtar M. Hasan
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Authors who choose to publish their work with Aro agree to the following terms:
- Authors retain the copyright to their work and grant the journal the right of first publication. The work is simultaneously licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0). This license allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors have the freedom to enter into separate agreements for the non-exclusive distribution of the journal's published version of the work. This includes options such as posting it to an institutional repository or publishing it in a book, as long as proper acknowledgement is given to its initial publication in this journal.
- Authors are encouraged to share and post their work online, including in institutional repositories or on their personal websites, both prior to and during the submission process. This practice can lead to productive exchanges and increase the visibility and citation of the published work.
By agreeing to these terms, authors acknowledge the importance of open access and the benefits it brings to the scholarly community.