Investigating the Impact of Min-Max Data Normalization on the Regression Performance of K-Nearest Neighbor with Different Similarity Measurements
Abstract
K-nearest neighbor (KNN) is a lazy supervised learning algorithm that predicts a target value from the similarity between a query point and its closest neighbor(s). Min-max normalization has been reported as a useful method for eliminating the impact of inconsistent attribute ranges on the efficiency of some machine learning models, but its effect on the performance of KNN models remains unclear and requires further investigation. Therefore, this research examines the impact of min-max normalization on the regression performance of KNN models built with eight different similarity measures: City block, Euclidean, Chebyshev, Cosine, Correlation, Hamming, Jaccard, and Mahalanobis. Five benchmark datasets were used to test the accuracy of the KNN models on both the original and the normalized data, with mean squared error (MSE) as the performance indicator. The results show that the effect of min-max normalization on KNN models using City block, Euclidean, Chebyshev, Cosine, and Correlation depends on the nature of the dataset itself; testing models on both the original and normalized datasets is therefore recommended. The performance of KNN models using Hamming, Jaccard, and Mahalanobis is unaffected by min-max normalization, because the first two are ratio-based measures and the third incorporates the dataset covariance in its distance calculation. Mahalanobis outperformed the other seven similarity measures. This research offers greater reliability and quality than comparable studies because it tests datasets drawn from different application fields.
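The comparison described in the abstract can be reproduced in outline with scikit-learn. The following is a minimal sketch, not the authors' code: it uses a synthetic regression dataset and illustrative parameter choices (k = 5, a single train/test split) to contrast the test MSE of KNN regressors on original versus min-max-normalized features, and to show that the Mahalanobis metric, which uses the inverse covariance of the training data, gives the same result in both cases.

```python
# Minimal sketch (assumed setup, not the authors' original experiment):
# compare KNN regression MSE on raw vs. min-max-normalized features.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the benchmark datasets used in the paper.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def knn_mse(X_tr, X_te, metric, **metric_kwargs):
    """Fit a k=5 KNN regressor with the given distance metric and return the test MSE."""
    model = KNeighborsRegressor(n_neighbors=5, metric=metric,
                                metric_params=metric_kwargs or None)
    model.fit(X_tr, y_train)
    return mean_squared_error(y_test, model.predict(X_te))

# Min-max normalization: x' = (x - min) / (max - min), fitted on the training split only.
scaler = MinMaxScaler().fit(X_train)
X_train_n, X_test_n = scaler.transform(X_train), scaler.transform(X_test)

for metric in ["cityblock", "euclidean", "chebyshev"]:
    print(metric,
          "original MSE:", round(knn_mse(X_train, X_test, metric), 2),
          "normalized MSE:", round(knn_mse(X_train_n, X_test_n, metric), 2))

# Mahalanobis distance uses the inverse covariance of the training data; because
# min-max scaling is an invertible per-feature affine map, the distances (and hence
# the MSE) should match on both versions, up to floating-point rounding.
vi = np.linalg.inv(np.cov(X_train, rowvar=False))
vi_n = np.linalg.inv(np.cov(X_train_n, rowvar=False))
print("mahalanobis original MSE:",
      round(knn_mse(X_train, X_test, "mahalanobis", VI=vi), 2))
print("mahalanobis normalized MSE:",
      round(knn_mse(X_train_n, X_test_n, "mahalanobis", VI=vi_n), 2))
```

For the range-based metrics (City block, Euclidean, Chebyshev), whether normalization helps or hurts depends on the dataset, which is why the abstract recommends evaluating both versions rather than normalizing by default.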
Copyright (c) 2022 Peshawa J. Muhammad Ali
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.