Application of the Exponential Function to Weight the Euclidean Distance in the k-Nearest Neighbor Algorithm
DOI: https://doi.org/10.29407/gj.v6i2.18070

Keywords: Classification, k-NN, Attribute Weighting, Exponential Function, Euclidean Distance

Abstract
k-Nearest Neighbor (k-NN) is a popular classification algorithm, widely used because it is simple, easy to explain, and easy to implement. However, k-NN has a weakness: its classification results are strongly influenced by the scale of the input data, and the Euclidean distance treats all attributes equally rather than according to each attribute's relevance, which degrades classification accuracy. One way to improve the classification accuracy of k-NN is to weight its features when measuring the Euclidean distance. In this work, an exponential function is applied to weight the Euclidean distance measurement used by the k-NN algorithm. The performance of the proposed method is evaluated experimentally using the Data Mining methodology and compared with the original k-NN and with k-NN weighting methods from previous research. The nearest-neighbor decision is made with k = 5. In the experiments, the proposed algorithm was compared with the k-NN, Wk-NN, and DWk-NN algorithms. Overall, the average accuracies obtained were 85.87% for k-NN, 86.98% for Wk-NN, and 88.19% for DWk-NN, while the k-NN algorithm weighted with the exponential function achieved 90.17%.
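The abstract describes weighting each attribute inside the Euclidean distance before the k-NN majority vote. The following is a minimal sketch in Python of that idea, assuming a hypothetical weighting scheme in which per-attribute relevance scores are exponentiated and normalized; the paper's exact relevance formula is not reproduced here, and the function names (`exponential_weights`, `knn_predict`) are illustrative only:

```python
import numpy as np
from collections import Counter

def exponential_weights(relevance):
    # Map per-attribute relevance scores to positive weights via exp,
    # then normalize so the weights sum to 1. (Hypothetical scheme;
    # the paper's exact weighting formula may differ.)
    w = np.exp(np.asarray(relevance, dtype=float))
    return w / w.sum()

def weighted_euclidean(x, y, w):
    # Euclidean distance with a per-attribute weight on each squared term.
    return np.sqrt(np.sum(w * (x - y) ** 2))

def knn_predict(X_train, y_train, x, w, k=5):
    # Classify x by majority vote among its k nearest training points
    # under the weighted Euclidean distance (k=5 as in the abstract).
    dists = [weighted_euclidean(x, xi, w) for xi in X_train]
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]
```

With equal relevance scores the weights are uniform and the method reduces to ordinary k-NN; unequal relevance scores let more informative attributes dominate the distance.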
License
Authors who publish with this journal agree to the following terms:
1. Copyright on any article is retained by the author(s).
2. The author grants the journal the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work’s authorship and initial publication in this journal.
3. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal’s published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
4. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.
5. The article and any associated published material is distributed under the Creative Commons Attribution-ShareAlike 4.0 International License