Development of a CNN-Based Knowledge System for Rupiah Currency Authenticity Detection and Nominal Classification
DOI: https://doi.org/10.29407/gj.v10i1.27464
Keywords: Rupiah Currency, CNN, Authenticity Detection, Denomination Classification
Abstract
The circulation of counterfeit money in Indonesia inflicts substantial losses on the public and on financial institutions. Manual verification is inefficient and error-prone, especially at high transaction volumes, because counterfeit bills can be physically nearly identical to genuine currency; under an ultraviolet lamp, however, the invisible-ink security features of genuine notes become visible. This research employs a Convolutional Neural Network (CNN) to detect the authenticity of Indonesian Rupiah banknotes and classify their denominations. The CNN is trained on images of authentic banknotes of various denominations captured with a camera under ultraviolet light; the system stores these images and trains the model to recognise both authenticity and denomination features. Experimental results show high classification accuracy in distinguishing genuine from counterfeit Rupiah banknotes and in recognising their denominations. In the testing phase, real notes are exposed to ultraviolet light, producing images that reveal the invisible-ink patterns. Authenticity detection achieved a 100% success rate, while denomination recognition reached 70% for Rp5,000 notes, 80% for Rp10,000 and Rp20,000 notes, and 90% for Rp50,000 and Rp100,000 notes, giving an overall system success rate of 82%.
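As a rough illustration of the pipeline the abstract describes (convolution over a UV-lit banknote image, activation, pooling, and a classifier head over denomination and counterfeit classes), the following NumPy forward pass is a minimal sketch only. The single 3×3 kernel, layer sizes, random weights, and class labels are illustrative assumptions, not the authors' actual architecture or trained model.

```python
import numpy as np

# Illustrative class set: five denominations plus a counterfeit class.
CLASSES = ["Rp5000", "Rp10000", "Rp20000", "Rp50000", "Rp100000", "counterfeit"]

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a grayscale image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def maxpool(x, size=2):
    """Non-overlapping max pooling, truncating edges that do not divide evenly."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def forward(img, kernel, weights, bias):
    feat = np.maximum(conv2d(img, kernel), 0.0)   # ReLU activation
    pooled = maxpool(feat).ravel()                # downsample and flatten
    logits = pooled @ weights + bias              # dense classifier head
    exp = np.exp(logits - logits.max())           # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
img = rng.random((32, 32))                        # stand-in for a UV banknote image
kernel = rng.standard_normal((3, 3))
pooled_len = ((32 - 3 + 1) // 2) ** 2             # 15 * 15 = 225 pooled features
weights = rng.standard_normal((pooled_len, len(CLASSES))) * 0.01
bias = np.zeros(len(CLASSES))

probs = forward(img, kernel, weights, bias)
print(CLASSES[int(np.argmax(probs))])
```

In a real system the kernel and dense weights would be learned from the labelled UV images via backpropagation, with several convolutional layers rather than one; this sketch only shows the shape of the inference path.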
License
Copyright (c) 2026 Ahmad Sahru Romadhon, Prof. Dr.Ir. Syaad Patmanthara , M.Pd, Anik Nur Handayani

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.