A Comparison of Bidirectional Encoder Representations from Transformers (BERT) Language Models for Emotion Detection

Authors

  • Dwi Hosanna Bangkalang, Department of Information Systems, Universitas Kristen Satya Wacana, Indonesia
  • Nina Setiyawati, Department of Informatics Engineering, Universitas Kristen Satya Wacana, Indonesia

DOI:

https://doi.org/10.52436/1.jpti.988

Keywords:

BERT, Deep Learning, Emotion Detection, Emotion Classification, Text

Abstract

Textual information is one avenue for detecting emotion. However, extracting emotion from text is challenging because of the implicit and explicit meanings the text carries. Text-based emotion extraction has been widely addressed with deep learning models; even so, computational performance and model accuracy often trade off against each other because the models are complex. This study therefore conducts experiments with deep learning models based on the BERT language model for emotion detection. The goal is to produce an optimal, accurate emotion detection model with low computational cost. The method consists of data collection, data pre-processing, the BERT architecture, BERT model comparison, and model evaluation. The best emotion detection model was DistilBERT, with an accuracy of 0.9425 and an F1 score of 0.942. Evaluation of the DistilBERT training process shows a decreasing loss trend, indicating that the model becomes increasingly able to make good emotion predictions on unseen data. The proposed emotion detection model outperforms the deep-learning-based emotion detection models of previous studies: it reduces model complexity and computation while still delivering optimal performance.
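The reported evaluation metrics (accuracy 0.9425, F1 0.942) follow the standard classification-metric definitions. As a minimal sketch of the evaluation step, assuming a hypothetical four-class emotion label set (the paper's actual label set is not listed on this page), accuracy and macro F1 can be computed as:

```python
# Hypothetical emotion labels for illustration only.
LABELS = ["joy", "sadness", "anger", "fear"]

def accuracy(y_true, y_pred):
    # Fraction of predictions that match the gold label.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def macro_f1(y_true, y_pred, labels):
    # Per-class F1 averaged uniformly over all classes (macro F1).
    f1s = []
    for label in labels:
        tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
        fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
        fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        f1s.append(f1)
    return sum(f1s) / len(f1s)

# Toy gold labels and model predictions.
y_true = ["joy", "sadness", "anger", "joy", "fear", "anger"]
y_pred = ["joy", "sadness", "joy",   "joy", "fear", "anger"]
print(accuracy(y_true, y_pred), macro_f1(y_true, y_pred, LABELS))
```

In practice a library implementation (e.g. `sklearn.metrics.f1_score` with `average="macro"`) would be used on the model's predictions over the test split; the sketch above only makes the metric definitions concrete.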




Published

2025-10-04

How to Cite

Bangkalang, D. H., & Setiyawati, N. (2025). Perbandingan Bidirectional Encoder Representations from Transformers (BERT) Language Model pada Deteksi Emosi. Jurnal Pendidikan Dan Teknologi Indonesia, 5(9), 2952-2960. https://doi.org/10.52436/1.jpti.988