Implementation of a Convolutional Neural Network for Emotion Detection from Facial Expressions

Abstract
A person's emotional state can be reflected in speech, gestures, and especially facial expressions. A common problem is that humans tend to be subjective when assessing other people's emotions. Humans can easily guess someone's emotions from the expressions shown, and computers can do the same if they are given an algorithm that mimics human reasoning, i.e., artificial intelligence. This research examines human-computer interaction in analyzing human facial expressions, and was conducted to test whether a CNN (Convolutional Neural Network) can be used to detect human emotions. The material needed for this facial recognition research is an image dataset containing various human expressions. The collected images are divided into two parts, training data and test data, each of which contains seven emotion subfolders. The image categories together comprise about 35 thousand images, which are later trimmed down to a few thousand per class to balance the dataset. According to their class, the expressions are classified into seven emotions: angry, happy, fearful, disgusted, surprised, neutral, and sad. The results show that, after training for 40 epochs, the model achieved an accuracy of 81.92% on the training data and 81.69% on the test data.
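To illustrate the kind of computation a CNN performs for this task, the sketch below runs a single convolution filter, ReLU, max-pooling, and a dense softmax layer over the seven emotion classes named in the abstract. This is a minimal NumPy forward pass for illustration only, not the authors' actual architecture; the 48x48 grayscale input size, the 3x3 filter, and the random weights are assumptions.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) of a single-channel image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max-pooling; trims edges that do not fit."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Seven emotion classes from the paper.
EMOTIONS = ["angry", "happy", "fearful", "disgusted", "surprised", "neutral", "sad"]

rng = np.random.default_rng(0)
image = rng.random((48, 48))                   # assumed 48x48 grayscale face
kernel = rng.standard_normal((3, 3))           # one random 3x3 filter (untrained)
W = rng.standard_normal((7, 23 * 23)) * 0.01   # dense layer mapping features to 7 logits

features = max_pool(relu(conv2d(image, kernel)))  # 46x46 -> 23x23 feature map
probs = softmax(W @ features.ravel())             # probability for each emotion
print(EMOTIONS[int(probs.argmax())])
```

In a real implementation the filter and dense weights would be learned by backpropagation over the balanced training set; this sketch only shows how an input face image is reduced to a probability distribution over the seven emotion classes.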