Abstract

Neural networks are a popular method in machine learning research, and activation functions, particularly ReLU and Tanh, play an important role in minimizing the error between the output layer and the target class. Varying the number of hidden layers and the number of neurons in each hidden layer, this study analyzes eight models for classifying the Titanic survivor dataset. The results show that the ReLU function performs better than the Tanh function, as indicated by its higher average accuracy and precision. Adding more hidden layers does not improve classification performance, as seen in the decrease in average accuracy and precision from the models with 3 hidden layers to the models with 4 hidden layers. The highest accuracy was obtained by the model using the ReLU activation function with 4 hidden layers and 50 neurons per hidden layer, while the highest precision was obtained by the model using the ReLU activation function with 4 hidden layers and 100 neurons per hidden layer.
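For reference, the two activation functions compared in the abstract can be sketched as follows. This is a minimal NumPy illustration of ReLU and Tanh themselves, not the authors' classification models:

```python
import numpy as np

def relu(x):
    """ReLU: passes positive inputs unchanged, zeroes out negatives."""
    return np.maximum(0.0, x)

def tanh(x):
    """Tanh: squashes inputs into (-1, 1); saturates for large |x|."""
    return np.tanh(x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # [0.  0.  0.  0.5 2. ]
print(tanh(x))  # values in (-1, 1), approaching -1 and 1 at the extremes
```

Because ReLU does not saturate for positive inputs, its gradients stay large during training, which is one common explanation for the kind of performance edge over Tanh reported here.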
