COMPARISON OF NEURAL NETWORK ACTIVATION FUNCTIONS ON TIME SERIES DATA

  • Ibnu Akil (*), Universitas Bina Sarana Informatika

  • (*) Corresponding Author
Keywords: activation function, machine learning, neural network

Abstract

The sophistication and success of machine learning in solving problems across the many fields of artificial intelligence cannot be separated from the neural networks that form the basis of its algorithms, and the essence of a neural network lies in its activation function. However, because so many activation functions have emerged recently, it is necessary to find the proper activation function for the model and the dataset being used. In this study, activation functions commonly used in machine learning models, namely ReLU, GELU, and SELU, are tested on time series data in the form of stock prices. The activation functions are implemented in Python with the TensorFlow library, on a model based on a Convolutional Neural Network (CNN). The results show that, with this CNN model, the GELU activation function yields the smallest loss value on the time series data.
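For reference, the three functions compared are ReLU(x) = max(0, x); GELU(x) = x·Φ(x), where Φ is the standard normal CDF; and SELU(x) = λx for x > 0 and λα(e^x − 1) otherwise, with λ ≈ 1.0507 and α ≈ 1.6733. The sketch below illustrates the kind of experiment the abstract describes: a small 1-D CNN regressor whose activation is swapped among the three candidates, trained on a windowed price series and compared by validation loss. It is a minimal illustration, not the paper's code; the architecture, window length, synthetic data, and hyperparameters here are assumptions, and build_model is a hypothetical helper.

```python
# Minimal sketch of the comparison described in the abstract, assuming a
# 1-D CNN regressor and synthetic price data; the paper's actual
# architecture, dataset, and hyperparameters are not given here.
import numpy as np
import tensorflow as tf

def build_model(activation: str, window: int = 30) -> tf.keras.Model:
    """1-D CNN for next-step price prediction with a swappable activation."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window, 1)),
        tf.keras.layers.Conv1D(32, kernel_size=3, activation=activation),
        tf.keras.layers.Conv1D(32, kernel_size=3, activation=activation),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(32, activation=activation),
        tf.keras.layers.Dense(1),  # linear output for regression
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Synthetic random-walk series stands in for the stock prices used in the paper.
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 2000)).astype("float32")
# Min-max scaling is a common preprocessing choice; the paper's may differ.
prices = (prices - prices.min()) / (prices.max() - prices.min())

window = 30
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])[..., None]
y = prices[window:]

for act in ("relu", "gelu", "selu"):  # Keras string names for ReLU, GELU, SELU
    model = build_model(act, window)
    hist = model.fit(X, y, epochs=5, batch_size=32, verbose=0,
                     validation_split=0.2)
    print(f"{act}: final validation loss = {hist.history['val_loss'][-1]:.5f}")
```

Keras accepts "relu", "gelu", and "selu" as activation strings (GELU from TensorFlow 2.4 onward), which makes this kind of like-for-like comparison a one-line swap.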



Published
2023-08-07
How to Cite
Akil, I. (2023). KOMPARASI FUNGSI AKTIVASI NEURAL NETWORK PADA DATA TIME SERIES. INTI Nusa Mandiri, 18(1), 78–83. https://doi.org/10.33480/inti.v18i1.4288