
A Non-Polynomial, Non-Sigmoidal, Bounded and Symmetric Activation Function for Feed-Forward Artificial Neural Networks
Apoorvi Sood1, Pravin Chandra2, Udayan Ghose3

1Apoorvi Sood*, Research Scholar, University School of Information Communication and Technology, Guru Gobind Singh Indraprastha University, N. Delhi, India and Assistant Professor, Netaji Subhash University of Technology, N. Delhi, India.
2Pravin Chandra, University School of Information Communication and Technology, Guru Gobind Singh Indraprastha University, N. Delhi, India.
3Udayan Ghose, University School of Information Communication and Technology, Guru Gobind Singh Indraprastha University, N. Delhi, India.

Manuscript received on September 16, 2019. | Revised manuscript received on September 21, 2019. | Manuscript published on October 10, 2019. | PP: 405-410 | Volume-8 Issue-12, October 2019. | Retrieval Number: L33131081219/2019©BEIESP | DOI: 10.35940/ijitee.L3313.1081219
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Feed-forward artificial neural networks are universal approximators of continuous functions. This property enables the use of these networks to solve learning tasks, which in this paradigm are cast as function approximation problems. The universal approximation results for these networks require at least one hidden layer with non-linear nodes, and further require that the non-linearities be non-polynomial in nature. In this paper, a non-polynomial and non-sigmoidal non-linear function is proposed as a suitable activation function for these networks. The usefulness of the proposed activation function is demonstrated on 12 function approximation tasks. The obtained results show that the proposed activation function outperforms the logistic (log-sigmoid) and hyperbolic tangent activation functions.
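
To illustrate the setting the abstract describes, the following is a minimal sketch of a one-hidden-layer feed-forward network trained by gradient descent to approximate a continuous one-dimensional function. The paper's proposed activation function is not reproduced here; np.tanh is used purely as a stand-in bounded, symmetric, non-polynomial nonlinearity, and all parameter choices (hidden-layer size, learning rate, target function) are illustrative assumptions rather than the authors' experimental configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a continuous function to approximate (illustrative choice).
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(np.pi * x)

# One hidden layer with non-linear nodes, linear output node.
n_hidden = 10
W1 = rng.normal(scale=0.5, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

def act(z):
    # Stand-in activation (assumption): bounded, symmetric, non-polynomial.
    return np.tanh(z)

def act_deriv(z):
    return 1.0 - np.tanh(z) ** 2

lr = 0.05
for epoch in range(5000):
    # Forward pass through the single hidden layer.
    z1 = x @ W1 + b1
    h = act(z1)
    y_hat = h @ W2 + b2

    # Mean squared error gradients (standard back-propagation).
    err = y_hat - y
    dW2 = h.T @ err / len(x)
    db2 = err.mean(axis=0)
    dh = err @ W2.T * act_deriv(z1)
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float(np.mean((y_hat - y) ** 2)))
```

Swapping the body of act (and act_deriv accordingly) is all that would be needed to compare different activation functions on the same approximation task, which mirrors the comparison reported in the paper.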
Keywords: Activation Function, Transfer Function, Squashing Function, Feed-Forward Artificial Neural Networks.
Scope of the Article: Artificial Intelligence