A Novel Bellman-Inspired Gated Activation Mechanism for Dynamic Neural Networks

  • Jincheng Zhang, Faculty of Science and Technology, Rajabhat Maha Sarakham University, Maha Sarakham 44000, Thailand. http://orcid.org/0009-0005-1860-0009

Abstract

In deep learning, the activation function, as the core mechanism of nonlinear transformation, largely determines the upper bound of a model's expressive power. Traditional activation functions, however, are mostly static, point-wise transformations that cannot respond or adapt to the structure of their inputs. This paper proposes a "gated activation mechanism" inspired by the Bellman equation in reinforcement learning, which introduces state valuation and future-reward modeling into the activation stage of a neural network. Experiments were conducted by replacing the activation function in a standard multilayer perceptron (MLP). The results show that the mechanism yields consistent performance improvements without increasing inference overhead. Moreover, the method works as a self-contained module and can be embedded in a wide variety of deep neural networks, offering a new paradigm for activation-function design.
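The abstract does not specify the paper's exact formulation, but the core idea, gating each pre-activation by a Bellman-style value estimate that combines a current-state term with a discounted future-value term, can be sketched as follows. All names here (`gamma`, `w_v`, `w_f`, the `tanh` future-value model) are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def sigmoid(z):
    """Numerically plain logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def bellman_gated_activation(x, w_v, w_f, gamma=0.9):
    """Gate each pre-activation by a Bellman-style value estimate.

    Sketch only: mimics V(s) = r(s) + gamma * V(s') by combining an
    immediate value term with a discounted future-value term, then
    squashing the sum into a (0, 1) gate.

    x     : pre-activation vector, shape (d,)
    w_v   : weights for the immediate value term, shape (d,)
    w_f   : weights for the future value term, shape (d,)
    gamma : discount factor weighting the future term
    """
    v_now = w_v * x                       # immediate "reward" term r(s)
    v_future = gamma * np.tanh(w_f * x)   # discounted future value gamma * V(s')
    gate = sigmoid(v_now + v_future)      # gate in (0, 1)
    return gate * x                       # input-dependent (dynamic) activation

# Example: gate a small random pre-activation vector.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
w_v = np.ones(4)
w_f = np.ones(4)
y = bellman_gated_activation(x, w_v, w_f)
```

Because the gate lies strictly in (0, 1), the output is always a damped copy of the input, which is consistent with the abstract's claim that the mechanism adds no inference overhead beyond the gate computation itself.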

Published
2025-11-25
How to Cite
Zhang, J. (2025). A Novel Bellman-Inspired Gated Activation Mechanism for Dynamic Neural Networks. ITEGAM-JETIA, 11(56), 20-28. https://doi.org/10.5935/jetia.v11i56.2156