def forward(self, input):
"""This function is easily defined as the ratio between the hyperbolic
sine and the cosine functions (or expanded, as the ratio of the
half?difference and half?sum of two exponential functions in the
points :math:`z` and :math:`-z`):
.. math:: tanh(z) & = \\frac{sinh(z)}{cosh(z)} \\\\
& = \\frac{e^z - e^{-z}}{e^z + e^{-z}}
Fortunately, numpy provides :meth:`tanh` methods. So in our implementation,
we directly use :math:`\\varphi(x) = \\tanh(x)`.
Parameters
----------
input : float32
    The activation (the summed, weighted input of a neuron).
Returns
-------
float32 in [-1, 1]
The output of the tanh function applied to the activation.
"""
self.last_forward = np.tanh(input)
return self.last_forward
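
Caching the output in ``self.last_forward`` is what makes a cheap backward pass possible, because the derivative of tanh can be written in terms of its own output: :math:`\tanh'(x) = 1 - \tanh(x)^2`. The sketch below is an assumption about the surrounding layer interface (a hypothetical ``Tanh`` class with ``forward`` and ``derivative`` methods), not the library's actual API:

import numpy as np

class Tanh:
    """Hyperbolic tangent activation (minimal hypothetical layer)."""

    def __init__(self):
        self.last_forward = None

    def forward(self, input):
        # Cache the output; the derivative reuses it.
        self.last_forward = np.tanh(input)
        return self.last_forward

    def derivative(self):
        # d/dx tanh(x) = 1 - tanh(x)^2, expressed via the cached output.
        return 1.0 - self.last_forward ** 2

Usage example (hypothetical):

act = Tanh()
out = act.forward(np.array([-2.0, 0.0, 2.0], dtype=np.float32))
grad = act.derivative()  # elementwise 1 - out**2
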