In recent deep-learning work, the rectified linear unit (ReLU) is increasingly used as one way to overcome the numerical problems associated with sigmoid activations, such as vanishing gradients when the sigmoid saturates.
https://en.wikipedia.org/wiki/Multilayer_perceptron
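To illustrate the point above, here is a minimal sketch (using only the standard library; the function names are mine, not from the linked article) comparing the gradients of the sigmoid and of ReLU. The sigmoid's derivative is bounded by 0.25 and decays toward zero for large inputs, while ReLU has a constant gradient of 1 on the positive side:

```python
import math

def sigmoid(x):
    """Logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    """Derivative of the sigmoid: sigma(x) * (1 - sigma(x)); at most 0.25."""
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return max(0.0, x)

def relu_grad(x):
    """Subgradient of ReLU: 1 for positive inputs, 0 otherwise."""
    return 1.0 if x > 0 else 0.0

# For large |x| the sigmoid saturates and its gradient vanishes,
# while ReLU keeps a gradient of 1 for any positive input.
for x in (0.0, 5.0, 10.0):
    print(f"x={x:5.1f}  sigmoid'={sigmoid_grad(x):.6f}  relu'={relu_grad(x):.1f}")
```

Stacking many sigmoid layers multiplies these small derivatives together during backpropagation, which is the numerical problem ReLU sidesteps.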