TensorFlow.js - Interview Questions
What is the ReLU layer?
The Rectified Linear Unit (ReLU) layer is an activation layer that applies the function f(x) = max(0, x) element-wise to its input. It replaces every negative value with zero and passes positive values through unchanged, which introduces non-linearity into the network while remaining cheap to compute. Note that making the input invariant to small shifts and noise is the role of a pooling (subsampling) layer, not of ReLU.
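A minimal sketch of how this looks in TensorFlow.js, assuming the @tensorflow/tfjs package is installed; both the functional tf.relu() op and the tf.layers.reLU() layer apply the same element-wise rule:

```js
import * as tf from '@tensorflow/tfjs';

// Element-wise ReLU on a tensor: negative values become 0, positive values pass through.
const x = tf.tensor1d([-3, -1, 0, 2, 5]);
tf.relu(x).print(); // [0, 0, 0, 2, 5]

// The same operation expressed as a layer, as it would appear inside a model definition.
const reluLayer = tf.layers.reLU();
reluLayer.apply(x).print(); // [0, 0, 0, 2, 5]
```

In practice the layer form is typically used when building a tf.sequential() or functional model, while tf.relu() is handy for quick tensor-level experiments.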