ReLU

Group: 5 #group-5

Relations

  • Activation Functions: ReLU (Rectified Linear Unit) is an activation function used in neural networks. It computes ReLU(x) = max(0, x), passing positive inputs through unchanged and mapping negative inputs to zero.
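The relation above can be illustrated with a minimal sketch (the function name `relu` here is illustrative, not from any particular library):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negatives become 0, positives pass through.
    return np.maximum(0, x)

# Works on scalars and arrays alike.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```

In practice, frameworks provide this directly (e.g. `torch.nn.ReLU` in PyTorch), but the operation itself is just this element-wise maximum.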