Artificial Intelligence (AI) Mastering Development

Why does DeconvNet use ReLU in the backward pass?

Why does DeconvNet (Zeiler & Fergus, 2014) apply ReLU in the backward pass (after unpooling)? Aren't the feature map values already positive because of the ReLU in the forward pass? So why do the authors apply ReLU again on the way back to the input?
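A minimal sketch of where negative values come back, using a 1-D "convolution" written as a matrix multiply for simplicity (the matrix `W` and vectors here are hypothetical, not from the paper). The forward feature map `a` is non-negative after ReLU, but the deconv step projects it back through the transposed filters `W.T`, whose weights have mixed signs, so the reconstruction can go negative again:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical layer filters with mixed-sign weights
W = np.array([[-1.0, 1.0],
              [-1.0, 2.0]])
x = np.array([1.0, 1.0])   # non-negative input activations

# Forward pass: conv then ReLU -> non-negative feature map
a = relu(W @ x)            # [0.0, 1.0]

# Backward (deconv) pass: project back through the transposed filters.
# Although `a` is non-negative, W.T contains negative weights, so the
# reconstruction picks up negative entries again.
recon = W.T @ a            # [-1.0, 2.0] -> contains a negative value
print((recon < 0).any())   # True

# This is why DeconvNet re-applies ReLU before passing the signal
# down to the next deconv layer.
recon_pos = relu(recon)    # [0.0, 2.0]
```

So the second ReLU is not redundant: positivity established in the forward pass is lost as soon as the signal passes back through the transposed filters, and reapplying ReLU keeps the reconstruction consistent with the (non-negative) activations the forward network actually produces.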

