
Why does DeconvNet use ReLU in the backward pass?

Why does DeconvNet (Zeiler & Fergus, 2014) apply a ReLU in the backward pass, after unpooling? Aren't the feature-map values already positive because of the ReLU applied in the forward pass? So why do the authors apply the ReLU again on the way back toward the input?

Ref: Zeiler & Fergus, Visualizing and Understanding Convolutional Networks, https://arxiv.org/abs/1311.2901
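A minimal NumPy sketch of the distinction the question is about, using made-up illustrative arrays (`forward_input` and `backward_signal` are hypothetical values, not from the paper): standard backprop gates the backward signal by where the forward input to the ReLU was positive, whereas the DeconvNet rule applies a ReLU to the backward (reconstruction) signal itself. The reconstruction can turn negative after passing through the transposed filters, even though the forward feature maps were non-negative, which is presumably why the extra ReLU is there.

```python
import numpy as np

# Hypothetical values for illustration only.
forward_input = np.array([-1.0, 2.0, -0.5, 3.0])    # pre-ReLU activations from the forward pass
backward_signal = np.array([0.4, -1.2, 0.7, -0.3])  # reconstruction arriving from the layer above

# Standard backprop through ReLU: zero the backward signal wherever the
# forward input was negative (gate by the forward pass).
backprop_rule = backward_signal * (forward_input > 0)

# DeconvNet "backward ReLU": apply ReLU to the backward signal itself
# (gate by the sign of the reconstruction, ignoring the forward pass).
deconvnet_rule = np.maximum(backward_signal, 0)

print(backprop_rule)    # [ 0.  -1.2  0.  -0.3]
print(deconvnet_rule)   # [0.4 0.  0.7 0. ]
```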
