According to your definition of general_deconv2d(), the default activation is linear. When defining the decoder, you don't add ReLU units after the deconv layers; only the last layer has a non-linearity (tanh). Is this the intended behavior, or am I missing something?
Fader-Networks-Tensorflow/main.py, line 171 in 22cc9bd:

```python
o_d1 = general_deconv2d(o_d0, 512, name="D512_2")
```
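
For clarity, here is a minimal sketch of what I expected the intermediate decoder layers to look like, i.e. with an explicit ReLU after each deconv. It assumes general_deconv2d keeps its linear default; the helper and tensor names below are illustrative, not taken from the repo:

```python
import tensorflow as tf

def decoder_block(x, filters, name):
    # general_deconv2d defaults to a linear activation, so a ReLU would
    # have to be applied explicitly after each intermediate deconv layer.
    h = general_deconv2d(x, filters, name=name)
    return tf.nn.relu(h)

# Intermediate layers with ReLU; only the final layer would use tanh.
# o_d1 = decoder_block(o_d0, 512, name="D512_2")
```

As written in main.py, the intermediate decoder layers appear to stay purely linear, which is what prompted the question.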