
Missing ReLU in deconv layers? #4

@davidtellez

Description

According to your definition of general_deconv2d(), the default activation is linear. When defining the decoder, you don't add ReLU units after the deconv layers; only the last layer has a non-linearity (tanh). Is this the intended behavior? What am I missing? For example, this line uses the default (linear) activation:

o_d1 = general_deconv2d(o_d0, 512, name="D512_2")
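To make the question concrete, this is a minimal sketch of what I would have expected, assuming a Keras-style helper with an explicit activation argument (this is not your actual implementation; the signature and defaults here are only illustrative):

```python
import tensorflow as tf

def general_deconv2d(x, filters, kernel_size=4, stride=2,
                     activation=None, name="deconv"):
    """Transposed convolution followed by an optional activation (sketch only)."""
    y = tf.keras.layers.Conv2DTranspose(
        filters, kernel_size, strides=stride, padding="same", name=name)(x)
    if activation is not None:
        y = activation(y)  # e.g. tf.nn.relu for hidden layers, tf.nn.tanh for the output
    return y

# Hypothetical decoder stage: ReLU after the intermediate deconv, tanh only at the end.
inputs = tf.keras.Input(shape=(8, 8, 1024))
o_d1 = general_deconv2d(inputs, 512, activation=tf.nn.relu, name="D512_2")
o_out = general_deconv2d(o_d1, 3, activation=tf.nn.tanh, name="D3_out")
```

In other words, I expected a ReLU (or similar non-linearity) after every intermediate deconv, with tanh reserved for the final layer, rather than purely linear intermediate layers.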
