Description
In the decoder, a standard LSTM is used:
Pytorch-Sketch-RNN/sketch_rnn.py
Lines 151 to 157 in 5c3e213
class DecoderRNN(nn.Module):
    def __init__(self):
        super(DecoderRNN, self).__init__()
        # to init hidden and cell from z:
        self.fc_hc = nn.Linear(hp.Nz, 2*hp.dec_hidden_size)
        # unidirectional lstm:
        self.lstm = nn.LSTM(hp.Nz+5, hp.dec_hidden_size, dropout=hp.dropout)
However, in the original paper, the description of the architecture on page 6 states:
For the decoder RNN, we use HyperLSTM, as this type of RNN cell excels at sequence generation tasks
This refers to a very different LSTM variant, one that generates new weights for itself for every element in a sequence. The model is defined in this paper, and implementation details are given in Appendix Sections 2.2 and 2.3.
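To illustrate the difference, here is a minimal, simplified sketch of the HyperLSTM idea (not the paper's exact formulation, and not code from this repo): a small auxiliary LSTMCell observes the input and the main hidden state at each step and emits per-gate scale vectors that modulate the main cell's preactivations, so the effective weights change at every timestep. All names and sizes (`hyper_size`, `embed_size`, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HyperLSTMCellSketch(nn.Module):
    """Simplified HyperLSTM-style cell (illustrative, not the paper's exact model):
    an auxiliary LSTMCell produces per-timestep scale vectors for the main
    cell's four gate preactivations, emulating step-dependent weights."""

    def __init__(self, input_size, hidden_size, hyper_size=32, embed_size=8):
        super().__init__()
        self.hidden_size = hidden_size
        # main-cell parameters (4 gates: i, f, g, o)
        self.w_x = nn.Linear(input_size, 4 * hidden_size, bias=False)
        self.w_h = nn.Linear(hidden_size, 4 * hidden_size, bias=True)
        # auxiliary hyper-cell sees [x, h] and drives the weight scaling
        self.hyper = nn.LSTMCell(input_size + hidden_size, hyper_size)
        # project hyper state to low-dim embeddings, then to per-gate scales
        self.z_x = nn.Linear(hyper_size, embed_size)
        self.z_h = nn.Linear(hyper_size, embed_size)
        self.d_x = nn.Linear(embed_size, 4 * hidden_size)
        self.d_h = nn.Linear(embed_size, 4 * hidden_size)

    def forward(self, x, state, hyper_state):
        h, c = state
        # advance the auxiliary cell first
        hh, hc = self.hyper(torch.cat([x, h], dim=-1), hyper_state)
        # per-timestep scales: effectively a new weight matrix at each step
        sx = self.d_x(self.z_x(hh))
        sh = self.d_h(self.z_h(hh))
        pre = sx * self.w_x(x) + sh * self.w_h(h)
        i, f, g, o = pre.chunk(4, dim=-1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return (h, c), (hh, hc)
```

A plain `nn.LSTM`, as used in `DecoderRNN` above, applies the same weight matrices at every timestep; the hyper-cell's scaling is what the paper argues helps with sequence generation.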