Hi, is there any chance the code could be released soon?
BTW, I'm curious about the classification performance of the normalized topic vectors produced by different models, which I think would show whether the topic vectors are actually more representative and discriminative.
Another question: can we remove the KLD from the perplexity computation? The definition of perplexity only involves the two conditional probabilities p(z|d) and p(w|z), so it does not include the KL term of the ELBO. We also aren't really concerned with whether the KLD is low enough that the decoder can generate new samples from draws of the mean-field Gaussian prior (I haven't seen any paper discuss the effect of the KLD for NTMs, and for NTMs the generated vectors only contain log-likelihoods, which I think is less useful).
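To make the question concrete, here is a minimal sketch of what I mean by perplexity from the reconstruction term alone: p(w|d) = Σ_z p(z|d)·p(w|z), with no KL term anywhere. The function name, array shapes, and the small epsilon are my own assumptions for illustration, not taken from this repository.

```python
import numpy as np

def perplexity(doc_word_counts, p_z_given_d, p_w_given_z):
    """Corpus perplexity from the likelihood term only (no KLD).

    doc_word_counts: (D, V) bag-of-words counts per document
    p_z_given_d:     (D, K) topic proportions per document
    p_w_given_z:     (K, V) word distributions per topic
    """
    # Marginalize topics: p(w|d) = sum_z p(z|d) p(w|z), shape (D, V)
    p_w_given_d = p_z_given_d @ p_w_given_z
    # Token-weighted log-likelihood; epsilon guards against log(0)
    log_lik = np.sum(doc_word_counts * np.log(p_w_given_d + 1e-12))
    total_tokens = doc_word_counts.sum()
    # Perplexity = exp(-average per-token log-likelihood)
    return np.exp(-log_lik / total_tokens)
```

As a sanity check, if every topic assigns a uniform distribution over V words, p(w|d) is uniform and the perplexity equals V, as expected.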
Thanks for sharing your excellent work!