Course talk:CPSC522/Variational Recurrent Neural Networks
Contents
| Thread title | Replies | Last modified |
|---|---|---|
| Critique | 0 | 20:37, 19 February 2023 |
| Critique | 0 | 05:23, 13 February 2023 |
Overall, this is a wonderful page on VRNNs, which model sequential data by introducing non-determinism into the hidden state of an RNN, incorporating a VAE at every timestep (see the sketch at the end of this critique).
1. In addition to linking the source when a figure is clicked, please provide it in the caption as well.
2. The formulas were easy to follow, and each variable was clearly explained.
3. Just a suggestion, if you like: add separate high-level illustrations of STORN and VRNN under their respective sections. A detailed view of a VRNN cell at timestep t would also help the reader see the clear distinction from a standard RNN.
4. Could you provide a section on other downstream tasks where VRNNs could be used? For example, what are your thoughts on Machine Translation?
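To make the "VAE at every timestep" idea concrete, here is a minimal PyTorch sketch of a single VRNN cell. It is an illustrative simplification rather than the page's exact architecture: the one-layer prior/encoder/decoder networks, the GRU update, and the diagonal-Gaussian KL are my own assumptions.

```python
import torch
import torch.nn as nn

class VRNNCell(nn.Module):
    """One timestep of a VRNN: a VAE whose prior, encoder, and decoder
    are all conditioned on the RNN hidden state h_{t-1}."""
    def __init__(self, x_dim, z_dim, h_dim):
        super().__init__()
        self.prior = nn.Linear(h_dim, 2 * z_dim)            # p(z_t | h_{t-1})
        self.encoder = nn.Linear(x_dim + h_dim, 2 * z_dim)  # q(z_t | x_t, h_{t-1})
        self.decoder = nn.Linear(z_dim + h_dim, x_dim)      # p(x_t | z_t, h_{t-1})
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)         # h_t = f(x_t, z_t, h_{t-1})

    def forward(self, x_t, h):
        prior_mu, prior_logvar = self.prior(h).chunk(2, dim=-1)
        enc_mu, enc_logvar = self.encoder(torch.cat([x_t, h], -1)).chunk(2, dim=-1)
        # Reparameterization trick: sample z_t from the approximate posterior.
        z_t = enc_mu + torch.randn_like(enc_mu) * (0.5 * enc_logvar).exp()
        x_recon = self.decoder(torch.cat([z_t, h], -1))
        h_next = self.rnn(torch.cat([x_t, z_t], -1), h)
        # KL(q || p) between two diagonal Gaussians, summed over z dimensions;
        # unlike a plain VAE, the prior here depends on h, so the KL does too.
        kl = 0.5 * (prior_logvar - enc_logvar
                    + (enc_logvar.exp() + (enc_mu - prior_mu) ** 2)
                    / prior_logvar.exp() - 1).sum(-1)
        return x_recon, h_next, kl

# Usage: unroll the cell over a toy sequence, accumulating the KL terms.
cell = VRNNCell(x_dim=8, z_dim=4, h_dim=16)
h = torch.zeros(1, 16)
for x_t in torch.randn(5, 1, 8):  # sequence length 5, batch size 1
    x_recon, h, kl = cell(x_t, h)
```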
I think the article is a good length and explains the important details of how the lower bound changes based on the architecture. I can also clearly see the contribution of one paper over the other!
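For reference, assuming the page follows the standard VRNN formulation from Chung et al. (2015), the timestep-factorized lower bound being compared is

```latex
\mathcal{L} = \mathbb{E}_{q(z_{\le T} \mid x_{\le T})}\!\left[
    \sum_{t=1}^{T} \Big( \log p(x_t \mid z_{\le t}, x_{<t})
    - \mathrm{KL}\big( q(z_t \mid x_{\le t}, z_{<t}) \,\|\, p(z_t \mid x_{<t}, z_{<t}) \big) \Big)
\right]
```

If the page follows the original papers, STORN's bound differs mainly in that its prior over z_t is a fixed standard normal rather than conditioned on the hidden state, which is exactly the contrast between the two contributions.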
Abstract
- A sentence could be added to mention that a comparison between the performance of the two architectures is done at the end.
Stochastic Recurrent Networks
- I am wondering what g represents.
Figures
- The figures could be made bigger.
- It may also be helpful to move the graphical representation of STORN and VRNN to sit between the two sections so that the reader comes across it earlier.
- It may be helpful to mention that it is a condensed representation and there may be multiple features, hidden states, and k latent variables.
Minor Corrections
Variational Recurrent Neural Networks
- In general, this function is a highly flexible, such as a neural network. -> In general, this function is highly flexible, such as a neural network.