Other articles


  1. RNN generations

    On advice from my uncle, I'm continuing to dial back task difficulty in my RNN experiments.

    Unc's tips (a rough sketch of these changes follows below):

    - Switch to a generation task
    - Try residuals
    - Go deeper
    - Add projections
    - No dropout?
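
    A minimal sketch of those tips, assuming a PyTorch character-level model; the hidden size, layer count, and use of LSTM cells are illustrative guesses, not the post's actual settings. Each layer gets a residual skip plus a projection, the stack goes deeper, and there is no dropout anywhere.

    ```python
    import torch
    import torch.nn as nn

    class ResidualCharRNN(nn.Module):
        """Stacked LSTM with per-layer residual skips and projections (sketch)."""

        def __init__(self, vocab_size: int, hidden_size: int = 256, num_layers: int = 4):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden_size)
            # One LSTM per layer so a residual skip can wrap each one.
            self.layers = nn.ModuleList(
                [nn.LSTM(hidden_size, hidden_size, batch_first=True) for _ in range(num_layers)]
            )
            # Projections keep shapes matching for the residual additions.
            self.projs = nn.ModuleList(
                [nn.Linear(hidden_size, hidden_size) for _ in range(num_layers)]
            )
            self.head = nn.Linear(hidden_size, vocab_size)

        def forward(self, tokens: torch.Tensor) -> torch.Tensor:
            x = self.embed(tokens)               # (batch, seq, hidden)
            for rnn, proj in zip(self.layers, self.projs):
                out, _ = rnn(x)
                x = x + proj(out)                # residual connection, no dropout
            return self.head(x)                  # next-character logits
    ```

    Splitting the stack into one nn.LSTM per layer, rather than one multi-layer LSTM, is what makes the per-layer residual and projection explicit.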

    Let's recreate Karpathy's classic post and train a language model on tiny-shakespeare. We can get the entire dataset, which is a text …
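
    A minimal sketch of fetching the data, assuming the copy of tiny-shakespeare bundled with Karpathy's char-rnn repo (the URL is an assumption about where this post pulls it from) and a plain character-to-integer vocabulary:

    ```python
    import urllib.request

    # Assumed source: the tinyshakespeare file in the karpathy/char-rnn repo.
    URL = "https://raw.githubusercontent.com/karpathy/char-rnn/master/data/tinyshakespeare/input.txt"
    text = urllib.request.urlopen(URL).read().decode("utf-8")

    chars = sorted(set(text))                      # character vocabulary
    stoi = {ch: i for i, ch in enumerate(chars)}   # char -> integer id
    data = [stoi[ch] for ch in text]               # whole corpus as ids

    print(f"{len(text):,} characters, vocab size {len(chars)}")
    ```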

