text-rnn lets you build neural network architectures that use techniques such as skip-embedding and attention weighting. Train either a bidirectional or standard LSTM recurrent neural network to generate text from any dataset, or continue training a pre-trained model.
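For orientation, the sketch below shows what a character-level LSTM text generator of this kind typically looks like in PyTorch. It is an illustrative example, not the text-rnn API; the vocabulary size, layer sizes, and bidirectional flag are assumptions.

```python
# Illustrative character-level LSTM text generator (not the text-rnn API).
# Vocabulary size, embedding/hidden sizes, and the bidirectional flag are assumptions.
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, bidirectional=False):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=bidirectional)
        out_dim = hidden_dim * (2 if bidirectional else 1)
        self.head = nn.Linear(out_dim, vocab_size)   # per-step logits over characters

    def forward(self, tokens, state=None):
        x = self.embed(tokens)            # (batch, seq, embed_dim)
        out, state = self.lstm(x, state)  # (batch, seq, out_dim)
        return self.head(out), state

# Train with cross-entropy on next-character prediction (dummy data shown here).
model = CharLSTM(vocab_size=100)
tokens = torch.randint(0, 100, (8, 64))   # batch of character ids
logits, _ = model(tokens[:, :-1])
loss = nn.functional.cross_entropy(logits.reshape(-1, 100), tokens[:, 1:].reshape(-1))
```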
GPTs trained on the Shakespeare dataset. Includes a small 10.8M-parameter GPT following Andrej Karpathy's video lecture, and a Universal Transformer with Adaptive Computation Time.
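As a rough illustration of the halting mechanism behind Adaptive Computation Time, the sketch below applies one shared Transformer layer repeatedly and lets each position stop once its cumulative halting probability passes a threshold. The class name, layer choice, and hyperparameters are assumptions, not the repository's code.

```python
# Sketch of per-position Adaptive Computation Time (ACT) halting, in the spirit of
# Graves (2016) and the Universal Transformer (Dehghani et al., 2018).
# Class name, layer choice, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class ACTEncoder(nn.Module):
    def __init__(self, d_model=128, nhead=4, max_steps=8, eps=0.01):
        super().__init__()
        # One layer whose weights are shared across all recurrent steps.
        self.layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.halt = nn.Linear(d_model, 1)   # per-position halting unit
        self.max_steps, self.eps = max_steps, eps

    def forward(self, x):
        halting = torch.zeros(x.shape[:2], device=x.device)  # cumulative halt probability
        weighted = torch.zeros_like(x)                        # probability-weighted states
        for _ in range(self.max_steps):
            still_running = (halting < 1.0 - self.eps).float()
            p = torch.sigmoid(self.halt(x)).squeeze(-1) * still_running
            # Positions crossing the threshold spend their remaining probability budget.
            new_halted = ((halting + p) > 1.0 - self.eps).float() * still_running
            p = p * (1.0 - new_halted) + (1.0 - halting) * new_halted
            halting = halting + p
            x = self.layer(x)                                 # shared-weight update step
            weighted = weighted + p.unsqueeze(-1) * x
            if bool((halting >= 1.0 - self.eps).all()):
                break
        return weighted

enc = ACTEncoder()
out = enc(torch.randn(2, 10, 128))   # (batch, seq, d_model)
```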