Reading research papers takes up a big chunk of an applied data scientist's time. With this WebApp you can paste or upload a research paper and get a summary within about 15 seconds, enough to decide whether the paper is relevant and worth reading for your use case.
Input: Dropout: A Simple Way to Prevent Neural Networks from Overfitting (the research paper that introduced dropout)

Output (summary): "Dropout is a technique to randomly drop units from the neural network during training. Dropout improves the performance of neural networks by adding noise to its hidden units. This prevents units from co-adapting too much. The idea of dropping out is not limited to feed-forward neural networks."
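The app's actual model is not shown here, but the core idea of condensing a paper into a few key sentences can be sketched with a simple frequency-based extractive summarizer. This is an illustrative stand-in, not the WebApp's implementation: the `summarize` function, its stopword list, and the two-sentence default are all assumptions for the sketch.

```python
import re
from collections import Counter

# Minimal stopword list for the sketch; a real system would use a fuller one.
STOPWORDS = {"the", "a", "an", "is", "of", "to", "in", "and",
             "its", "from", "by", "this", "that", "too", "not"}

def summarize(text, max_sentences=2):
    """Score each sentence by the average corpus frequency of its
    non-stopword terms and return the top sentences in original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOPWORDS]
    freq = Counter(words)
    scored = []
    for i, sent in enumerate(sentences):
        terms = [w for w in re.findall(r"[a-z']+", sent.lower())
                 if w not in STOPWORDS]
        if terms:
            # Average term frequency, so long sentences are not favored.
            scored.append((sum(freq[w] for w in terms) / len(terms), i))
    # Keep the highest-scoring sentences, then restore document order.
    top = sorted(sorted(scored, reverse=True)[:max_sentences],
                 key=lambda pair: pair[1])
    return " ".join(sentences[i] for _, i in top)
```

For example, running `summarize` over a paper's abstract returns its highest-scoring sentences verbatim, which is the extractive analogue of the abstractive summary shown above.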
- Deploy as a WebApp.
- Retrain on the Scisumm dataset to improve performance.