list of efficient attention modules (Python; updated Aug 23, 2021)
Convert pretrained RoBERTa models to various long-document transformer models
This repository implements a novel approach to detecting Initial Public Offering (IPO) underpricing using pre-trained Transformers. The models, extended to handle long S-1 filings, leverage both textual information and financial indicators, outperforming traditional machine learning methods.
Industrial Text Scoring using Multimodal Deep Natural Language Processing 🚀 | Code for IEA AIE 2022 paper
A summarization website that can generate summaries from either YouTube videos or PDF files.
This project applies the Longformer model to sentiment analysis on the IMDB movie review dataset. The Longformer, introduced in "Longformer: The Long-Document Transformer," handles long documents through sliding-window and global attention mechanisms. The implementation uses PyTorch and follows the paper's architecture.
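The sliding-window plus global attention pattern mentioned above can be illustrated with a small NumPy sketch. This is a hypothetical, simplified mask builder for explanation only; the actual Longformer implementation fuses this pattern into optimized attention kernels rather than materializing a dense boolean matrix.

```python
import numpy as np

def longformer_attention_mask(seq_len, window, global_idx):
    """Illustrative sketch: combine a sliding local window with
    full ("global") attention at selected token positions.
    mask[i, j] is True if token i may attend to token j."""
    mask = np.zeros((seq_len, seq_len), dtype=bool)
    for i in range(seq_len):
        # each token attends to a local window centered on itself
        lo = max(0, i - window // 2)
        hi = min(seq_len, i + window // 2 + 1)
        mask[i, lo:hi] = True
    for g in global_idx:
        # a global token (e.g. [CLS]) attends everywhere,
        # and every token attends back to it
        mask[g, :] = True
        mask[:, g] = True
    return mask

# 8 tokens, window of 3, position 0 (e.g. [CLS]) marked global
mask = longformer_attention_mask(seq_len=8, window=3, global_idx=[0])
print(mask.astype(int))
```

Because the local window is fixed, the cost of attention grows linearly with sequence length instead of quadratically, which is what makes 4096-token inputs practical.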
A WebApp to summarize research papers using HuggingFace Transformers.
AI Large Document processor using Longformer LLM, FastAPI
Training and inference code for the claim veracity checker built on Longformer-4096 tuned to PUBHEALTH