[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
PyTorch implementation for "Training and Inference on Any-Order Autoregressive Models the Right Way", NeurIPS 2022 Oral, TPM 2023 Best Paper Honorable Mention
Implementation of Transformer Encoders / Masked Language Modeling Objective
Measuring Biases in Masked Language Models for PyTorch Transformers. Support for multiple social biases and evaluation measures.
Transformer for Automatic Speech Recognition
Code for a WWW'22 publication.
Use BERTRAM to get single-token embeddings for idioms on the MAGPIE dataset.
A Pre-trained Language Model for Semantic Similarity Measurement of Persian Informal Short Texts
Customized Pretraining for NLG Tasks
Source code and materials for the Advanced Spelling Error Correction project.
Masked Language Modeling demo using XLM-RoBERTa + Gradio/FastAPI