An open source implementation of Microsoft's VALL-E X zero-shot TTS model. Demo is available at https://plachtaa.github.io/vallex/
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Code for CRATE (Coding RAte reduction TransformEr).
Implementation of the Swin Transformer in PyTorch.
Minimalist NMT for educational purposes
🌕 [BMVC 2022] You Only Need 90K Parameters to Adapt Light: A Light Weight Transformer for Image Enhancement and Exposure Correction. SOTA for low-light enhancement at 0.004 seconds per image; try it for pre-processing.
CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation
[IGARSS'22]: A Transformer-Based Siamese Network for Change Detection
A novel implementation fusing ViT with Mamba into a fast, agile, high-performance multi-modal model. Powered by Zeta.
The repository of ET-BERT, a network traffic classification model for encrypted traffic. The work was accepted as a paper at The Web Conference (WWW) 2022.
SeqFormer: Sequential Transformer for Video Instance Segmentation (ECCV 2022 Oral)
PyContinual (An Easy and Extendible Framework for Continual Learning)
Attention Is All You Need | a PyTorch Tutorial to Transformers
[IEEE TNNLS 2021] Official code of the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition" (transformer, multi-graph transformer, graph classification, sketch recognition, free-hand sketch)
An Extensible Continual Learning Framework Focused on Language Models (LMs)
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
PyTorch implementation of the model presented in "Satellite Image Time Series Classification with Pixel-Set Encoders and Temporal Self-Attention"
Fine-tuned pre-trained GPT-2 for custom, topic-specific text generation. Such a system can be used for text augmentation.
Official PyTorch implementation of our AAAI22 paper: TransMEF: A Transformer-Based Multi-Exposure Image Fusion Framework via Self-Supervised Multi-Task Learning.
Implementation of the paper "LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens"