LightSeq: A High Performance Library for Sequence Processing and Generation
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Self-contained Machine Learning and Natural Language Processing library in Go
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
MinT: Minimal Transformer Library and Tutorials
Build and train state-of-the-art natural language processing models using BERT
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
Cybertron: the home planet of the Transformers in Go
Multilingual/multidomain datasets, models, and a Python library for question generation.
BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese (INTERSPEECH 2022)
Code for EMNLP 2021 paper "Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization"
A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Abstractive and extractive text summarization using Transformers.
Source code and dataset for "Call for Customized Conversation: Customized Conversation Grounding Persona and Knowledge"
The first large-scale natural language generation benchmark for Indonesian, Sundanese, and Javanese, providing multiple downstream tasks, pre-trained IndoGPT and IndoBART models, and starter code. (EMNLP 2021)
Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022)
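Many of the repositories above fine-tune or serve BART-style sequence-to-sequence models. As a point of reference, here is a minimal sketch of abstractive summarization with a pretrained BART checkpoint via the Hugging Face transformers library; the checkpoint name and generation settings are illustrative assumptions, not code from any listed project.

```python
# Minimal sketch: abstractive summarization with a pretrained BART checkpoint
# using Hugging Face transformers. "facebook/bart-large-cnn" and the
# generation settings below are illustrative defaults, not taken from any
# repository listed above.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

article = (
    "BART is a denoising sequence-to-sequence model pretrained by corrupting "
    "text and learning to reconstruct it, and it is widely fine-tuned for "
    "abstractive summarization."
)

# Tokenize the input, generate a summary with beam search, and decode it.
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    max_length=128,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```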