An open platform for operating large language models (LLMs) in production. Fine-tune, serve, deploy, and monitor any LLMs with ease.
PromptCLUE, a zero-shot learning model with support for all Chinese-language tasks
simpleT5 is built on top of PyTorch Lightning
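For context, a minimal sketch of how simpleT5 is typically used, based on its documented train/predict interface; the dataframe contents, paths, and hyperparameters below are illustrative assumptions.

```python
# Minimal simpleT5 sketch (interface as documented in the project's README):
# train_df / eval_df are pandas DataFrames with "source_text" and
# "target_text" columns; hyperparameters are illustrative only.
import pandas as pd
from simplet5 import SimpleT5

train_df = pd.DataFrame({
    "source_text": ["summarize: T5 casts every NLP task as text-to-text."],
    "target_text": ["T5 treats all tasks as text-to-text."],
})

model = SimpleT5()
model.from_pretrained(model_type="t5", model_name="t5-base")
model.train(train_df=train_df, eval_df=train_df,
            source_max_token_len=128, target_max_token_len=64,
            batch_size=8, max_epochs=1, use_gpu=False)
print(model.predict("summarize: T5 casts every NLP task as text-to-text."))
```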
Implementation of a question generator using SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
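As a sketch of how such a generator is usually driven with a seq2seq model: the checkpoint name and the "answer: … context: …" input format below are assumptions, not this repository's exact setup.

```python
# Answer-aware question generation sketch with Hugging Face transformers.
# "your-org/t5-question-generator" is a placeholder; substitute any checkpoint
# fine-tuned for question generation.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "your-org/t5-question-generator"  # hypothetical checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

context = "The T5 model was introduced by Google Research in 2019."
answer = "2019"
# Mark the target answer and provide the surrounding context.
text = f"answer: {answer} context: {context}"
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```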
NLP model zoo for Russian
Abstractive text summarization by fine-tuning seq2seq models.
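A minimal fine-tuning sketch for this kind of abstractive summarization setup, using the Hugging Face Seq2SeqTrainer; the dataset slice and hyperparameters are illustrative, not the repository's configuration.

```python
# Fine-tune a small seq2seq model (T5) for abstractive summarization.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Small slice of XSum purely for illustration.
dataset = load_dataset("xsum", split="train[:1000]")

def preprocess(batch):
    inputs = tokenizer(["summarize: " + d for d in batch["document"]],
                       max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=64, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = dataset.map(preprocess, batched=True,
                        remove_columns=dataset.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="t5-summarizer",
                                  per_device_train_batch_size=8,
                                  num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```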
Examples of inference and fine-tuning T5, GPT-2 and ruGPT-3 models
Materials for "IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation"
This repository contains the data and code for the paper "Diverse Text Generation via Variational Encoder-Decoder Models with Gaussian Process Priors" (SPNLP@ACL2022)
An extension of the Transformers library that adds a T5ForSequenceClassification class.
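The idea behind such a class is to reuse the T5 encoder and attach a small classification head. The pooling strategy and head below are assumptions for illustration, not the repository's exact design.

```python
# Sketch of a T5-based sequence classifier: T5 encoder + linear head.
import torch.nn as nn
from transformers import AutoTokenizer, T5EncoderModel

class T5SequenceClassifierSketch(nn.Module):
    def __init__(self, model_name="t5-base", num_labels=2):
        super().__init__()
        self.encoder = T5EncoderModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.encoder.config.d_model, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Mean-pool over non-padding tokens, then classify.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
        return self.classifier(pooled)

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = T5SequenceClassifierSketch()
batch = tokenizer(["great movie", "terrible movie"],
                  padding=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
```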
Automated headline generation and aspect-based sentiment analysis
End-to-end model: fine-tuned T5 for the text-to-SPARQL task
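Inference with such a model typically looks like the sketch below; the checkpoint name and task prefix are placeholders rather than this repository's actual values.

```python
# Translate a natural-language question into SPARQL with a fine-tuned T5.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "your-org/t5-text-to-sparql"  # hypothetical checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

question = "Who directed the movie Inception?"
inputs = tokenizer("translate English to SPARQL: " + question,
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=96, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```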
A full-text error corrector for English based on transformers and deep learning
Flan-T5, GODEL, and GPT-2 chatbots without any input or output filters
Implementing 5 Different Approaches To Augmenting Data For Natural Language Processing Tasks.
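One of the common augmentation approaches, synonym replacement via WordNet, sketched below for context; this is a generic illustration, not necessarily the repository's exact implementation.

```python
# Synonym-replacement augmentation: swap a few words for WordNet synonyms.
import random
import nltk
from nltk.corpus import wordnet

nltk.download("wordnet", quiet=True)

def synonym_replace(sentence, n=2):
    words = sentence.split()
    candidates = [i for i, w in enumerate(words) if wordnet.synsets(w)]
    random.shuffle(candidates)
    for i in candidates[:n]:
        lemmas = [l for l in wordnet.synsets(words[i])[0].lemma_names()
                  if l.lower() != words[i].lower()]
        if lemmas:
            words[i] = lemmas[0].replace("_", " ")
    return " ".join(words)

print(synonym_replace("The quick brown fox jumps over the lazy dog"))
```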
Official repository of Generating Multiple-Length Summaries via Reinforcement Learning for Unsupervised Sentence Summarization [EMNLP'22 Findings]
Tutorial for text classification by fine-tuning a T5 model on TPUs.
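The core data framing for this kind of tutorial is casting classification as text-to-text, with labels verbalized as target strings; whether training runs on TPU is a launcher/runtime detail (e.g. via torch_xla). The label verbalizer and prefix below are illustrative assumptions.

```python
# Cast text classification as text-to-text for T5: labels become target text.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
label_words = {0: "negative", 1: "positive"}  # illustrative label verbalizer

def to_text_to_text(example):
    enc = tokenizer("classify sentiment: " + example["text"],
                    max_length=128, truncation=True)
    enc["labels"] = tokenizer(text_target=label_words[example["label"]]).input_ids
    return enc

sample = {"text": "An instant classic.", "label": 1}
print(to_text_to_text(sample)["labels"])
```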