# gpt

Here are 136 public repositories matching this topic...
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Updated Aug 10, 2021 - Python
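This appears to be EleutherAI's GPT-Neo codebase; training itself runs on mesh-tensorflow, but if you only want to sample from one of the publicly released checkpoints, a minimal sketch using the Hugging Face transformers library (the checkpoint name below is an assumption, not something taken from this repository) could look like:

```python
# Minimal sketch: sample from a released GPT-Neo checkpoint via the Hugging Face
# transformers library rather than the mesh-tensorflow training code itself.
# The checkpoint name is an assumption; 125M and 2.7B variants also exist.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neo-1.3B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("GPT-Neo is a family of", return_tensors="pt")
outputs = model.generate(**inputs, max_length=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```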
LightSeq: A High Performance Library for Sequence Processing and Generation
Topics: training, cuda, inference, transformer, accelerate, bart, beam-search, sampling, gpt, bert, multilingual-nmt, diverse-decoding
Updated Aug 12, 2021 - Cuda
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Topics: natural-language-processing, model-zoo, pytorch, classification, bart, chinese, gpt, pegasus, ner, clue, albert, bert, fine-tuning, roberta, elmo, pre-training, gpt-2, t5, unilm, xlm-roberta
Updated Aug 10, 2021 - Python
jb33k commented Jun 4, 2019
I'm playing around with this wonderful code, but I'm running into a curious issue when I try to train the model with my own data. I replicated the personachat_self_original.json file structure and added my own data. I deleted the dataset_cache_OpenAIGPTTokenizer file, but when I try to train, I get this error:
INFO:train.py:Pad inputs and convert to Tensor
Traceback (most recent call last)
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Topics: nlp, rust, machine-learning, translation, deep-learning, sentiment-analysis, transformer, rust-lang, question-answering, bart, gpt, ner, bert, language-generation, electra, roberta, gpt-2
Updated Aug 7, 2021 - Rust
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Updated Jul 15, 2021 - Python
Transformer-related optimization, including BERT and GPT
Updated Aug 6, 2021 - C++
An easy-to-use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving state-of-the-art NLP models.
Topics: nlp, docker, machine-learning, natural-language-processing, deep-learning, gpu, transformers, pytorch, api-rest, easy, gpt, language-models, deep-learning-tutorial, bert, fine-tuning, ulmfit, xlnet
Updated Aug 13, 2021 - Jupyter Notebook
Super UEFIinSecureBoot Disk: Boot any OS or .efi file without disabling UEFI Secure Boot
Updated Apr 22, 2019
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Topics: nlp, tensorflow, text-generation, transformer, openai, gpt, implementation, pre-training, tensorflow2, gpt-2, gpt2, pretraining
Updated Jun 9, 2021 - Python
A React implementation of the Google DFP/GPT API. https://react-dfp.surge.sh
Updated Jul 4, 2021 - JavaScript
QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art Machine Learning models.
Topics: python, nlp, fast, research, ai, deep-learning, neural-network, ml, pytorch, artificial-intelligence, yolo, easy-to-use, object-detection, gpt, dl, bert, tensorflow2, huggingface-transformers, gpt-neo, quickai
Updated Aug 8, 2021 - Python
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Topics: machine-learning, tabular-data, pytorch, artificial-intelligence, transformer, gpt, bert, fraud-detection, icassp, huggingface, credit-card-dataset, prsa-dataset, credit-card-transaction, icassp2021
Updated Jul 21, 2021 - Python
API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend
Updated Aug 8, 2021 - Python
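As a rough illustration of how such a backend could be wired up (this is a hedged sketch, not the repository's actual code: the endpoint path, request schema, and the use of the Hugging Face text-generation pipeline are all assumptions), see below:

```python
# Illustrative sketch of a FastAPI text-generation backend; not taken from the
# repository. Endpoint path, request schema, and model loading are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# GPT-J-6B is very large; swap in a smaller model (e.g. "gpt2") for local testing.
generator = pipeline("text-generation", model="EleutherAI/gpt-j-6B")

class GenerationRequest(BaseModel):
    prompt: str
    max_length: int = 64

@app.post("/generate")
def generate(req: GenerationRequest):
    result = generator(req.prompt, max_length=req.max_length, do_sample=True)
    return {"completion": result[0]["generated_text"]}
```

A Streamlit frontend would then simply POST the user's prompt to the /generate endpoint and render the returned completion.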
Annotations of the interesting ML papers I read
Topics: nlp, machine-learning, deep-learning, transformers, gpt, research-paper, bert, gpt-2, xlnet, annotated-paper, megatron-lm, papers-annotations
Updated Jun 19, 2021
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Topics: nlp, deep-learning, word2vec, tutorials, cnn, pytorch, embeddings, transformer, attention, gpt, torchtext, hugging-face, nlp-papers, deeplearning-nlp-models
Updated Dec 10, 2020 - Jupyter Notebook
Discord AI Chatbot using DialoGPT, trained on the game transcript of The World Ends With You
Updated Jun 20, 2021 - Jupyter Notebook
A Personal Arch Installation Guide In Case of Amnesia
Updated Dec 23, 2020
https://github.com/huggingface/transformers/blob/546dc24e0883e5e9f5eb06ec8060e3e6ccc5f6d7/src/transformers/models/gpt2/modeling_gpt2.py#L698
Assertions can't be relied upon for control flow because they can be disabled (for example, Python's -O flag strips assert statements at compile time).
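As a hedged illustration (not the code at the linked line), the example below shows why an explicit raise is preferred: running Python with the -O flag compiles assert statements away, so a check written as an assertion silently disappears.

```python
# Illustrative only; not the code at the linked modeling_gpt2.py line.
# Under "python -O", assert statements are stripped, so this check vanishes:
def split_heads_with_assert(hidden_size: int, num_heads: int) -> int:
    assert hidden_size % num_heads == 0, "hidden_size must be divisible by num_heads"
    return hidden_size // num_heads

# Preferred: an explicit exception that survives optimized mode.
def split_heads_with_raise(hidden_size: int, num_heads: int) -> int:
    if hidden_size % num_heads != 0:
        raise ValueError(
            f"hidden_size ({hidden_size}) must be divisible by num_heads ({num_heads})"
        )
    return hidden_size // num_heads
```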