
All Questions

1 vote
0 answers
394 views

Dataset format for fine-tuning the deepset/roberta-base-squad2 Hugging Face transformer model

I have been trying to fine-tune the RoBERTa model for QnA on my specific domain (healthcare). I am unable to find the correct way to provide the dataset format to the tokenizer in order to fine-tune ...
Tushar Sethi
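A minimal sketch of the SQuAD-style record layout and tokenizer call usually used to fine-tune deepset/roberta-base-squad2; the healthcare record and every field value below are invented for illustration.

    # Sketch: SQuAD v2-style records (question, context, answers with
    # character-level start positions); the example data is made up.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("deepset/roberta-base-squad2")

    examples = {
        "question": ["What does the drug treat?"],
        "context": ["Metformin is commonly prescribed to treat type 2 diabetes."],
        "answers": [{"text": ["type 2 diabetes"], "answer_start": [42]}],
    }

    # Tokenize question/context pairs; offset mappings let the character-level
    # answer spans be converted into token start/end positions for the QA head.
    encoded = tokenizer(
        examples["question"],
        examples["context"],
        truncation="only_second",
        max_length=384,
        stride=128,
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        padding="max_length",
    )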
0 votes
0 answers
41 views

Which steps are involved in sentiment analysis with Hugging Face Transformers?

I want to perform a sentiment analysis of a dataset of (Spanish) tweets about COVID-19 vaccines. I've already scraped the tweets and identified a pretrained model I can use for Spanish. What I don't ...
LeLuc
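A rough sketch of the usual inference step, assuming a pretrained Spanish sentiment checkpoint from the Hub; the model name and tweets below are placeholders rather than the ones from the question.

    # Sketch only: swap in whichever Spanish checkpoint was identified.
    from transformers import pipeline

    classifier = pipeline(
        "sentiment-analysis",
        model="finiteautomata/beto-sentiment-analysis",  # placeholder choice
    )

    tweets = [
        "La vacuna me dio mucha tranquilidad.",
        "No confío en esta vacuna todavía.",
    ]

    for tweet, result in zip(tweets, classifier(tweets)):
        # Each result is a dict such as {"label": "POS", "score": 0.98}.
        print(tweet, result["label"], round(result["score"], 3))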
1 vote
1 answer
3k views

How to do NER predictions with a Hugging Face BERT transformer

I am trying to do predictions on a test data set without any labels for an NER problem. Here is some background: I am doing named entity recognition using TensorFlow and Keras. I am using Hugging Face ...
Khachatur Mirijanyan
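A minimal sketch of label-free inference, assuming a fine-tuned TensorFlow token-classification checkpoint saved under the placeholder path "my-ner-model".

    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForTokenClassification

    tokenizer = AutoTokenizer.from_pretrained("my-ner-model")  # placeholder path
    model = TFAutoModelForTokenClassification.from_pretrained("my-ner-model")

    text = "Angela Merkel visited the Charité hospital in Berlin."
    inputs = tokenizer(text, return_tensors="tf")

    # Logits have shape (batch, sequence_length, num_labels); the predicted
    # label id for each token is the argmax over the last axis.
    logits = model(**inputs).logits
    pred_ids = tf.argmax(logits, axis=-1)[0].numpy()

    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].numpy().tolist())
    for token, pred_id in zip(tokens, pred_ids):
        print(token, model.config.id2label[int(pred_id)])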
1 vote
0 answers
125 views

Top-K vs AUC - communicating results and next steps [closed]

I have a bi-LSTM multi-label text classification model which, when trained on a highly imbalanced dataset with 1000 possible labels, gives a top-k (k=5) categorical accuracy of 86% and a focal loss of ...
ML_Engine
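A small sketch of tracking top-k accuracy and AUC side by side in Keras, assuming a simplified stand-in for the bi-LSTM classifier described above; the vocabulary and layer sizes are placeholders.

    import tensorflow as tf

    num_labels = 1000
    vocab_size = 20000  # placeholder vocabulary size

    # Simplified stand-in for the bi-LSTM multi-label classifier.
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, 128),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
        tf.keras.layers.Dense(num_labels, activation="sigmoid"),
    ])

    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",  # stand-in for the focal loss in the question
        metrics=[
            # Mirrors the top-k (k=5) metric reported in the question.
            tf.keras.metrics.TopKCategoricalAccuracy(k=5),
            # Per-label ROC AUC, averaged across the 1000 labels.
            tf.keras.metrics.AUC(multi_label=True, num_labels=num_labels),
        ],
    )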
1 vote
0 answers
169 views

How to apply pruning to a BERT model?

I have trained a BERT model using ktrain (a TensorFlow wrapper) to recognize emotion in text. It works, but it suffers from really slow inference, which makes my model unsuitable for a production ...
Stamatis Tiniakos
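A minimal sketch of magnitude pruning with torch.nn.utils.prune, assuming a PyTorch export of the BERT checkpoint; the path "my-emotion-bert" is a placeholder, and this does not cover the ktrain/TensorFlow wrapper directly.

    import torch.nn as nn
    import torch.nn.utils.prune as prune
    from transformers import AutoModelForSequenceClassification

    model = AutoModelForSequenceClassification.from_pretrained("my-emotion-bert")

    for module in model.modules():
        if isinstance(module, nn.Linear):
            # Zero out the 30% of weights with the smallest absolute value.
            prune.l1_unstructured(module, name="weight", amount=0.3)
            # Make the pruning permanent (removes the mask reparametrization).
            prune.remove(module, "weight")

    model.save_pretrained("my-emotion-bert-pruned")

Note that zeroing weights in this unstructured way rarely speeds up dense inference on its own, so quantization or distillation may be a more practical route for the latency problem described.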