
embeddings

Here are 703 public repositories matching this topic...

bloodwass commented Jun 17, 2019

Expected Behavior

I want to convert torch.nn.Linear modules to weight-drop linear modules in my model (which may be large), and I want to train the model with multiple GPUs. However, I get a RuntimeError in my sample code. First, I have a _weight_drop() function that drops some of the weights of a torch.nn.Linear layer (see the code below).

Actual Behavior

RuntimeError: arguments are located on different GPUs at /
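The _weight_drop() code referenced above is not included in this excerpt. As a rough illustration only (not the poster's actual code), a weight-drop linear layer can be written so that the dropped weight is computed inside forward() rather than by reassigning module attributes; attribute-reassignment variants can lead to device-mismatch errors under nn.DataParallel, since replicas may end up using a weight tensor that still lives on the original GPU. A minimal sketch, assuming PyTorch, with the class name WeightDropLinear chosen here for illustration:

```python
# Minimal sketch (assumption, not the issue's original _weight_drop code):
# a Linear layer with DropConnect-style weight dropout computed in forward(),
# which keeps the dropped weight on the same device as the replicated module.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightDropLinear(nn.Linear):
    def __init__(self, in_features, out_features, weight_dropout=0.5, bias=True):
        super().__init__(in_features, out_features, bias=bias)
        self.weight_dropout = weight_dropout

    def forward(self, input):
        # Drop individual weight entries (not activations) on each forward pass.
        dropped_weight = F.dropout(
            self.weight, p=self.weight_dropout, training=self.training
        )
        return F.linear(input, dropped_weight, self.bias)


# Usage sketch: wrap the model in nn.DataParallel for multi-GPU training.
if __name__ == "__main__":
    model = nn.Sequential(WeightDropLinear(16, 8), nn.ReLU(), WeightDropLinear(8, 4))
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)
    x = torch.randn(32, 16, device=device)
    out = model(x)
```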

philippmwirth commented Sep 24, 2021

Add default parameters for all projection heads

It's helpful to know the default parameters used in the papers when getting started. We should add the default projection head parameters that were used for pre-training on ImageNet to all projection and prediction heads in lightly/models/modules/heads.py.
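As a reference point (not a patch to heads.py), the SimCLR paper's ImageNet setup uses a two-layer MLP projection head that maps the 2048-dimensional ResNet-50 features through a 2048-dimensional hidden layer to a 128-dimensional output. A minimal sketch of such paper defaults, with the class name chosen here for illustration rather than taken from the lightly codebase:

```python
# Illustrative sketch of SimCLR-style projection head defaults for ImageNet
# (2048 -> 2048 -> 128 with a ResNet-50 backbone); not the actual
# lightly/models/modules/heads.py implementation.
import torch.nn as nn


class SimCLRProjectionHeadSketch(nn.Module):
    def __init__(self, input_dim: int = 2048, hidden_dim: int = 2048,
                 output_dim: int = 128):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, output_dim),
        )

    def forward(self, x):
        return self.layers(x)
```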

Chinese long-text classification, short-sentence classification, multi-label classification, and two-sentence similarity (Chinese Text Classification with Keras NLP: multi-label classification and sentence classification, long or short), plus base classes for building word/character/sentence embedding layers (embeddings) and network layers (graph): FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, Bert, Xlnet, Albert, Attention, DeepMoji, HAN, CapsuleNet (capsule network), Transformer-encode, Seq2seq, SWEM, LEAM, TextGCN.

  • Updated Sep 2, 2021
  • Python
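The description above mentions base classes for word/character/sentence embedding layers. As a generic illustration only (not this repository's actual API), a Keras text classifier typically starts from an Embedding layer over a tokenized vocabulary, for example in a small TextCNN-style model:

```python
# Generic Keras sketch (assumption: vocabulary size, dimensions, and class
# count are placeholders, not values taken from the repository above).
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000   # number of tokens (characters or words)
MAX_LEN = 100        # padded sequence length
EMBED_DIM = 300      # embedding dimension
NUM_CLASSES = 10     # number of labels

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(inputs)          # token embeddings
x = layers.Conv1D(128, kernel_size=3, activation="relu")(x)  # TextCNN-style n-gram filters
x = layers.GlobalMaxPooling1D()(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```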
