
fasttext

Here are 253 public repositories matching this topic...

gensim
prabhakar267 commented Feb 16, 2018

If I have a word, how do I get the top k words closest to it? As far as I understand, there is a way to do this from the C++ code, but I can't find anything in the Python library.
Something similar to what the gensim word2vec implementation has:

model.most_similar(positive=[your_word_vector], topn=1)
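Under the hood, `most_similar` ranks the vocabulary by cosine similarity to the query vector. A minimal sketch of that operation in pure Python, over a toy dictionary of hypothetical 2-d embeddings (the words, vectors, and function name here are illustrative, not part of any library):

```python
import math

def most_similar(vectors, word, topn=5):
    """Return the topn words closest to `word` by cosine similarity."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    query = vectors[word]
    # Score every other word against the query vector.
    scores = [(w, cos(query, v)) for w, v in vectors.items() if w != word]
    # Highest similarity first; keep the top k.
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:topn]

# Toy embeddings (hypothetical values, for illustration only).
vecs = {
    "king":  [0.90, 0.80],
    "queen": [0.85, 0.82],
    "apple": [0.10, 0.90],
}
print(most_similar(vecs, "king", topn=1))
```

With real models the same idea is applied over the full embedding matrix, usually vectorized with NumPy rather than looped as above.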
zachgk commented Apr 8, 2020

The documentation in DJL was originally written with the expectation that users are reasonably familiar with deep learning, so it does not go out of its way to define and explain some of the key concepts. To help users who are newer to deep learning, we created a [documentation convention](https://github.com/awslabs/djl/blob/master/docs/development/development_guideline.md#documentation-conventio

Chinese long-text classification, short-sentence classification, multi-label classification, and sentence-pair similarity (Chinese text classification with Keras NLP: multi-label or sentence classification, long or short). Base classes for building word/sentence embedding layers and network graphs; includes FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, Bert, Xlnet, Albert, Attention, DeepMoji, HAN, CapsuleNet, Transformer-encoder, Seq2seq, SWEM, LEAM, TextGCN

  • Updated Apr 3, 2020
  • Python
giacbrd commented Oct 17, 2016

The fastText supervised model does not take the document and word representations into account; it just learns a bag of words and labels.
Embeddings are computed only from the word->label relation. It would be interesting to jointly learn the semantic relation label<->document<->word<->context.
For now it is only possible to pre-train word embeddings and then use them as initial vectors for the classifier.
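The workaround described above can be sketched with the fastText command-line tool, which accepts pre-trained vectors via `-pretrainedVectors`. A minimal sketch, assuming the `fasttext` binary is built and that `corpus.txt` and `train.txt` (labeled with `__label__` prefixes) exist:

```shell
# 1) Pre-train unsupervised word vectors on a raw text corpus.
./fasttext skipgram -input corpus.txt -output wordvecs

# 2) Initialize the supervised classifier with those vectors.
./fasttext supervised -input train.txt -output model \
    -pretrainedVectors wordvecs.vec
```

Note that step 2 only uses the vectors as a starting point; during supervised training they are still updated purely from the word->label signal, which is exactly the limitation the comment describes.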
