Beyond word2vec: GloVe, fastText, StarSpace



Word embeddings are a convenient and efficient way to extract semantic information from large collections of textual or text-like data. We will present an exploration and comparison of the performance of "traditional" embedding techniques such as word2vec and GloVe alongside fastText and StarSpace on NLP problems such as metaphor and sarcasm detection.
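Regardless of which technique produces them, the embeddings above are consumed the same way: each word becomes a dense vector, and semantic relatedness is measured by cosine similarity. A minimal sketch of that idea, using hand-crafted toy vectors as stand-ins for real pre-trained embeddings (the vectors and words here are illustrative, not from any actual model):

```python
import numpy as np

# Toy, hand-crafted vectors standing in for real pre-trained embeddings.
# word2vec, GloVe, fastText, and StarSpace all yield dense vectors like these.
embeddings = {
    "king":   np.array([0.90, 0.80, 0.10]),
    "queen":  np.array([0.85, 0.82, 0.15]),
    "banana": np.array([0.10, 0.05, 0.95]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["banana"]))
```

With real models the vectors are loaded from a trained file (e.g. via gensim's `KeyedVectors`), but the similarity computation is identical.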

Editor's Note:

I am looking for editors/curators to help with branches of the tree. Please send me an email if you are interested.