What exactly is Word2Vec in the context of CBoW and Skip-gram?

by Seankala   Last Updated July 12, 2019 08:19 AM

I was reading the original Word2Vec paper (Distributed Representations of Words and Phrases and their Compositionality, Mikolov et al., 2013) and got confused about the relationship between Word2Vec, CBoW, and Skip-gram.

My understanding is that CBoW and Skip-gram are two models that carry out specific prediction tasks (e.g., CBoW predicts a target word from its surrounding context words), and that Word2Vec is a particular word embedding technique that can be used to train CBoW and Skip-gram models. Is this understanding correct?
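For concreteness, here is a minimal sketch (my own illustration, not from the paper) of how the two models frame the prediction task on a toy sentence. The variable names and the window size are assumptions for the example:

```python
# Hypothetical sketch: how CBoW and Skip-gram frame the prediction task
# for the sentence "the quick brown fox jumps" with a context window of 2.

sentence = ["the", "quick", "brown", "fox", "jumps"]
window = 2

cbow_pairs = []      # (context words, target word)
skipgram_pairs = []  # (input word, one context word)

for i, target in enumerate(sentence):
    context = [sentence[j]
               for j in range(max(0, i - window), min(len(sentence), i + window + 1))
               if j != i]
    cbow_pairs.append((context, target))                   # CBoW: context -> target
    skipgram_pairs.extend((target, c) for c in context)    # Skip-gram: target -> each context word

print(cbow_pairs[2])        # (['the', 'quick', 'fox', 'jumps'], 'brown')
print(skipgram_pairs[:3])   # [('the', 'quick'), ('the', 'brown'), ('quick', 'the')]
```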

If this is correct, then I don't understand what is meant when people say that "either CBoW or Skip-gram may be used to obtain Word2Vec," as many blog posts and articles I've been reading claim.

One article I read that seemed relatively more intuitive said that the weights we learn when training these models make up the word vectors. Does this mean that when we train models such as CBoW or Skip-gram, the learned weights are the vectors that form the Word2Vec embedding?
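To make that idea concrete, below is a rough Skip-gram-style sketch under my own assumptions (toy vocabulary, random initialization, no actual training loop): the trainable input weight matrix is exactly the table whose rows would become the word vectors after training.

```python
import numpy as np

# Hypothetical sketch: the trainable weights of a Skip-gram-style model.
# After training, each row of W_in would be the learned vector for the
# corresponding vocabulary word, i.e. the "Word2Vec embedding".

vocab = {"the": 0, "quick": 1, "brown": 2, "fox": 3, "jumps": 4}
V, D = len(vocab), 8                          # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))     # input->hidden weights (the word vectors)
W_out = rng.normal(scale=0.1, size=(D, V))    # hidden->output weights

def forward(word_id):
    """Skip-gram forward pass: predict a distribution over context words."""
    h = W_in[word_id]                         # the "projection" is just a row lookup
    scores = h @ W_out
    probs = np.exp(scores - scores.max())
    return probs / probs.sum()

# Training with SGD would update W_in and W_out; afterwards,
# W_in[vocab["fox"]] would be the Word2Vec vector for "fox".
print(forward(vocab["fox"]).shape)            # (V,) distribution over the vocabulary
```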

I hope I'm making sense... If not, please let me know and I'll do my best to rephrase and clarify my question. Thank you.


