What is the difference between model.wv.vectors and model.syn1neg in Gensim's Word2Vec?
I have just trained a Word2Vec model as follows:
from gensim.test.utils import common_texts
from gensim.models import Word2Vec
model = Word2Vec(sentences=common_texts, vector_size=5, window=5, min_count=1, workers=4)
Now, I want to find the actual embedding: the (vocabulary_size x vector_size) array. Looking at dir(model), I see both model.syn1neg and model.wv.vectors, NumPy arrays with the dimensionality I am looking for. What information is stored in each of these arrays?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow