I need to multiply the term weights of a TF-IDF matrix by the word embeddings of a word2vec matrix, but I can't, because each matrix has a different number of terms. I am using the same corpus to build both matrices, so I don't understand why they end up with different numbers of terms.

My problem is that I have a TF-IDF matrix with shape (56096, 1550) (corresponding to: number of terms, number of documents) and a Word2vec embedding matrix with shape (300, 56184) (corresponding to: number of word-embeddings, number of terms). I need the same number of terms in both matrices: if I have 56096 terms in the TF-IDF matrix, I need 56096 terms in the embedding matrix, so that the multiplication makes sense, because my goal is to get the embeddings of the documents.

I use this code to get the Word2vec word-embedding matrix:

    w2v_model = word2vec.Word2Vec(tokenized_corpus, size=feature_size,
                                  window=window_context,
                                  min_count=min_word_count,
                                  sample=sample, iter=100)
    words = list(w2v_model.wv.vocab)
    vectors = []
    for w in words:
        vectors.append(w2v_model[w].tolist())

And this code to get the TF-IDF matrix:

    tv = TfidfVectorizer(max_df=1., norm='l2',
                         use_idf=True, smooth_idf=True)

    def matriz_tf_idf(datos, tv):
        tv_matrix = tv.fit_transform(datos)
        tv_matrix = tv_matrix.toarray()
        tv_matrix = tv_matrix.T
        return tv_matrix

How can I get the same number of terms in both matrices? I can't just delete terms, because I would lose data, and I need the multiplication to make sense. Thank you very much in advance.
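For context, the mismatch typically comes from the two tools preprocessing differently: `Word2Vec`'s `min_count` drops rare words, while `TfidfVectorizer` applies its own tokenizer and lowercasing. Below is a minimal NumPy-only sketch of the intended computation, restricted to the terms the two vocabularies share; all names and toy shapes here are illustrative assumptions, not the actual data:

```python
import numpy as np

# Hypothetical toy vocabularies (overlap of 3 terms) and tiny shapes:
# embedding dimension 3 instead of 300, 2 documents instead of 1550.
w2v_vocab = ["cat", "dog", "fish", "bird", "mouse"]
W = np.random.rand(3, len(w2v_vocab))      # (embedding_dim, n_w2v_terms)

tfidf_vocab = ["dog", "cat", "tree", "fish"]
T = np.random.rand(len(tfidf_vocab), 2)    # (n_tfidf_terms, n_documents)

# Keep only the shared terms, in one fixed order for both matrices.
shared = [w for w in tfidf_vocab if w in set(w2v_vocab)]
w2v_idx = [w2v_vocab.index(w) for w in shared]
tfidf_idx = [tfidf_vocab.index(w) for w in shared]

W_aligned = W[:, w2v_idx]       # (embedding_dim, n_shared)
T_aligned = T[tfidf_idx, :]     # (n_shared, n_documents)

# Document embeddings as TF-IDF-weighted sums of word vectors.
doc_embeddings = W_aligned @ T_aligned  # (embedding_dim, n_documents)
print(doc_embeddings.shape)             # (3, 2)
```

Another option along the same lines is to pass the word2vec vocabulary to `TfidfVectorizer(vocabulary=...)` so both matrices are built over the same term list from the start.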