Large Language Model
Word Embedding
An embedding is a mapping from an object (here, a word) to a vector.
Each word is represented as a vector.
Similarity between words can then be measured in this vector space, called the embedding space (for example, with cosine similarity).
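As a minimal sketch of measuring similarity in an embedding space (the word vectors below are made-up toy values, not trained embeddings):

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for three words (toy values for illustration).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.0]),
    "queen": np.array([0.7, 0.7, 0.2, 0.0]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1 = same direction, 0 = orthogonal.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

With well-trained embeddings, related word pairs score noticeably higher than unrelated ones.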
Word2Vec
Given a center word (as a one-hot encoded input), Word2Vec's skip-gram model predicts the context words (the words before and after the center word, within a fixed window).
Because the input is a one-hot vector, multiplying it by the input weight matrix simply selects one row of that matrix as the hidden layer; this selected row is the word vector (embedding) for the input word.
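The row-selection effect can be verified directly; the vocabulary size, embedding dimension, and random weights below are arbitrary:

```python
import numpy as np

vocab_size, embed_dim = 5, 3  # toy sizes for illustration
rng = np.random.default_rng(0)
W = rng.normal(size=(vocab_size, embed_dim))  # input-to-hidden weight matrix

word_index = 2
one_hot = np.zeros(vocab_size)
one_hot[word_index] = 1.0

# Multiplying a one-hot vector by W copies out row `word_index` of W.
hidden = one_hot @ W
assert np.allclose(hidden, W[word_index])
print(hidden)
```

So after training, the rows of W are exactly the learned word vectors; no separate lookup structure is needed.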