[Interactive demo panels]
- Word Embedding Space (2D Projection): points colored by category (Royalty, Countries, Cities, Other)
- Training Loss Over Epochs
- Analogy Vectors: A - B + C = ?
Controls: 100 epochs, learning rate η = 0.1, dimension: 2
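The demo's trainer itself is not shown, but its controls (100 epochs, η = 0.1, dimension 2) suggest a small skip-gram model fit by SGD. Here is a minimal sketch under that assumption; the corpus, window size, and softmax objective are illustrative choices, not the demo's actual implementation.

```python
import numpy as np

# Hypothetical tiny corpus and context window; the demo supplies its own.
corpus = "the king rules the land the queen rules the land".split()
window = 1

vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 2  # vocabulary size, embedding dimension (demo: 2)

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

# (center, context) training pairs within the window.
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if j != i]

eta, losses = 0.1, []            # learning rate matches the demo control
for epoch in range(100):         # epoch count matches the demo control
    total = 0.0
    for center, context in pairs:
        v = W_in[center].copy()
        scores = W_out @ v       # logits over the vocabulary
        p = np.exp(scores - scores.max())
        p /= p.sum()             # softmax
        total -= np.log(p[context])
        g = p.copy()             # gradient of cross-entropy w.r.t. logits
        g[context] -= 1.0
        W_in[center] -= eta * (W_out.T @ g)
        W_out -= eta * np.outer(g, v)
    losses.append(total / len(pairs))

print(f"final mean loss: {losses[-1]:.3f}")
```

The `losses` list is what a "Training Loss Over Epochs" panel would plot; it should decrease steadily on a corpus this small.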

[Demo widgets]
- Corpus: training text input
- Word Analogy: A - B + C = ? (disabled until training: "Train embeddings first")
- Selected Word Neighbors
- Training Stats: Current Epoch: 0, Loss: -, Vocabulary: -
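The Word Analogy widget evaluates A - B + C in embedding space and reports the nearest remaining word; the Selected Word Neighbors panel is the same cosine-similarity ranking without the arithmetic. A sketch, using hypothetical hand-picked 2-D vectors in place of trained ones:

```python
import numpy as np

# Toy 2-D embeddings (hypothetical values chosen so the classic analogy
# works; the demo trains real ones from its corpus).
emb = {
    "king":  np.array([0.9, 0.8]),
    "queen": np.array([0.1, 0.8]),
    "man":   np.array([0.9, 0.1]),
    "woman": np.array([0.1, 0.1]),
    "paris": np.array([-0.8, 0.5]),
}

def cosine(u, v):
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

def analogy(a, b, c, emb):
    """Solve A - B + C = ? and return the nearest word, excluding the inputs."""
    target = emb[a] - emb[b] + emb[c]
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("king", "man", "woman", emb))  # → queen
```

Excluding A, B, and C from the candidate set matters: without it, the nearest word to `king - man + woman` is often `king` itself.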