Embeddings Explorer

Explore how words become vectors in high-dimensional space

How It Works

Word embeddings map words to points in high-dimensional space (50 dimensions here). Words with similar meanings cluster together.
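To make the idea concrete, here is a minimal sketch of an embedding table: a matrix with one 50-dimensional row per vocabulary word. The vocabulary, the random vectors, and the `vector` helper are illustrative assumptions — real embeddings are learned from data, not sampled at random.

```python
import numpy as np

# Hypothetical toy vocabulary; real embeddings are trained, not random.
rng = np.random.default_rng(0)
vocab = ["cat", "dog", "car", "truck"]
dim = 50  # matches the 50 dimensions used in this explorer

# Embedding matrix: one row per word, one column per dimension.
embeddings = rng.standard_normal((len(vocab), dim))
word_to_index = {w: i for i, w in enumerate(vocab)}

def vector(word: str) -> np.ndarray:
    """Look up a word's 50-dimensional vector."""
    return embeddings[word_to_index[word]]

print(vector("cat").shape)  # (50,)
```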

This plot shows a 2D projection of those 50 dimensions using t-SNE, which preserves local neighborhood structure.
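A projection like this can be sketched with scikit-learn's t-SNE implementation. The random 50-dimensional input below stands in for real embeddings, and the small `perplexity` value is an assumption sized to the tiny sample:

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
vectors_50d = rng.standard_normal((30, 50))  # placeholder for real embeddings

# perplexity must be smaller than the number of samples;
# it roughly controls how many neighbors each point attends to.
tsne = TSNE(n_components=2, perplexity=5, random_state=0)
points_2d = tsne.fit_transform(vectors_50d)
print(points_2d.shape)  # (30, 2)
```

Because t-SNE only preserves local neighborhoods, distances between far-apart clusters in the 2D plot are not meaningful.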

Click any word to see its nearest neighbors by cosine similarity — a standard measure of vector closeness that compares direction rather than magnitude.
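The nearest-neighbor lookup can be sketched as follows. The 2-D toy vectors are an illustrative assumption (the explorer's vectors are 50-D), and `nearest_neighbors` is a hypothetical helper, not the page's actual code:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine of the angle between two vectors: 1.0 means same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest_neighbors(query: str, embeddings: dict, k: int = 3):
    # Rank every other word by its cosine similarity to the query word.
    scores = [(w, cosine_similarity(embeddings[query], v))
              for w, v in embeddings.items() if w != query]
    return sorted(scores, key=lambda p: p[1], reverse=True)[:k]

# Toy 2-D vectors with directions chosen by hand:
# "cat" and "dog" point the same way, "car" points elsewhere.
emb = {
    "cat": np.array([1.0, 0.1]),
    "dog": np.array([0.9, 0.2]),
    "car": np.array([0.1, 1.0]),
}
print(nearest_neighbors("cat", emb, k=2))  # "dog" ranks above "car"
```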