Embeddings are underrated
Jun 13, 2025

Embeddings are underrated is a blog post on how underrated embeddings are for technical writers.
I’m still not very familiar with the world of embeddings, so it was nice to see the concepts laid out. Essentially, an embedding semantically represents a piece of text as a multidimensional vector of floats, making it easy to compare similarity across texts.
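To make "comparing similarity" concrete, here is a minimal sketch using cosine similarity, the usual way of comparing embedding vectors. The four-dimensional vectors are made up for illustration; real embedding models produce hundreds or thousands of dimensions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: close to 1.0 means
    similar direction (similar meaning), close to 0.0 means unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Tiny made-up "embeddings" for three words (hypothetical values).
cat = np.array([0.80, 0.10, 0.30, 0.05])
kitten = np.array([0.75, 0.15, 0.25, 0.10])
car = np.array([0.05, 0.90, 0.02, 0.60])

print(cosine_similarity(cat, kitten))  # ~0.99: semantically close
print(cosine_similarity(cat, car))     # ~0.18: semantically distant
```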
Word embeddings were popularized by the foundational Word2Vec paper, and they are also how Large Language Models represent words and capture semantic relationships, although in a more complex and advanced way.
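One way to see the semantic relationships such vectors capture is the classic analogy arithmetic: king − man + woman ≈ queen. Here is a hedged sketch using gensim's pretrained GloVe vectors; the specific model name and the download step are my assumptions, not from the original post.

```python
# Assumes `pip install gensim` and network access for the first download.
import gensim.downloader as api

# Load a small set of pretrained word vectors (assumed model name).
model = api.load("glove-wiki-gigaword-50")

# Vector arithmetic over embeddings: king - man + woman ≈ queen.
print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```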
The Illustrated Word2vec gives a visual walkthrough of Word2Vec’s inner workings.