The gte-base embedding model maps English sentences and paragraphs into a 768-dimensional dense vector space, producing efficient semantic embeddings well suited to textual similarity, semantic search, and clustering.
Effective Pricing for GTE-Base
Actual cost per million tokens across providers, measured over the past hour