Microsoft open-sources three versions of the Harrier text embedding model, with the 27B version topping the multilingual MTEB v2 leaderboard.

BlockBeatNews

According to 1M AI News monitoring, Microsoft has open-sourced a multilingual text embedding model family, harrier-oss-v1, on Hugging Face in three sizes: 270M, 0.6B, and 27B parameters. The model card indicates that the series uses a decoder-only architecture with last-token pooling and L2 normalization, supports contexts of up to 32,768 tokens, and targets retrieval, clustering, semantic similarity, classification, bitext mining, and re-ranking.
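The two post-processing steps named in the model card, last-token pooling and L2 normalization, can be sketched in a few lines of NumPy. This is an illustrative sketch only, not code from the harrier-oss-v1 repository; the array shapes and function names are assumptions. Last-token pooling takes, for each sequence, the decoder's hidden state at the last non-padding position, and L2 normalization scales each resulting vector to unit length so that dot products become cosine similarities.

```python
import numpy as np

def last_token_pool(hidden_states, attention_mask):
    """Pick each sequence's hidden state at its last non-padding token.

    hidden_states: (batch, seq_len, dim) float array
    attention_mask: (batch, seq_len) array of 1s (real tokens) and 0s (padding)
    """
    last_idx = attention_mask.sum(axis=1) - 1  # index of the last real token
    return hidden_states[np.arange(hidden_states.shape[0]), last_idx]

def l2_normalize(x, eps=1e-12):
    """Scale each row to unit L2 norm (eps guards against division by zero)."""
    return x / np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), eps)

# Toy batch: 2 sequences padded to length 4, embedding dimension 3.
h = np.arange(24, dtype=float).reshape(2, 4, 3)
mask = np.array([[1, 1, 1, 0],    # 3 real tokens -> pool position 2
                 [1, 1, 1, 1]])   # 4 real tokens -> pool position 3
emb = l2_normalize(last_token_pool(h, mask))  # (2, 3) unit-norm embeddings
```

After normalization, a plain dot product between two embeddings equals their cosine similarity, which is why L2 normalization is a common final step for retrieval and semantic-similarity use cases.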

Multilingual MTEB v2 is an industry-standard benchmark for multilingual text embeddings, covering tasks such as retrieval, classification, clustering, and semantic similarity. According to Microsoft's model card, the three tiers score 66.5, 69.0, and 74.3 on the benchmark, respectively, and the 27B version topped the leaderboard on its release day. The 270M and 0.6B versions were additionally trained via knowledge distillation from larger embedding models. All three models are released under the MIT license.
