Mixedbread AI
System Card
Organization: Mixedbread AI
Released: 2024-01
Architecture: external-memory-network / embedding and reranking model API
Details: Mixedbread provides mxbai-embed-large (a Matryoshka embedding model supporting dimensionality reduction) and mxbai-rerank (GRPO-trained rerankers supporting 100+ languages) via API and open-source Hugging Face weights. The 2D-Matryoshka technique allows simultaneous reduction of both layers and embedding dimensions. The Mixedbread Platform alpha integrates embeddings, reranking, document parsing, and vector stores.
Parameters: —
Domain: rag-retrieval
Open Source: Partial
Tags: Matryoshka, reranking, open-weights, multilingual, GRPO
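The Matryoshka property noted above means the leading dimensions of an embedding carry the coarsest information, so a stored vector can be truncated to a smaller dimension and renormalized at query time. A minimal pure-Python sketch of that idea (the 8-dimensional vector is a made-up illustration; real mxbai-embed-large vectors are far larger, and this is not the vendor's API):

```python
import math

def truncate_embedding(vec, dim):
    """Keep the first `dim` components of a Matryoshka-style embedding
    and L2-renormalize, so cosine similarity remains meaningful."""
    head = vec[:dim]
    norm = math.sqrt(sum(x * x for x in head)) or 1.0
    return [x / norm for x in head]

def cosine(a, b):
    """Dot product of two unit-normalized vectors = cosine similarity."""
    return sum(x * y for x, y in zip(a, b))

# Hypothetical 8-d embedding for illustration only.
full = [0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0.02, 0.01]
small = truncate_embedding(full, 4)
print(len(small), cosine(small, small))
```

The truncated vector stays unit-length, so the same similarity search code works at any chosen dimension; smaller dimensions trade a little retrieval quality for index size and speed.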
Capability Profile
Benchmark Scores
Scores reported for 5 of 14 benchmarks.

Multi-Turn Recall: 0/2
- LoCoMo: no data
- MemoryBank: no data
Cross-Session Memory: 0/1
- LongMemEval: no data
Multi-Hop QA: 2/3
Agent Task Memory: 0/1
- AgentBench-Mem: no data
Personalization: 0/1
- PerLTQA: no data
Factuality / Grounding: 1/1
Sources:
- Mixedbread AI vendor documentation; evaluated on HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering (Stanford / CMU, 1809)
- Mixedbread AI vendor documentation; evaluated on LongBench: A Bilingual, Multitask Benchmark for Long Context Understanding (Tsinghua KEG, 2308)
- Mixedbread AI vendor documentation; evaluated on MultiHop-RAG: Benchmarking Retrieval-Augmented Generation for Multi-Hop Queries (HKUST, 2401)
- Mixedbread AI vendor documentation; evaluated on RAGAS: Automated Evaluation of Retrieval-Augmented Generation (Exploding Gradients, 2309)
- Mixedbread AI vendor documentation; evaluated on RULER: What's the Real Context Size of Your Long-Context Language Models (NVIDIA, 2404)