MLflow MCP
MLflow MCP server for ML experiment tracking. Query experiments, runs, registered models, and model versions with advanced filtering. 10 GitHub stars and 30 commits on main in the last 30 days. Tier 2 by stars.
INSTALL THIS SERVER
{
  "mcpServers": {
    "mlflow": {
      "command": "python",
      "args": ["-m", "mlflow_mcp"],
      "env": {
        "MLFLOW_TRACKING_URI": "https://your-mlflow.example.com",
        "MLFLOW_TRACKING_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
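Before wiring the server into an agent, it can be worth confirming that the tracking URI and token actually work. A minimal stdlib-only sketch, assuming an MLflow 2.x server and its documented REST endpoint `POST /api/2.0/mlflow/experiments/search` (the URI and token below are placeholders):

```python
import json
import urllib.request


def build_search_request(base_uri: str, token: str, max_results: int = 5):
    """Build a request against MLflow's experiments/search REST endpoint."""
    url = f"{base_uri.rstrip('/')}/api/2.0/mlflow/experiments/search"
    body = json.dumps({"max_results": max_results}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Against a reachable server, this should list experiment names:
# req = build_search_request("https://your-mlflow.example.com", "YOUR_TOKEN")
# with urllib.request.urlopen(req) as resp:
#     for exp in json.load(resp).get("experiments", []):
#         print(exp["experiment_id"], exp["name"])
```

If this call returns 401/403, the MCP server will fail the same way, so it is a cheap way to separate credential problems from client-configuration problems.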
6 TOOLS AVAILABLE
OUR ASSESSMENT
- 30 commits on main in the last 30 days. Top-tier cadence.
- MIT license.
- MLflow itself is a mature platform.
- Direct access to experiments, runs, and registered models.
- Active community development.
- 10 GitHub stars indicates very early adoption.
- Tier 2 designation reflects MCP maturity, not platform maturity.
- Tool surface is read-heavy; model registration requires the MLflow CLI/SDK directly.
MLFLOW_TRACKING_TOKEN carries the connecting user's permissions. For self-hosted MLflow, use a dedicated service-account user scoped to the agent workflow. The read-only tool surface keeps the blast radius small.
ML teams running self-hosted MLflow or MLflow on Databricks; agents that triage experiment failures or compare run metrics; MLOps engineers adopting agent-driven model-registry operations.
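Agents that compare runs will lean on MLflow's documented search-filter grammar (`metrics.*`, `params.*`, `tags.*` with SQL-like comparisons). A hedged sketch of composing such a filter string; the helper name is ours, not a tool exposed by this server:

```python
def runs_filter(metric: str, threshold: float, param: str, value: str) -> str:
    """Compose an MLflow search_runs filter string, e.g. for triaging
    underperforming runs. Syntax follows MLflow's search API docs."""
    return f"metrics.{metric} < {threshold} and params.{param} = '{value}'"


# Find xgboost runs whose RMSE beat a threshold:
# runs_filter("rmse", 0.9, "model", "xgb")
# -> "metrics.rmse < 0.9 and params.model = 'xgb'"
```

The same string works against `mlflow.search_runs(filter_string=...)` in the Python SDK, so an agent's query and a human's notebook query can stay symmetrical.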
TECHNICAL DETAILS
ADOPTION METRICS
10 stars on kkruglik/mlflow-mcp; 30 commits on main in the last 30 days, a top-tier cadence.
Pairs with W&B, Arize Phoenix, Pinecone, Qdrant, and Cognee in the AI/ML category. MLflow is the only one covering the open-source experiment-tracking standard.
SOURCES & VERIFICATION
We don't take any single directory's word for it. Before scoring, we cross-reference 3 public MCP sources, install the server ourselves against the clients we cover, and record when we last re-verified.
The same server, 3 different lenses. We reconcile these signals into our editorial score, which is why our number sometimes diverges from a directory-aggregate star count.
| Source | Their rating | Their star count | Their downloads | Last synced |
|---|---|---|---|---|
| AutomationSwitch (this page) | 3.9 (editorial) | 10 | — | May 6, 2026 |
| PulseMCP | — (unrated) | unavailable | unavailable | May 6, 2026 |
| MCP.so | — (unrated) | unavailable | unavailable | May 6, 2026 |
| Glama | — (unrated) | unavailable | unavailable | May 6, 2026 |
Counts are directory-reported; we don't adjust them. Discrepancies usually come from different snapshot times or star caching.
OTHER AI / ML MCP SERVERS
Cognee MCP
Knowledge graph plus vector memory engine for AI agents, exposed as an MCP server with V2 session-aware memory tools (remember, recall, forget, improve) and classic V1 ingestion pipelines (cognify, codify). Three transports: stdio, SSE, Streamable HTTP. 16,965 GitHub stars, Apache-2.0.
Codebase Memory MCP
High-performance code intelligence MCP server for AI coding agents. Indexes a codebase into a queryable knowledge graph in milliseconds, with 14 MCP tools spanning structural search, call-chain tracing, impact analysis, dead-code detection, and Cypher queries. Single static C binary, 66 languages via tree-sitter, zero runtime dependencies.
Arize Phoenix MCP
LLM observability platform exposing prompts, projects, traces, spans, sessions, datasets, and experiments through MCP. Published to npm as @arizeai/phoenix-mcp, current 4.0.8 (2026-04-29). 9,496 stars on parent monorepo, Elastic License 2.0.
Qdrant MCP Server
Official Qdrant vector database MCP server. Acts as a semantic memory layer on top of Qdrant: store information with metadata, retrieve via similarity search. Two tools, very small surface area, exceptionally maintained by the Qdrant team. Configurable embedding provider (fastembed default), supports remote and local Qdrant clusters.
Amazon Bedrock AgentCore MCP
Official AWS Labs MCP server for Amazon Bedrock AgentCore: agent runtime, memory, gateway, identity, and observability. Tools fetch curated AgentCore documentation and surface deployment guides for runtime, memory, and gateway resources. Apache-2.0 within awslabs/mcp monorepo (8,924 parent stars).
Vapi MCP
Vapi.ai voice AI server SDK with MCP server module. Manage voice assistants, trigger outbound calls, and inspect call transcripts. 118 GitHub stars and 14 commits on main in the last 30 days.
DISCUSS YOUR MCP REQUIREMENTS
Whether you're evaluating a server, scoping an internal deployment, or working out whether MCP is the right fit at all, start the conversation and we'll point you at the right piece of the ecosystem.