# MCP server (Beta)
The Tinybird remote MCP server enables AI agents to connect directly to your workspace to use endpoints as tools or execute queries. The Model Context Protocol gives AI assistants access to your analytics APIs, data sources, and endpoints through a standardized interface.
Our server only supports Streamable HTTP as the transport protocol. If your MCP client doesn't support it, you'll need to use the `mcp-remote` package as a bridge.
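Under the hood, the Streamable HTTP transport carries standard MCP JSON-RPC 2.0 messages over HTTP POST. As a rough sketch (the `protocolVersion` and `clientInfo` values here are illustrative examples from the MCP specification, not Tinybird-specific), the first message a client sends is an `initialize` request like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

The server replies with its capabilities, after which the client can list and call tools.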
## Available tools
Depending on the token scopes, the following tools will be exposed:

- `<endpoint_name>`: Every endpoint and its parameters are exposed as a tool.
- `execute_query`: Execute a SQL query against your workspace.
- `list_datasources`: List all data sources.
- `list_endpoints`: List all endpoints.
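Each of these tools is invoked through the standard MCP `tools/call` method. As a sketch (the `query` argument name is an assumption, not confirmed by Tinybird's docs; check each tool's advertised input schema via `tools/list`):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "execute_query",
    "arguments": { "query": "SELECT count() FROM events" }
  }
}
```

MCP clients and agent frameworks send these requests for you; you only deal with them directly if you implement your own client.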
## Usage examples
### Client settings (Cursor, Windsurf, Claude Desktop)
```json
{
  "mcpServers": {
    "tinybird": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://cloud.tinybird.co/mcp?token=TB_TOKEN"
      ]
    }
  }
}
```
### Pydantic AI
```python
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

server = MCPServerStdio(
    command="npx",
    args=[
        "-y",
        "mcp-remote",
        "https://cloud.tinybird.co/mcp?token=TB_TOKEN",
    ],
)

agent = Agent(name="tb_agent", model=MODEL, mcp_servers=[server])

async def main():
    async with agent.run_mcp_servers():
        result = await agent.run(
            "How many paying users do we have today compared to 30 days ago?"
        )
        print(result.output)
```
### OpenAI agents SDK
```python
from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp

async def main():
    server = MCPServerStreamableHttp(
        name="tinybird",
        params={
            "url": "https://cloud.tinybird.co/mcp?token=TB_TOKEN",
        },
    )
    async with server:
        agent = Agent(name="tb_agent", model=MODEL, mcp_servers=[server])
        result = await Runner.run(
            agent,
            input="How many paying users do we have today compared to 30 days ago?",
        )
        print(result.final_output)
```
### Clients supporting Streamable HTTP
```json
{
  "mcpServers": {
    "tinybird": {
      "url": "https://cloud.tinybird.co/mcp?token=TB_TOKEN"
    }
  }
}
```