# Build MCP Servers: Using FastMCP v2
In my previous post we looked at the MCP Python SDK for building MCP servers. Let's now look at FastMCP v2. FastMCP is what contributed the FastMCP implementation found in the official MCP Python SDK, but it has since evolved further. Some of the code should therefore look familiar.

You might find it helpful to be familiar with the MCP Inspector and the general idea behind what MCP is and why, as covered in my MCP Introduction.
## Install

```shell
pip install fastmcp
```
## Simple MCP Server

The following FastMCP v2 example looks very similar to the MCP Python SDK:

```python
# server.py
from fastmcp import FastMCP

mcp = FastMCP("Example MCP Server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

if __name__ == "__main__":
    mcp.run()
```
Here the only difference to the MCP Python SDK is the import: `from fastmcp import FastMCP` instead of `from mcp.server.fastmcp import FastMCP`.
## Running the server via mcp.run()

Most of this section is exactly the same as with the MCP Python SDK. One notable difference is that you can also pass the `host` and `port` to the `mcp.run` function. In fact, passing transport-related settings to `mcp.run` is preferred over passing them to `FastMCP`.
Run the server via:

```shell
python server.py
```
That will load `server.py` with `__name__` set to `__main__`, i.e. it will also execute `mcp.run()`, which by default starts the server in `stdio` mode. It is therefore equivalent to passing in `stdio` explicitly:

```python
mcp.run(transport="stdio")
```
Due to the nature of `stdio`, this server will need to be started by the MCP client / host.
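For example, an MCP host such as Claude Desktop starts a `stdio` server from a configuration entry along these lines (the `example` name and the path are illustrative and depend on your setup):

```json
{
  "mcpServers": {
    "example": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```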
You can pass in other modes via the `transport` parameter, e.g.:
A stateful server using the Streamable HTTP transport mode:

```python
mcp = FastMCP("Streamable HTTP: Stateful Server")
mcp.run(transport="streamable-http")
```

Connect the client to: `http://localhost:8000/mcp`
Or a stateless server, also using the Streamable HTTP transport mode:

```python
mcp = FastMCP(
    "Streamable HTTP: Stateless Server",
    stateless_http=True
)
mcp.run(transport="streamable-http")
```

Also available under: `http://localhost:8000/mcp`
You can also disable SSE support by setting `json_response` to `True`:

```python
mcp = FastMCP(
    "Streamable HTTP: Stateless Server (no SSE)",
    stateless_http=True,
    json_response=True
)
mcp.run(transport="streamable-http")
```
And finally, using the now-deprecated SSE transport mode:

```python
mcp = FastMCP("Deprecated Server-Sent Events (SSE)")
mcp.run(transport="sse")
```

The SSE endpoint is available under: `http://localhost:8000/sse`
You could also configure `host` and `port` by passing them to `mcp.run`, e.g.:

```python
mcp.run(transport="streamable-http", host='0.0.0.0', port=8080)
```
## Running the server via fastmcp run

The `fastmcp` command is also very similar to the `mcp` command.

Run the server via:

```shell
fastmcp run server.py
```
This will load `server.py` and look for a global of type `FastMCP`. It then runs the server, similar to how you'd use `mcp.run()`.

You can also select the global variable of the MCP server, e.g. `mcp`:

```shell
fastmcp run server.py:mcp
```
The transport can also be selected via the `--transport` parameter, e.g.:

```shell
fastmcp run --transport=streamable-http server.py:mcp
```
You can also pass in the `host` and `port` (this is something you couldn't do with the `mcp` command from the MCP Python SDK):

```shell
fastmcp run --transport=streamable-http \
    --host=0.0.0.0 \
    --port=8080 \
    server.py:mcp
```
## Running the server via fastmcp dev

Run the MCP Inspector via:

```shell
fastmcp dev server.py
```
The inspector will then be available under `http://127.0.0.1:6274`.

In the inspector, the following configuration will be pre-populated:
| Configuration Option | Value |
|---|---|
| Transport Type | STDIO |
| Command | `uv` |
| Arguments | `run --with fastmcp fastmcp run server.py` |

That means it uses uv to run `fastmcp run server.py` in an isolated virtual environment with fastmcp installed.
You could also amend the configuration in the inspector to:

| Configuration Option | Value |
|---|---|
| Transport Type | STDIO |
| Command | `fastmcp` |
| Arguments | `run server.py` |
Instead of using `fastmcp dev` to start the inspector, you can just start it directly via:

```shell
npx @modelcontextprotocol/inspector
```

I do think `fastmcp dev` should instead focus on auto-reloading, like you would find in FastAPI apps (the same is true for the MCP Python SDK).
## Tools schema generated by fastmcp

The tools schema generated by `fastmcp` is exactly the same as the one generated by `mcp`. The tools schema is important as it describes the available tools, to users as well as to the AI. In this case the schema includes the correct input type `integer` and other fields you would expect:
```json
{
  "tools": [
    {
      "name": "add",
      "description": "Add two numbers",
      "inputSchema": {
        "type": "object",
        "properties": {
          "a": {
            "title": "A",
            "type": "integer"
          },
          "b": {
            "title": "B",
            "type": "integer"
          }
        },
        "required": [
          "a",
          "b"
        ]
      }
    }
  ]
}
```
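To illustrate how a client or host might use this schema, here is a small stdlib-only sketch that checks call arguments against the `inputSchema` above. Real hosts would typically use a full JSON Schema validator; `check_args` is a made-up helper for illustration:

```python
# The inputSchema from the generated tools schema above.
input_schema = {
    "type": "object",
    "properties": {
        "a": {"title": "A", "type": "integer"},
        "b": {"title": "B", "type": "integer"},
    },
    "required": ["a", "b"],
}

# Hypothetical helper: a minimal check, not a full JSON Schema validator.
def check_args(args: dict) -> bool:
    # All required keys must be present.
    if not all(key in args for key in input_schema["required"]):
        return False
    type_map = {"integer": int, "string": str, "number": float}
    for key, value in args.items():
        spec = input_schema["properties"].get(key)
        if spec is None:
            return False  # unexpected argument
        if not isinstance(value, type_map[spec["type"]]):
            return False
    return True

print(check_args({"a": 1, "b": 2}))  # True
print(check_args({"a": 1}))          # False: "b" is required
```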
## Convert FastMCP to ASGI app

One advantage of converting the FastMCP server to an ASGI app is that you can then use tools like uvicorn to run it.

```python
# server.py
# ...
http_app = mcp.http_app()
```
Then start the server via the uvicorn CLI:

```shell
uvicorn server:http_app --host 0.0.0.0 --port 8000
```
## Integrate FastMCP into FastAPI app

Like with the MCP Python SDK, FastMCP v2 can also be integrated into an existing FastAPI app:

```python
# server.py
# ... (mcp defined as before)
from fastapi import FastAPI

mcp_app = mcp.http_app(path='/mcp')

app = FastAPI(lifespan=mcp_app.lifespan)
app.mount("/example", mcp_app)
```
Then start the server via the `fastapi` CLI (alternatively use `uvicorn` like before):

```shell
fastapi dev server.py
```

The endpoint in this case will be `/example/mcp`, i.e. `http://127.0.0.1:8000/example/mcp`.
You could also mount multiple MCP servers into your FastAPI app.
## FastMCP from FastAPI app
You can turn things around by defining your tools using FastAPI and then creating a FastMCP server from it.
```python
# server.py
from fastapi import FastAPI
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, MCPType

app = FastAPI(title="MCP Server API", version="0.0.1")

@app.get("/add_numbers", operation_id="add_numbers")
async def add_numbers(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

mcp: FastMCP = FastMCP.from_fastapi(
    app,
    route_maps=[
        RouteMap(mcp_type=MCPType.TOOL)
    ],
)
```
Start the MCP server:

```shell
fastmcp run server.py:mcp
```
By default, FastMCP converts all `GET` routes to a Resource or Resource Template (depending on whether there are path parameters). To ensure our endpoint ends up as a tool, we could use `POST` or any other HTTP method except `GET`. Alternatively, we can provide custom route maps, as we have done in this example.
This generates a somewhat verbose tool description:

```text
Add two numbers

Query Parameters:
  a (Required): No description.
  b (Required): No description.

Responses:
- 200 (Success): Successful Response
  Content-Type: application/json
  Example: 1
- 422: Validation Error
  Content-Type: application/json
  Response Properties:
  Example: { "detail": [ "unknown_type" ] }
```
The tool description will usually be sent to the AI for every available tool. You want it to describe enough for the AI to pick the tool when appropriate, with the correct parameters. Describing the responses in such detail, though, may just eat tokens.
## FastMCP from OpenAPI

You can turn almost any API with an OpenAPI spec into a FastMCP server. The following example MCPifies the APIs.guru API:
```python
# server.py
import httpx
import yaml
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, MCPType

client = httpx.AsyncClient(base_url="https://api.apis.guru/v2")

openapi_spec = yaml.safe_load(
    httpx.get("https://api.apis.guru/v2/openapi.yaml").text
)

mcp = FastMCP.from_openapi(
    openapi_spec=openapi_spec,
    client=client,
    name="OpenAPI MCP Server",
    route_maps=[
        RouteMap(mcp_type=MCPType.TOOL)
    ],
)
```
Start the MCP server:

```shell
fastmcp run server.py:mcp
```
As in the FastAPI example, we use custom route maps to make every endpoint a tool. Here too, the generated tool descriptions are slightly verbose, e.g. for `getProviders` (one of the shortest):
```text
List all the providers in the directory

Responses:
- 200 (Success): OK
  Content-Type: application/json
  Response Properties:
  Example: { "data": [ "string" ] }
```
## Code

You can find self-contained example code in my python-examples repo, under `python_examples/ai/mcp/fastmcp`.
## Conclusion
This is it for now. FastMCP v2 is rapidly expanding and going beyond simply defining an MCP server. I was only able to demonstrate some highlights that I feel are important.