
OpenAI MCP support

What does OpenAI mean when they say their API will support MCP? MCP hosts, clients, and servers all run on the local device. How would OpenAI's API talk to any of that?
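To make the "local device" point concrete: in the MCP architecture, a host application runs one or more clients, and each client talks to a server over a transport (typically stdio, i.e. a local subprocess). The messages themselves are JSON-RPC 2.0; the handshake starts with an `initialize` request, per the MCP spec. Here is a minimal sketch of that first message, assuming the 2024-11-05 spec revision; the client name and version are illustrative placeholders.

```python
import json

# The first message an MCP client sends to a server: a JSON-RPC 2.0
# "initialize" request, as defined in the MCP specification.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # spec revision current at the time
        "capabilities": {},               # features this client supports
        "clientInfo": {                   # illustrative placeholder values
            "name": "example-host",
            "version": "0.1.0",
        },
    },
}

# Over the stdio transport, the host serializes this to JSON and writes it
# to the server subprocess's stdin, then reads responses from its stdout.
wire_message = json.dumps(initialize_request)
print(wire_message)
```

The key point is that nothing in this exchange assumes a network: the "transport" is just pipes between processes on one machine, which is why OpenAI's hosted API supporting MCP raises the question above.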

Last week, OpenAI announced:

“People love MCP and we are excited to add support across our products,” Altman said. “[It’s] available today in the Agents SDK and support for [the] ChatGPT desktop app [and] Responses API [is] coming soon!”

One vision people have for the future of MCP: as well as providing traditional APIs, vendors provide an MCP interface to their products. Your chatbot can then easily talk to Gmail, because Google provides an LLM-native interface (and you don't have to run an MCP server locally - Google runs it for you). This is the idea Altman is alluding to.

Practically, that means the ChatGPT desktop app and the Responses API will gain support for connecting to MCP servers.

Currently, though, the spec assumes MCP servers run locally, so it's not clear how this would work with the API today. Anthropic's MCP team has a priority item on the roadmap to extend the protocol to support remote servers, so that will likely be a prerequisite.
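The "local only" point is visible in how hosts are configured today. For example, the MCP quickstart for Claude Desktop has the host launch each server as a local subprocess via a `command`/`args` entry; the filesystem path below is an illustrative placeholder.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/files"
      ]
    }
  }
}
```

There's no URL anywhere in this configuration - the host spawns the server on your machine and talks to it over stdio, which is exactly what a hosted API like OpenAI's cannot do without a remote-transport extension to the protocol.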

References

https://techcrunch.com/2025/03/26/openai-adopts-rival-anthropics-standard-for-connecting-ai-models-to-data/