
openai.types.responses.response_output_item.McpCall Model not Accepted by Responses API #2670

@michael-dommett-deel

Description


Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

When using the Responses API to do an MCP call, the Response.output field will contain an openai.types.responses.response_output_item.McpCall object. This type has a status field.

When adding this same McpCall object to the input field of a new Responses call, there was previously no issue. Now it fails input validation:

{'error': {'message': "Unknown parameter: 'input[2].status'.", 'type': 'invalid_request_error', 'param': 'input[2].status', 'code': 'unknown_parameter'}}

The status field of McpCall is not accepted. That's because it's not in the data model for openai.types.responses.response_output_item.McpCall.

So why is status being added to the openai.types.responses.response_output_item.McpCall output, and why was this previously OK to use but now is not? I'm talking about the difference between 2025/09/25 (working) and 2025/09/29 (not working).
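For what it's worth, the claim about the data model can be checked directly. A minimal sketch, assuming the SDK models are Pydantic v2 and expose model_fields (the import path is the one mentioned above):

from openai.types.responses.response_output_item import McpCall

# `status` is not among the fields declared on the SDK model,
# even though the API response attaches it as an extra attribute.
print("status" in McpCall.model_fields)  # expected: False
print(sorted(McpCall.model_fields))      # the declared fields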

Removing the status field from the McpCall and then sending the same request does work ✅ (see the workaround sketch after the code snippet below).

To Reproduce

  1. Do an MCP call
  2. Add the MCP call response output item to the input of a second call
  3. This second call will fail (see code snippet below to reproduce)

Code snippets

from openai import OpenAI

client = OpenAI()

input = [{"role": "user", "content": "How does tiktoken work?"}]

# First call: the MCP tool call appears in response_tiktoken.output as an McpCall item
response_tiktoken = client.responses.create(
    tools=[{
        "type": "mcp",
        "server_label": "gitmcp",
        "server_url": "https://gitmcp.io/openai/tiktoken",
        "allowed_tools": ["search_tiktoken_documentation", "fetch_tiktoken_documentation"],
        "require_approval": "never",
    }],
    input=input,
    model="gpt-4o-mini",
)

tiktoken_mcp_call_outputs = [i for i in response_tiktoken.output if i.type == "mcp_call"]
new_input = input + tiktoken_mcp_call_outputs + [{"role": "user", "content": "What did your previous message say?"}]

# this will fail due to `status` in the mcp call
response_tiktoken = client.responses.create(
    tools=[{
        "type": "mcp",
        "server_label": "gitmcp",
        "server_url": "https://gitmcp.io/openai/tiktoken",
        "allowed_tools": ["search_tiktoken_documentation", "fetch_tiktoken_documentation"],
        "require_approval": "never",
    }],
    input=new_input,
    model="gpt-4o-mini",
)

OS

macOS

Python version

3.12

Library version

openai v1.109.0
