
Comparing changes

base repository: openai/openai-python, base: v1.66.5
head repository: openai/openai-python, compare: v1.67.0

  • 2 commits
  • 19 files changed
  • 1 contributor

Commits on Mar 19, 2025

  1. 653dfec
  2. release: 1.67.0 (stainless-app[bot], e9f971a)
2 changes: 1 addition & 1 deletion .release-please-manifest.json

@@ -1,3 +1,3 @@
 {
-  ".": "1.66.5"
+  ".": "1.67.0"
 }
2 changes: 1 addition & 1 deletion .stats.yml

@@ -1,2 +1,2 @@
 configured_endpoints: 81
-openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-f3bce04386c4fcfd5037e0477fbaa39010003fd1558eb5185fe4a71dd6a05fdd.yml
+openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-b26121d5df6eb5d3032a45a267473798b15fcfec76dd44a3256cf1238be05fa4.yml
8 changes: 8 additions & 0 deletions CHANGELOG.md

@@ -1,5 +1,13 @@
 # Changelog
 
+## 1.67.0 (2025-03-19)
+
+Full Changelog: [v1.66.5...v1.67.0](https://github.com/openai/openai-python/compare/v1.66.5...v1.67.0)
+
+### Features
+
+* **api:** o1-pro now available through the API ([#2228](https://github.com/openai/openai-python/issues/2228)) ([40a19d8](https://github.com/openai/openai-python/commit/40a19d8592c1767d6318230fc93e37c360d1bcd1))
+
 ## 1.66.5 (2025-03-18)
 
 Full Changelog: [v1.66.4...v1.66.5](https://github.com/openai/openai-python/compare/v1.66.4...v1.66.5)
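The headline feature of this release is o1-pro becoming available through the API. A minimal sketch of the kind of call this enables, following the SDK's Responses API as documented publicly; actually running it requires a configured `OPENAI_API_KEY`, so here we only build the request parameters that would be passed to `client.responses.create()` (the `input` prompt is an arbitrary example, not from this diff):

```python
# Sketch of a Responses API call using the newly available model.
# Only the request parameters are constructed; the commented-out lines
# show how they would be used with a real client.
request = {
    "model": "o1-pro",  # newly available through the API per this release
    "input": "Summarize this changelog entry in one sentence.",
}

# With a configured client, the call would look like:
# from openai import OpenAI
# client = OpenAI()
# response = client.responses.create(**request)
# print(response.output_text)
```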
2 changes: 2 additions & 0 deletions api.md

@@ -2,6 +2,7 @@
 
 ```python
 from openai.types import (
+    AllModels,
     ChatModel,
     ComparisonFilter,
     CompoundFilter,
@@ -14,6 +15,7 @@ from openai.types import (
     ResponseFormatJSONObject,
     ResponseFormatJSONSchema,
     ResponseFormatText,
+    ResponsesModel,
 )
 ```
 
2 changes: 1 addition & 1 deletion pyproject.toml

@@ -1,6 +1,6 @@
 [project]
 name = "openai"
-version = "1.66.5"
+version = "1.67.0"
 description = "The official Python library for the openai API"
 dynamic = ["readme"]
 license = "Apache-2.0"
2 changes: 1 addition & 1 deletion src/openai/_version.py

@@ -1,4 +1,4 @@
 # File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.
 
 __title__ = "openai"
-__version__ = "1.66.5" # x-release-please-version
+__version__ = "1.67.0" # x-release-please-version
17 changes: 9 additions & 8 deletions src/openai/resources/responses/responses.py

@@ -44,6 +44,7 @@
 from ...types.responses.parsed_response import ParsedResponse
 from ...lib.streaming.responses._responses import ResponseStreamManager, AsyncResponseStreamManager
 from ...types.responses.response_includable import ResponseIncludable
+from ...types.shared_params.responses_model import ResponsesModel
 from ...types.responses.response_input_param import ResponseInputParam
 from ...types.responses.response_stream_event import ResponseStreamEvent
 from ...types.responses.response_text_config_param import ResponseTextConfigParam
@@ -80,7 +81,7 @@ def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
         max_output_tokens: Optional[int] | NotGiven = NOT_GIVEN,
@@ -245,7 +246,7 @@ def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         stream: Literal[True],
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
@@ -410,7 +411,7 @@ def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         stream: bool,
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
@@ -575,7 +576,7 @@ def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
         max_output_tokens: Optional[int] | NotGiven = NOT_GIVEN,
@@ -892,7 +893,7 @@ async def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
         max_output_tokens: Optional[int] | NotGiven = NOT_GIVEN,
@@ -1057,7 +1058,7 @@ async def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         stream: Literal[True],
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
@@ -1222,7 +1223,7 @@ async def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         stream: bool,
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
@@ -1387,7 +1388,7 @@ async def create(
         self,
         *,
         input: Union[str, ResponseInputParam],
-        model: Union[str, ChatModel],
+        model: ResponsesModel,
         include: Optional[List[ResponseIncludable]] | NotGiven = NOT_GIVEN,
         instructions: Optional[str] | NotGiven = NOT_GIVEN,
         max_output_tokens: Optional[int] | NotGiven = NOT_GIVEN,
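The recurring change in this file widens the `model` parameter of every `create` overload from `Union[str, ChatModel]` to the new `ResponsesModel` alias, so Responses-only model names like o1-pro can be documented for type checkers while arbitrary strings remain accepted. A self-contained sketch of how such a widened alias behaves; the `Literal` members used for `ChatModel` here are illustrative stand-ins, not the library's actual generated definitions:

```python
from typing import Literal, Union

# Illustrative stand-ins: the real ChatModel and ResponsesModel are
# generated Literal unions in openai.types, far larger than shown here.
ChatModel = Literal["gpt-4o", "gpt-4o-mini"]
ResponsesModel = Union[str, ChatModel, Literal["o1-pro"]]

def create(model: ResponsesModel) -> str:
    # The widened parameter still accepts any string at runtime, while the
    # Literal members give type checkers and editors the known model names.
    return f"model={model}"
```

Because `str` is part of the union, the change is backward compatible: every value accepted by the old `Union[str, ChatModel]` annotation is also a valid `ResponsesModel`.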