Conversation

@iamemilio

Complete testing, wrapping, and coverage for the latest OpenAI Responses API and Compaction API features in the Python client.

NOTE This is a work in progress and I am a new contributor who would greatly appreciate any guidance maintainers think is needed.

Description


Fixes #3436

Type of change


  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

How Has This Been Tested?

Tests have been added to verify that the instrumentation meets the following requirements:

  • It always makes a best-effort attempt to capture telemetry data
  • It does not disrupt the behavior of the user's application in any way
  • It cannot cause timeouts or crashes
  • The captured data follows semantic conventions and behaves as expected

Does This PR Require a Core Repo Change?

  • No.

Checklist:

See contributing.md for styleguide, changelog guidelines, and more.

  • Followed the style guidelines of this project
  • Changelogs have been updated
  • Unit tests have been added
  • Documentation has been updated

Complete testing, wrapping, and coverage for the latest OpenAI
Responses API and Compaction API features in the Python client.
The tests and the OpenAI wrappers made a few assumptions about
openai types that raised linting errors. Some of these errors
were rightly raised, especially around streaming types, and
have been patched.
@iamemilio
Author

@xrmx I noticed you were making some improvements to the OpenAI v2 library. I was wondering if you could give me a little guidance on this PR. I recognize that the size is fairly large, so I am open to breaking it up into smaller changes if needed.

I especially would appreciate guidance around the semantic conventions, and the OpenTelemetry GenAI input/output message schemas:

  • Input messages schema: https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-input-messages.json
  • Output messages schema: https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-output-messages.json
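For concreteness, a minimal sketch of what message payloads shaped like these schemas might look like. The field names (`role`, `parts`, `type`, `content`, `finish_reason`) are assumptions read off the schema names and the draft GenAI semconv, and should be validated against the linked JSON schemas rather than taken as authoritative:

```python
# Hypothetical gen_ai.input.messages payload: a list of chat messages,
# each with a role and a list of typed content parts (assumed shape).
input_messages = [
    {
        "role": "user",
        "parts": [
            {"type": "text", "content": "What is the capital of France?"},
        ],
    },
]

# Hypothetical gen_ai.output.messages payload: assistant messages may
# additionally carry a finish reason (assumed shape).
output_messages = [
    {
        "role": "assistant",
        "parts": [
            {"type": "text", "content": "Paris."},
        ],
        "finish_reason": "stop",
    },
]
```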

Contributor

@xrmx left a comment


Thanks for the PR. To make it more reviewable, I would open a preparatory PR with just the move of the code out of patch, plus whatever other changes to the current code. Did you already get in touch with the GenAI people? If not, please hop into the otel-genai-instrumentation channel on Slack and share your work; SIG calls won't run until next year.

"Programming Language :: Python :: 3.13",
]
dependencies = [
"openai>=1.109.1",
Contributor

@xrmx Dec 19, 2025


Why? If this is just the first version having the Responses API, the instrumentation should continue to work with older versions.

Copy link
Author


+1, this is not a dependency. There is a lot of junk in this PR; it's a very rough draft and I am still getting my bearings, but it's clearly doing far too much. I expect to cut out a majority of it.

unwrap(openai.resources.embeddings.AsyncEmbeddings, "create")
def _uninstrument(self, **kwargs):
chat_mod = importlib.import_module("openai.resources.chat.completions")
unwrap(chat_mod.Completions, "create")
Contributor

@xrmx Dec 19, 2025


unwrap also accepts strings, like the wrap functions do, so you shouldn't need the importlib usage
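A self-contained sketch of the pattern the reviewer is pointing at. This is not the real opentelemetry-instrumentation code; `wrap_method` and `unwrap` here are toy stand-ins that mimic the assumed behavior (a dotted-path string is resolved internally, so the caller never touches importlib):

```python
import functools
from importlib import import_module


def wrap_method(owner, name):
    """Wrap owner.<name> in place. functools.wraps records the original
    callable on wrapper.__wrapped__, which unwrap() later restores."""
    original = getattr(owner, name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        # telemetry would be recorded here
        return original(*args, **kwargs)

    setattr(owner, name, wrapper)


def unwrap(obj, attr):
    """Toy version of the unwrap helper: when obj is a dotted-path
    string such as "pkg.module.Class", resolve it via importlib
    internally (assumed behavior), then restore the original method."""
    if isinstance(obj, str):
        module_path, cls_name = obj.rsplit(".", 1)
        obj = getattr(import_module(module_path), cls_name)
    func = getattr(obj, attr, None)
    if func is not None and hasattr(func, "__wrapped__"):
        setattr(obj, attr, func.__wrapped__)
```

With string support like this, the diff above could call something like `unwrap("openai.resources.chat.completions.Completions", "create")` directly, with no `importlib.import_module` in the instrumentation code.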

)

# Re-export StreamWrapper for backwards compatibility
__all__ = ["StreamWrapper"]
Contributor


Instrumentations are not APIs; I don't think we need to care about this unless we have known users.

@xrmx
Contributor

xrmx commented Dec 19, 2025

> @xrmx I noticed you were making some improvements to the OpenAI v2 library. I was wondering if you could give me a little guidance on this PR. I recognize that the size is fairly large, so I am open to breaking it up into smaller changes if needed.
>
> I especially would appreciate guidance around the semantic conventions, and the OpenTelemetry GenAI input/output message schemas:
>
> * Input messages schema: `https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-input-messages.json`
>
> * Output messages schema: `https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-output-messages.json`

If you want to implement the newer semantic conventions for genai events you have broadly two choices:

  • introduce the semconv opt-in switch with the current code and then add responses leveraging that work
  • first add responses for current semconv and then move everything

I suggest the first option so the PR adding responses would hopefully be smaller.
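The opt-in switch in the first option could look roughly like the sketch below. OpenTelemetry's HTTP semconv migration was gated on the `OTEL_SEMCONV_STABILITY_OPT_IN` environment variable; whether GenAI uses the same variable, and the exact opt-in token (`gen_ai_latest_experimental` here), are assumptions to confirm with the SIG:

```python
import os


def genai_latest_semconv_enabled() -> bool:
    """Return True when the user has opted in to the newer GenAI
    semantic conventions. Both the env var and the token value below
    are assumptions modeled on the HTTP semconv migration."""
    opt_in = os.environ.get("OTEL_SEMCONV_STABILITY_OPT_IN", "")
    tokens = {value.strip() for value in opt_in.split(",") if value.strip()}
    return "gen_ai_latest_experimental" in tokens
```

The instrumentation would then branch on this flag when emitting attributes and events, so current-semconv behavior stays the default until users explicitly opt in.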

@iamemilio
Author

I have not gotten in touch with them yet, but I will! Thank you for taking the time to review this and point me in the right direction. This work was largely exploratory, so I am more than happy to close this PR and plan a more pragmatic approach to introducing these changes :). I will work on adding the opt-in switch to gate usage of the new semantic conventions.


Labels

gen-ai Related to generative AI


Development

Successfully merging this pull request may close these issues.

Instrument OpenAI Responses API

7 participants