Null Input & Output when calling Azure OpenAI model #3977
Unanswered
haroldpijpelink
asked this question in Q&A
Replies: 1 comment 1 reply
-
Have you set
in your flow.dag.yaml? I'm getting the same issue when I run the flow through the Python file itself (via FastAPI); then there is no neat trace.
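For reference, my serving code looks roughly like the sketch below; the flow path is a placeholder, and `load_flow` / `start_trace` are the promptflow APIs I'm assuming here.

```python
# Rough sketch of the FastAPI setup, not the exact code: the flow path is a
# placeholder, and load_flow / start_trace are the promptflow APIs assumed here.
from fastapi import FastAPI
from promptflow.client import load_flow
from promptflow.tracing import start_trace

start_trace()  # start the local trace collector before any flow runs

app = FastAPI()
flow = load_flow(source="path/to/flow.dag.yaml")  # placeholder path

@app.get("/ask")
def ask(question: str):
    # Calling the flow as a function; the resulting span is the one that
    # shows up with "Unset" status and null input/output for me.
    return flow(question=question)
```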
-
Hi there everyone,
I have an issue where I can run promptflow and get the outputs I want, but tracing does not seem to work properly. Instead of a success or fail status, the output JSON shows the following:

```json
"status": {
    "status_code": "Unset",
    "description": ""
}
```
Moreover, the Input and Output fields show as Null in the UI, and I have no information on the number of tokens used.
I get this both when I use start_trace() as described in the tracing docs and when I try to access the trace through the Prompty extension. I realize this might be an issue with my OpenTelemetry settings rather than Promptflow itself, but if anyone could point me in the right direction that'd be great :)
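For context, my setup follows the tracing docs and looks roughly like the sketch below; the endpoint, key, API version, and deployment name are placeholders.

```python
# Minimal sketch of the tracing setup; endpoint, key, API version, and
# deployment name are placeholders, the structure follows the tracing docs.
from openai import AzureOpenAI
from promptflow.tracing import start_trace, trace

start_trace()  # should export spans to the local trace UI

client = AzureOpenAI(
    azure_endpoint="https://<resource>.openai.azure.com",  # placeholder
    api_key="<key>",                                        # placeholder
    api_version="2024-02-01",                               # placeholder
)

@trace
def ask(question: str) -> str:
    # The chat completion should appear as a child span with token counts,
    # but in my runs the span status stays "Unset" and input/output are null.
    response = client.chat.completions.create(
        model="<deployment-name>",  # placeholder Azure OpenAI deployment name
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Hello, world"))
```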