Stream Token Usage for Azure AI (not Azure OpenAI) Model #4663
petewhitfield asked this question in Help (unanswered)
I'm using Azure AI and Azure OpenAI for a chatbot that supports multiple providers and models. I can stream token usage with createAzure for my Azure OpenAI models, and with createDeepSeek for the DeepSeek-R1 model hosted in an Azure AI project. I also have a Llama 3.3 model deployed in the same Azure AI project as DeepSeek, but with createGroq its token usage comes back as NaN.
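Roughly, the provider setup looks like this (resource names, endpoint URLs, deployment names, and keys are placeholders, not my real values):

```ts
import { createAzure } from '@ai-sdk/azure';
import { createDeepSeek } from '@ai-sdk/deepseek';
import { createGroq } from '@ai-sdk/groq';

// Azure OpenAI deployments -- token usage streams fine here.
const azure = createAzure({
  resourceName: 'my-azure-openai-resource', // placeholder
  apiKey: process.env.AZURE_OPENAI_API_KEY!,
});

// DeepSeek-R1 deployed in an Azure AI project -- usage also streams fine.
const deepseek = createDeepSeek({
  baseURL: 'https://my-ai-project.services.ai.azure.com/models', // placeholder endpoint
  apiKey: process.env.AZURE_AI_API_KEY!,
});

// Llama 3.3 deployed in the same Azure AI project -- usage comes back as NaN.
const groq = createGroq({
  baseURL: 'https://my-ai-project.services.ai.azure.com/models', // placeholder endpoint
  apiKey: process.env.AZURE_AI_API_KEY!,
});
```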
Is there a different provider I should use for Llama in Azure AI, or a custom option that returns token usage? My understanding is that the Groq provider supports usage reporting (https://sdk.vercel.ai/providers/ai-sdk-providers/groq), so I wonder whether it fails here because the model is deployed in Azure rather than on Groq itself.
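For reference, this is roughly how I read usage after streaming (the deployment name is a placeholder):

```ts
import { streamText } from 'ai';

const result = streamText({
  model: groq('Llama-3.3-70B-Instruct'), // Azure AI deployment name (placeholder)
  prompt: 'Hello',
});

// Stream the response text to the client.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}

// With the Azure OpenAI and DeepSeek providers these resolve to real numbers;
// with createGroq pointed at the Azure AI endpoint they come back as NaN.
const usage = await result.usage;
console.log(usage.promptTokens, usage.completionTokens, usage.totalTokens);
```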