Using a ResponseFormat of "json_object" causes errors with local OpenAI-like models hosted in LM Studio #55
Labels
Investigating
Issue is being investigated
triage
Initial state for our team to determine necessary action
Version
main
Describe the bug
I need to use "json_schema" here per the documentation: https://lmstudio.ai/docs/advanced/structured-output
I do not know why they made it different from OpenAI-hosted models.
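To illustrate the mismatch, here is a minimal sketch of the two `response_format` shapes, assuming the request bodies described in the OpenAI chat completions API and the LM Studio structured-output docs linked above (the schema name and fields are hypothetical, not from the WeatherBot sample):

```python
# What the SDK currently sends -- rejected by LM Studio:
openai_style = {"response_format": {"type": "json_object"}}

# What LM Studio expects per its docs: a "json_schema" response format
# with an explicit JSON schema attached (names/fields are made up here).
weather_schema = {
    "type": "object",
    "properties": {
        "location": {"type": "string"},
        "temperature_c": {"type": "number"},
    },
    "required": ["location", "temperature_c"],
}

lmstudio_style = {
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "weather_report",
            "strict": True,
            "schema": weather_schema,
        },
    }
}
```

So the error is not just a different type string: LM Studio also requires the schema itself to be embedded in the request.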
To Reproduce
Use the WeatherBot sample with a local model hosted with LM Studio.
Define some JSON schema in LM Studio:
Use local endpoints:
AIServices:LocalModel:ModelName = qwen2.5-7b-instruct
AIServices:LocalModel:Endpoint = http://127.0.0.1:1234/v1/
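For reference, a sketch of the chat completions request the sample would need to send against that local endpoint (the request is only built here, not sent, since sending it requires a running LM Studio server; the schema contents are hypothetical):

```python
import json
import urllib.request

ENDPOINT = "http://127.0.0.1:1234/v1/"
MODEL = "qwen2.5-7b-instruct"

payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    # LM Studio's structured-output shape (per its docs); the schema
    # name and properties below are made up for illustration.
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "weather",
            "schema": {
                "type": "object",
                "properties": {"summary": {"type": "string"}},
            },
        },
    },
}

# Build (but do not send) the POST request to the local endpoint.
request = urllib.request.Request(
    ENDPOINT + "chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
```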
Additional context
Maybe this is more of a feature request: consider supporting local, OpenAI-like models that aren't OpenAI-hosted models.