
SOCAI error 400 #745

Open
kazhuyo opened this issue Jul 18, 2024 · 0 comments
Assignees
Labels
needs-triage Needs to be triaged

Comments


kazhuyo commented Jul 18, 2024

When trying to process an alert with soc-ai, the Docker logs return error 400.

Here's the relevant part of the error from the socai Docker container:

request to GPT: status code '400' received '{
  "error": {
    "message": "This model's maximum context length is 16385 tokens. However, your messages resulted in 19181 tokens. Please reduce the length of the messages.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": "context_length_exceeded"
  }
}'

I think the default model for socai is gpt-3.5-turbo-16k, which would need to be changed to gpt-4-turbo to support prompts above 16K tokens.
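Alongside switching models, the 400 could also be avoided by trimming the alert context before sending it. A minimal sketch of that idea follows; the function names, message shape, and the ~4-characters-per-token estimate are illustrative assumptions, not the actual soc-ai code (a real implementation would count tokens with the model's tokenizer):

```python
# Hypothetical sketch: drop the oldest non-system messages until the
# estimated prompt size fits the model's context window.
# MAX_CONTEXT_TOKENS matches the limit reported in the error above.

MAX_CONTEXT_TOKENS = 16385  # gpt-3.5-turbo-16k context window
RESPONSE_BUDGET = 1024      # tokens reserved for the model's reply


def approx_tokens(text: str) -> int:
    """Rough estimate: about 4 characters per token for English text."""
    return max(1, len(text) // 4)


def fit_messages(messages, limit=MAX_CONTEXT_TOKENS - RESPONSE_BUDGET):
    """Keep the system prompt; drop oldest other messages until the
    estimated total token count is at or under `limit`."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]

    def total(kept):
        return sum(approx_tokens(m["content"]) for m in system + kept)

    kept = list(rest)
    while kept and total(kept) > limit:
        kept.pop(0)  # discard the oldest non-system message first
    return system + kept
```

This keeps the most recent alert context (usually the most relevant) while guaranteeing the request stays under the window, regardless of which model is configured.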

@kazhuyo kazhuyo added the bug label Jul 18, 2024
@osmontero osmontero added the needs-triage Needs to be triaged label Jan 22, 2025
@osmontero osmontero changed the title [BUG] SOCAI error 400 SOCAI error 400 Jan 22, 2025
@osmontero osmontero removed the bug label Mar 16, 2025
@osmontero osmontero moved this to 🆕 New in UTMStack Mar 25, 2025