Hard crash while using Gemini 2.0 Thinking #5552
-
The latest commit, 591a019, coincidentally updates the Google dependencies. Try the latest; I can't reproduce this on the same commit anyway.
-
@danny-avila this needs to be handled. It makes the instance stop/restart whenever GenAI returns malformed data, for example due to network issues.
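Not LibreChat's actual code, just a self-contained Node.js sketch of that failure mode (the stream and function names are made up for illustration): an error thrown while iterating a response stream, with nothing catching it, becomes an unhandled promise rejection, and Node.js 15+ exits the process by default, which matches the stop/restart behaviour described.

```ts
// Hypothetical stand-in for the upstream SDK stream; throws partway through,
// the way a malformed or cut-off response might.
async function* fakeUpstreamStream(): AsyncGenerator<string> {
  yield 'first chunk ';
  throw new Error('malformed data from upstream');
}

async function streamToClient(stream: AsyncIterable<string>): Promise<void> {
  for await (const chunk of stream) {
    process.stdout.write(chunk);
  }
}

// Fire-and-forget with no .catch(): the rejection is unhandled, and Node.js 15+
// terminates the process by default, i.e. the whole instance stops/restarts.
streamToClient(fakeUpstreamStream());

// A last-resort guard that logs instead of crashing (remove it to see the
// default behaviour); the real fix is a try/catch around the stream itself.
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection while streaming:', reason);
});
```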
-
I still can't replicate this, but adding a try/catch block around the stream generation should fix it. The following prompt does get cut off due to Google's censoring, but I may not be receiving the error because the abort controller may trigger before the upstream SDK can bubble it up:

> Tell me a story in two paragraphs, the first paragraph should be in English, the second paragraph should be in Cyrillic alphabet

Relevant upstream code:
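For illustration only (not the upstream code): a minimal sketch of that try/catch, assuming the `@google/generative-ai` Node SDK; the model name and `GOOGLE_KEY` env var are placeholders.

```ts
import { GoogleGenerativeAI } from '@google/generative-ai';

const genAI = new GoogleGenerativeAI(process.env.GOOGLE_KEY ?? '');
const model = genAI.getGenerativeModel({
  model: 'gemini-2.0-flash-thinking-exp-01-21',
});

// Wrap the streaming loop so a mid-stream failure (blocked candidate, malformed
// chunk, dropped connection) is logged and the partial text is returned,
// instead of bubbling up as an unhandled rejection that kills the process.
export async function generateWithGuard(prompt: string): Promise<string> {
  let text = '';
  try {
    const result = await model.generateContentStream(prompt);
    for await (const chunk of result.stream) {
      text += chunk.text(); // text() itself can throw on blocked content
    }
  } catch (err) {
    console.error('Gemini stream failed mid-response:', err);
  }
  return text;
}
```

Returning the partial text keeps whatever the model streamed before the cut-off, which is roughly how the censored prompt above behaves today, minus the crash.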
-
What happened?
The model is gemini-2.0-flash-thinking-exp-01-21; right now it's quite unstable for me and returns 500 Internal Server Error for almost any request. An immediate 500 by itself is handled correctly.
However, if the model starts responding and then fails while streaming the response, it seems to crash the whole LibreChat instance.
I haven't found any correlation between this and the GOOGLE_SAFETY settings.
LibreChat Version: d60a149 (latest main commit at the time of writing)
Steps to Reproduce
Add gemini-2.0-flash-thinking-exp-01-21 to GOOGLE_MODELS (see the example below).
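A hypothetical .env excerpt, assuming GOOGLE_MODELS takes a comma-separated list of model names; the second model is only there to show the list format:

```
# Hypothetical LibreChat .env excerpt
GOOGLE_MODELS=gemini-2.0-flash-thinking-exp-01-21,gemini-1.5-pro
```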
What browsers are you seeing the problem on?
No response
Relevant log output
Service:
Error log:
Screenshots