[Feature Request] A way to see generated prompt and response #129
Replies: 4 comments
-
@BertanAygun From what I'm seeing, it looks like most of that is there if you run `docker logs containername -f`.
-
I can try to update, but the document data and the line below are all I am seeing for the AI requests in the logs: [DEBUG] [08.01.25, 19:21] OpenAI request sent
-
Actually, what I was seeing was the source document data, not the prompt/response. Sorry.
-
Sorry, I forgot to mention that this feature is already implemented. You have full log access in the container folder /logs.
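For anyone else looking: a minimal sketch of how to get at those logs, assuming a hypothetical container name `paperless-gpt` and log file name (substitute your own):

```shell
# List the log files inside the container
# (container name and log file name are placeholders, not the project's actual names)
docker exec paperless-gpt ls /logs

# Follow a specific log file as it is written
docker exec paperless-gpt tail -f /logs/llm.log

# Or follow the container's stdout/stderr stream instead
docker logs paperless-gpt -f
```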
-
I was trying to restrict the prompt to certain correspondents only, but so far I have failed to prevent it from generating new correspondent entries. It would be easier if I were able to see the final generated prompt and the raw response in a log somewhere.
As far as I can see, there is no logging to files, and the console logs do not include the request/response to the LLM. An option to log all LLM interactions to a file would be great in these cases.