Ability to inspect the raw request going out to the LLM provider
complete
Sepia Porpoise
Sometimes I am not able to reproduce the output I see in Langtail elsewhere, even after retrying multiple times. It would be great if I could see the raw request to and response from the LLM provider for each generation in the chat UI.
This would also make it easier to check other response values, such as the system fingerprint, while developing a prompt.
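As a point of comparison, outside Langtail the OpenAI Python SDK's with_raw_response helper exposes the underlying HTTP exchange alongside the parsed result, including the system fingerprint. A minimal sketch (the model name and prompt are placeholders, not from an actual log):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# .with_raw_response returns the parsed completion plus the raw HTTP exchange
raw = client.chat.completions.with_raw_response.create(
    model="gpt-4o",  # placeholder model
    messages=[{"role": "user", "content": "Hello"}],
)

# The httpx request/response pair is what actually went over the wire
print(raw.http_response.request.content.decode())  # raw request body sent to OpenAI
print(raw.http_response.status_code)

completion = raw.parse()
print(completion.system_fingerprint)  # handy when chasing non-reproducible outputs
```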
Tomas Rychlik
complete
Tomas Rychlik
in progress
You can now find the raw request in the logs; we will make this a bit easier to find in the chat UI.
Tomas Rychlik
Just to clarify: all requests from the chat UI (when you click send) have been logged in Logs for a few months now :)
Tomas Rychlik
It's a bit hidden, but if you go to Logs, click on a specific log, and switch the "Request to provider" view to JSON, you should be able to see all the request data that went to OpenAI.
It might make sense to also expose this information (or at least link to the log) somewhere in the chat UI.
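Once copied from that view, the JSON can be replayed directly against OpenAI to try to reproduce a generation. A minimal sketch with the OpenAI Python SDK, assuming the logged payload is a standard Chat Completions body (the inline JSON below is a made-up placeholder, not a real log entry):

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Placeholder payload; in practice, paste the JSON copied from the
# "Request to provider" view of a Langtail log.
raw_request = json.loads("""
{
  "model": "gpt-4o",
  "temperature": 0,
  "messages": [{"role": "user", "content": "Hello"}]
}
""")

# Replay the exact request and compare the output with the logged one
completion = client.chat.completions.create(**raw_request)
print(completion.choices[0].message.content)
print(completion.system_fingerprint)
```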