Change used model and parameters in Deployed prompt API
complete
Tomas Rychlik
Be able to override the configuration in the prompt API, e.g. change the model version, max_tokens, etc.
Tomas Rychlik
complete
You can now override any OpenAI fields in a prompt API invocation:
- model
- temperature
- top_p
- presence_penalty
- frequency_penalty
- tools
- stop
- tool_choice
- response_format
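A minimal sketch of what such an override could look like in a request body. This is an illustration only: the payload shape, the `variables` key, and the field values are assumptions, not the documented API.

```python
import json

# Hypothetical override payload: each key corresponds to one of the
# OpenAI fields listed above. Values here are illustrative.
overrides = {
    "model": "gpt-4o-mini",                       # swap the deployed model
    "temperature": 0.2,                            # more deterministic output
    "max_tokens": 256,                             # cap the completion length
    "stop": ["\n\n"],                              # custom stop sequence
    "response_format": {"type": "json_object"},    # force JSON output
}

# Merge the overrides into the invocation body; "variables" is a
# hypothetical key for prompt template inputs.
payload = {
    "variables": {"topic": "pricing"},
    **overrides,
}

body = json.dumps(payload)
print(body)
```

Any field omitted from the payload would keep the value configured in the deployed prompt.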
Petr Brzek
planned