Currently Supported Providers
OpenAI
Below are the default_params and model name for OpenAI. Specify override values in the config.yaml
file.
```yaml
model: "gpt-3.5-turbo"
default_params:
  temperature: 0.8
  top_p: 1
  max_tokens: 100
  n: 1
  stop:
    - ""
  frequency_penalty: 0
  presence_penalty: 0
  logit_bias: null
  user: null
  seed: null
  tools: []
  tool_choice: null
  response_format: null
```
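To override any of these defaults, redeclare only the keys you want to change in config.yaml. The fragment below is a sketch: the surrounding provider-entry nesting (`openai:` as the key) is an assumption, and the exact layout depends on your deployment's config schema.

```yaml
# Hypothetical config.yaml override fragment.
# Only the keys listed here change; all other params keep the
# defaults shown above.
openai:
  model: "gpt-4"        # illustrative: replaces the default gpt-3.5-turbo
  default_params:
    temperature: 0.5    # more deterministic than the default 0.8
    max_tokens: 256     # longer completions than the default 100
```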
Azure OpenAI
Below are the default_params for Azure OpenAI. Specify override values in the config.yaml
file.
There is no default model for Azure OpenAI.
```yaml
default_params:
  temperature: 0.8
  top_p: 1
  max_tokens: 100
  n: 1
  stop:
    - ""
  frequency_penalty: 0
  presence_penalty: 0
  logit_bias: null
  user: null
  seed: null
  tools: []
  tool_choice: null
  response_format: null
```
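Since Azure OpenAI has no default model, the model must be set explicitly in config.yaml. A minimal sketch, assuming the same provider-entry nesting as above; the deployment name is a placeholder:

```yaml
# Hypothetical config.yaml fragment for Azure OpenAI.
# "my-gpt35-deployment" is a placeholder for your own Azure
# deployment name; the nesting is an assumption.
azureopenai:
  model: "my-gpt35-deployment"   # required: there is no default model
  default_params:
    temperature: 0.7
```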
Cohere
Below are the default_params and model name for Cohere. Specify override values in the config.yaml
file.
```yaml
model: "command-light"
default_params:
  temperature: 0.8
  preamble_override: ""
  chat_history: []
  conversation_id: ""
  prompt_truncation: ""
  connectors: []
  search_queries_only: false
  citation_quality: ""
```
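Cohere overrides work the same way: restate the keys you want to change in config.yaml. A sketch, again assuming a provider-entry key (`cohere:`) that is not specified in this document:

```yaml
# Hypothetical config.yaml override fragment for Cohere.
# The "cohere:" nesting is an assumption; unlisted params keep
# the defaults shown above.
cohere:
  model: "command"    # illustrative: swap the default command-light
  default_params:
    temperature: 0.3
    preamble_override: "You are a concise technical assistant."
```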
OctoML
Below are the default_params and model name for OctoML. Specify override values in the config.yaml
file.
```yaml
model: "mistral-7b-instruct-fp16"
default_params:
  temperature: 1
  top_p: 1
  max_tokens: 100
  stop:
    - ""
  frequency_penalty: 0
  presence_penalty: 0
```
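Note that OctoML's default temperature is 1, higher than the 0.8 used by the other providers. A sketch of an override, keeping the default model and assuming an `octoml:` provider key not specified here:

```yaml
# Hypothetical config.yaml override fragment for OctoML.
# The "octoml:" nesting is an assumption; the default model
# (mistral-7b-instruct-fp16) is left unchanged.
octoml:
  default_params:
    temperature: 0.7   # lower than the provider default of 1
    max_tokens: 256    # longer completions than the default 100
```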