semantic_router.llms.openai
OpenAILLM Objects
LLM for OpenAI. Requires an OpenAI API key from https://platform.openai.com/api-keys.
__init__
Initialize the OpenAILLM.
Arguments:
name (Optional[str]): The name of the OpenAI model to use.
openai_api_key (Optional[str]): The OpenAI API key.
temperature (float): The temperature of the LLM.
max_tokens (int): The maximum number of tokens to generate.
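On construction, the API key can come either from the `openai_api_key` argument or from the environment. A minimal sketch of that resolution logic (the helper name `resolve_api_key` is illustrative, not the library's actual code), assuming the standard `OPENAI_API_KEY` environment variable:

```python
import os
from typing import Optional

def resolve_api_key(openai_api_key: Optional[str] = None) -> str:
    """Return the explicit key if given, else fall back to the environment."""
    key = openai_api_key or os.environ.get("OPENAI_API_KEY")
    if key is None:
        raise ValueError("OpenAI API key cannot be 'None'.")
    return key
```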
async_extract_tool_calls_info
Extract tool call information (function names and arguments) from the tool calls.
Arguments:
tool_calls (List[ChatCompletionMessageToolCall]): The tool calls to extract the information from.
Returns:
List[Dict[str, Any]]: The tool calls information.
__call__
Call the OpenAILLM.
Arguments:
messages (List[Message]): The messages to pass to the OpenAILLM.
function_schemas (Optional[List[Dict[str, Any]]]): The function schemas to pass to the OpenAILLM.
Returns:
str: The response from the OpenAILLM.
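A minimal sketch of how the messages map onto a chat-completions request payload (the `SimpleMessage` type and `build_payload` helper are illustrative; the real class sends this through the official `openai` client):

```python
from dataclasses import dataclass
from typing import Any, Dict, List

@dataclass
class SimpleMessage:
    """Illustrative stand-in for semantic_router's Message."""
    role: str
    content: str

def build_payload(messages: List[SimpleMessage], model: str,
                  temperature: float = 0.01, max_tokens: int = 200) -> Dict[str, Any]:
    """Convert messages into the dict shape a chat-completions call expects."""
    return {
        "model": model,
        "messages": [{"role": m.role, "content": m.content} for m in messages],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
```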
acall
Call the OpenAILLM asynchronously.
Arguments:
messages (List[Message]): The messages to pass to the OpenAILLM.
function_schemas (Optional[List[Dict[str, Any]]]): The function schemas to pass to the OpenAILLM.
Returns:
str: The response from the OpenAILLM.
extract_function_inputs
Extract the function inputs from the query.
Arguments:
query (str): The query to extract the function inputs from.
function_schemas (List[Dict[str, Any]]): The function schemas to extract the function inputs from.
Returns:
List[Dict[str, Any]]: The function inputs.
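In effect, the method asks the model to choose a function and fill in its arguments, then checks the parsed arguments against the schema. A hedged standalone sketch of that validation step (the function name and schema layout follow OpenAI's function-calling format; the library's own checks may be stricter):

```python
import json
from typing import Any, Dict

def parse_function_inputs(raw_arguments: str,
                          schema: Dict[str, Any]) -> Dict[str, Any]:
    """Parse a JSON argument string and verify required parameters are present."""
    inputs = json.loads(raw_arguments)
    required = schema["function"]["parameters"].get("required", [])
    missing = [p for p in required if p not in inputs]
    if missing:
        raise ValueError(f"Missing required arguments: {missing}")
    return inputs
```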
async_extract_function_inputs
Extract the function inputs from the query asynchronously.
Arguments:
query (str): The query to extract the function inputs from.
function_schemas (List[Dict[str, Any]]): The function schemas to extract the function inputs from.
Returns:
List[Dict[str, Any]]: The function inputs.
get_schemas_openai
Get function schemas for the OpenAI LLM from a list of functions.
Arguments:
items (List[Callable]): The functions to get function schemas for.
Returns:
List[Dict[str, Any]]: The schemas for the OpenAI LLM.
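A schema for each function can be derived from its signature and docstring. A minimal sketch using the standard `inspect` module (the output structure follows OpenAI's function-calling format; the type mapping here is deliberately simplified, and the library's own implementation handles more cases):

```python
import inspect
from typing import Any, Callable, Dict, List

# Simplified mapping from Python annotations to JSON-schema types.
_TYPE_MAP = {int: "number", float: "number", str: "string", bool: "boolean"}

def _get_schema_openai(fn: Callable) -> Dict[str, Any]:
    """Build an OpenAI-style function schema from a callable's signature."""
    properties: Dict[str, Any] = {}
    required: List[str] = []
    for name, param in inspect.signature(fn).parameters.items():
        json_type = _TYPE_MAP.get(param.annotation, "string")
        properties[name] = {"type": json_type, "description": f"Parameter '{name}'."}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn) or "",
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

def get_schemas_openai(items: List[Callable]) -> List[Dict[str, Any]]:
    """Build one schema per function."""
    return [_get_schema_openai(fn) for fn in items]
```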