LLM for OpenAI. Requires an OpenAI API key from https://platform.openai.com/api-keys.
Initialize the OpenAILLM.

Arguments:
- name (Optional[str]): The name of the OpenAI model to use.
- openai_api_key (Optional[str]): The OpenAI API key.
- temperature (float): The temperature of the LLM.
- max_tokens (int): The maximum number of tokens to generate.
Extract the tool calls information from the tool calls.

Arguments:
- tool_calls (List[ChatCompletionMessageToolCall]): The tool calls to extract the information from.

Returns:
- List[Dict[str, Any]]: The tool calls information.
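The sketch below illustrates the idea behind this step: each tool call carries a function name and a JSON-encoded argument string, which is parsed into a plain dict. It is an illustrative reimplementation under those assumptions, not the library's exact code, and the output keys are assumed.

```python
# Illustrative sketch of extracting tool-call information; the output keys
# ("function_name", "arguments") are assumptions, not the library's guaranteed format.
import json
from typing import Any, Dict, List

from openai.types.chat import ChatCompletionMessageToolCall


def extract_tool_calls_info(
    tool_calls: List[ChatCompletionMessageToolCall],
) -> List[Dict[str, Any]]:
    return [
        {
            "function_name": call.function.name,
            "arguments": json.loads(call.function.arguments),
        }
        for call in tool_calls
    ]
```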
Call the OpenAILLM.

Arguments:
- messages (List[Message]): The messages to pass to the OpenAILLM.
- function_schemas (Optional[List[Dict[str, Any]]]): The function schemas to pass to the OpenAILLM.

Returns:
- str: The response from the OpenAILLM.
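A usage sketch for a plain (non-tool) call is below; it assumes `Message` is the role/content message type from `semantic_router.schema` and reuses the `llm` instance created earlier.

```python
# Sketch of a plain call; the Message import path is an assumption.
from semantic_router.schema import Message

messages = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="What time is it in Rome?"),
]

reply = llm(messages)  # returns the model's response as a string

# Passing function_schemas switches the call into tool-use mode (see the
# schema helpers documented further below).
# reply = llm(messages, function_schemas=schemas)
```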
Call the OpenAILLM asynchronously.

Arguments:
- messages (List[Message]): The messages to pass to the OpenAILLM.
- function_schemas (Optional[List[Dict[str, Any]]]): The function schemas to pass to the OpenAILLM.

Returns:
- str: The response from the OpenAILLM.
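An async call might look like the following; the method name `acall` is an assumption, since this reference does not name it.

```python
# Async sketch; the method name `acall` is an assumption.
import asyncio


async def main() -> None:
    reply = await llm.acall(messages)
    print(reply)


asyncio.run(main())
```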
Extract the function inputs from the query.

Arguments:
- query (str): The query to extract the function inputs from.
- function_schemas (List[Dict[str, Any]]): The function schemas to extract the function inputs from.

Returns:
- List[Dict[str, Any]]: The function inputs.
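The sketch below shows turning a natural-language query into concrete function arguments. The method name `extract_function_inputs` is an assumption; the schema uses the standard OpenAI tool-schema shape.

```python
# Sketch of extracting function inputs from a query; the method name is an assumption.
schemas = [
    {
        "type": "function",
        "function": {
            "name": "get_time",
            "description": "Get the current time in a timezone.",
            "parameters": {
                "type": "object",
                "properties": {"timezone": {"type": "string"}},
                "required": ["timezone"],
            },
        },
    }
]

inputs = llm.extract_function_inputs(
    query="What time is it in Rome?",
    function_schemas=schemas,
)
# Expected: a list of dicts mapping the chosen function to its parsed arguments.
```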
Extract the function inputs from the query asynchronously.

Arguments:
- query (str): The query to extract the function inputs from.
- function_schemas (List[Dict[str, Any]]): The function schemas to extract the function inputs from.

Returns:
- List[Dict[str, Any]]: The function inputs.
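The async counterpart could be used as sketched below; the method name `async_extract_function_inputs` is an assumption.

```python
# Async counterpart; the method name is an assumption.
import asyncio


async def main() -> None:
    inputs = await llm.async_extract_function_inputs(
        query="What time is it in Rome?",
        function_schemas=schemas,
    )
    print(inputs)


asyncio.run(main())
```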
Get function schemas for the OpenAI LLM from a list of functions.

Arguments:
- items (List[Callable]): The functions to get function schemas for.

Returns:
- List[Dict[str, Any]]: The schemas for the OpenAI LLM.
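A sketch of generating schemas from plain Python callables is below. The helper name `get_schemas_openai` and its import path are assumptions, as is the exact schema format it returns.

```python
# Sketch: build OpenAI function schemas from callables; helper name and import
# path are assumptions.
from datetime import datetime
from zoneinfo import ZoneInfo

from semantic_router.llms.openai import get_schemas_openai


def get_time(timezone: str) -> str:
    """Get the current time in a timezone."""
    return datetime.now(ZoneInfo(timezone)).strftime("%H:%M")


schemas = get_schemas_openai([get_time])
# Each entry describes a function's name, docstring, and typed parameters,
# ready to pass as function_schemas in the calls documented above.
```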