OpenAILLM Objects
__init__
name(Optional[str]): The name of the OpenAI model to use.
openai_api_key(Optional[str]): The OpenAI API key.
temperature(float): The temperature of the LLM.
max_tokens(int): The maximum number of tokens to generate.
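A minimal construction sketch, assuming OpenAILLM is importable from semantic_router.llms (the path may differ between versions) and that an OpenAI API key is available; the model name is purely illustrative.

```python
# Sketch only: the import path and model name are assumptions, not part of
# the reference above. Requires a valid OpenAI API key.
import os

from semantic_router.llms import OpenAILLM  # assumed import path

llm = OpenAILLM(
    name="gpt-4o-mini",                           # illustrative model name
    openai_api_key=os.environ["OPENAI_API_KEY"],  # or rely on the env var default
    temperature=0.0,
    max_tokens=200,
)
```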
async_extract_tool_calls_info
tool_calls(List[ChatCompletionMessageToolCall]): The tool calls to extract the information from.
List[Dict[str, Any]]: The tool calls information.
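A hedged usage sketch: the direct AsyncOpenAI call below exists only to produce ChatCompletionMessageToolCall objects to feed the helper, and the import path, model name, and printed output shape are assumptions. It expects OPENAI_API_KEY to be set in the environment.

```python
import asyncio

from openai import AsyncOpenAI
from semantic_router.llms.openai import OpenAILLM  # assumed import path

# OpenAI-format tool definition used to coax the model into returning tool calls.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_time",
            "description": "Get the current time for a timezone.",
            "parameters": {
                "type": "object",
                "properties": {"timezone": {"type": "string"}},
                "required": ["timezone"],
            },
        },
    }
]


async def main() -> None:
    client = AsyncOpenAI()
    response = await client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": "What time is it in Rome?"}],
        tools=TOOLS,
    )
    tool_calls = response.choices[0].message.tool_calls or []

    llm = OpenAILLM(name="gpt-4o-mini")
    info = await llm.async_extract_tool_calls_info(tool_calls)
    print(info)  # list of dicts describing the requested tool calls


asyncio.run(main())
```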
__call__
messages(List[Message]): The messages to pass to the OpenAILLM.
function_schemas(Optional[List[Dict[str, Any]]]): The function schemas to pass to the OpenAILLM.
str: The response from the OpenAILLM.
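A minimal calling sketch, assuming OpenAILLM is importable from semantic_router.llms and Message from semantic_router.schema (paths may vary), with OPENAI_API_KEY set in the environment and an illustrative model name.

```python
from semantic_router.llms import OpenAILLM   # assumed import paths
from semantic_router.schema import Message

llm = OpenAILLM(name="gpt-4o-mini", temperature=0.0, max_tokens=200)

messages = [
    Message(role="system", content="You are a concise assistant."),
    Message(role="user", content="Name the capital of France."),
]

# __call__ returns the model's reply as a plain string.
reply = llm(messages)
print(reply)
```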
acall
messages(List[Message]): The messages to pass to the OpenAILLM.
function_schemas(Optional[List[Dict[str, Any]]]): The function schemas to pass to the OpenAILLM.
str: The response from the OpenAILLM.
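The async variant is awaited inside an event loop; this sketch carries the same assumptions about import paths, model name, and environment as the example above.

```python
import asyncio

from semantic_router.llms import OpenAILLM   # assumed import paths
from semantic_router.schema import Message


async def main() -> None:
    llm = OpenAILLM(name="gpt-4o-mini")
    messages = [Message(role="user", content="Give one fun fact about otters.")]
    reply = await llm.acall(messages)  # returns the response string
    print(reply)


asyncio.run(main())
```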
extract_function_inputs
query(str): The query to extract the function inputs from.
function_schemas(List[Dict[str, Any]]): The function schemas to extract the function inputs from.
List[Dict[str, Any]]: The function inputs.
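A sketch of mapping a natural-language query onto function inputs, using get_schemas_openai (documented below) to build the schemas; the import path, the example function, and the shape of the printed result are assumptions.

```python
from semantic_router.llms.openai import OpenAILLM, get_schemas_openai  # assumed path


def get_time(timezone: str) -> str:
    """Get the current time for a timezone, e.g. 'Europe/Rome'."""
    ...


llm = OpenAILLM(name="gpt-4o-mini")  # illustrative model name
schemas = get_schemas_openai([get_time])

# Asks the LLM to fill in the function's parameters from the query.
inputs = llm.extract_function_inputs(
    query="What time is it in Rome right now?",
    function_schemas=schemas,
)
print(inputs)  # a list of dicts describing the call(s) to make
```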
async_extract_function_inputs
query(str): The query to extract the function inputs from.
function_schemas(List[Dict[str, Any]]): The function schemas to extract the function inputs from.
List[Dict[str, Any]]: The function inputs.
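The async counterpart is used the same way, awaited inside an event loop; the same assumptions as the previous sketch apply.

```python
import asyncio

from semantic_router.llms.openai import OpenAILLM, get_schemas_openai  # assumed path


def get_time(timezone: str) -> str:
    """Get the current time for a timezone, e.g. 'Europe/Rome'."""
    ...


async def main() -> None:
    llm = OpenAILLM(name="gpt-4o-mini")
    schemas = get_schemas_openai([get_time])
    inputs = await llm.async_extract_function_inputs(
        query="What time is it in Rome right now?",
        function_schemas=schemas,
    )
    print(inputs)


asyncio.run(main())
```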
get_schemas_openai
items(List[Callable]): The functions to get function schemas for.
List[Dict[str, Any]]: The schemas for the OpenAI LLM.
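A short sketch of schema generation from plain Python callables; the import path is an assumption, and the output is expected to follow OpenAI's function-calling schema format, built from each callable's signature and docstring.

```python
import json

from semantic_router.llms.openai import get_schemas_openai  # assumed import path


def get_time(timezone: str) -> str:
    """Get the current time for a timezone, e.g. 'Europe/Rome'."""
    ...


def add(a: int, b: int) -> int:
    """Add two integers."""
    ...


# Each entry is an OpenAI-style function schema derived from the callable.
schemas = get_schemas_openai([get_time, add])
print(json.dumps(schemas, indent=2))
```

The resulting list can be passed directly as the function_schemas argument of __call__, acall, extract_function_inputs, or async_extract_function_inputs above.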
