**Parameters:**

- `name` (`Optional[str]`): The name of the OpenAI model to use.
- `openai_api_key` (`Optional[str]`): The OpenAI API key.
- `temperature` (`float`): The temperature of the LLM.
- `max_tokens` (`int`): The maximum number of tokens to generate.
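A minimal construction sketch using the parameters above, assuming the `OpenAILLM` wrapper is importable from the `semantic_router.llms` package and that the API key is read from the environment; the model name and values shown are only examples.

```python
import os

# Import path assumed; adjust to the package layout of your installation.
from semantic_router.llms import OpenAILLM

llm = OpenAILLM(
    name="gpt-4o-mini",                           # example model name
    openai_api_key=os.environ["OPENAI_API_KEY"],  # the OpenAI API key
    temperature=0.01,                             # low temperature for stable output
    max_tokens=200,                               # cap on generated tokens
)
```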
**Parameters:**

- `tool_calls` (`List[ChatCompletionMessageToolCall]`): The tool calls to extract the information from.

**Returns:**

- `List[Dict[str, Any]]`: The tool calls information.
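To illustrate what the extracted tool calls information can look like, here is a standalone sketch (not the library's own code) that flattens `ChatCompletionMessageToolCall` objects from the `openai` client into plain dictionaries; the output keys are an assumption.

```python
import json
from typing import Any, Dict, List

from openai.types.chat import ChatCompletionMessageToolCall


def tool_calls_to_dicts(
    tool_calls: List[ChatCompletionMessageToolCall],
) -> List[Dict[str, Any]]:
    # Keep each called function's name and its JSON-decoded arguments.
    return [
        {
            "function_name": call.function.name,
            "arguments": json.loads(call.function.arguments),
        }
        for call in tool_calls
    ]
```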
**Parameters:**

- `messages` (`List[Message]`): The messages to pass to the OpenAILLM.
- `function_schemas` (`Optional[List[Dict[str, Any]]]`): The function schemas to pass to the OpenAILLM.

**Returns:**

- `str`: The response from the OpenAILLM.
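A usage sketch for this call, reusing the `llm` instance constructed earlier; the `Message` import path and the example schema are assumptions.

```python
from semantic_router.schema import Message  # import path assumed

messages = [
    Message(role="system", content="You are a concise assistant."),
    Message(role="user", content="What time is it in Rome?"),
]

# Plain completion: returns the model's reply as a string.
reply = llm(messages)

# Hypothetical function schema used to steer the model toward tool use.
get_time_schema = {
    "type": "function",
    "function": {
        "name": "get_time",
        "description": "Get the current time for a timezone.",
        "parameters": {
            "type": "object",
            "properties": {"timezone": {"type": "string"}},
            "required": ["timezone"],
        },
    },
}

# With schemas supplied, the response is still returned as a string.
tool_reply = llm(messages, function_schemas=[get_time_schema])
```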
**Parameters:**

- `query` (`str`): The query to extract the function inputs from.
- `function_schemas` (`List[Dict[str, Any]]`): The function schemas to extract the function inputs from.

**Returns:**

- `List[Dict[str, Any]]`: The function inputs.
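A sketch of extracting function inputs from a natural-language query, reusing `llm` and `get_time_schema` from the earlier examples; the method name `extract_function_inputs` and the shape of the result are assumptions based on the reference above.

```python
query = "What time is it in Rome?"

# Method name assumed; returns the inputs the model chose for the schema.
function_inputs = llm.extract_function_inputs(
    query=query,
    function_schemas=[get_time_schema],
)
# Illustrative result:
# [{"function_name": "get_time", "arguments": {"timezone": "Europe/Rome"}}]
```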
**Parameters:**

- `items` (`List[Callable]`): The functions to get function schemas for.

**Returns:**

- `List[Dict[str, Any]]`: The schemas for the OpenAI LLM.
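A sketch of turning plain Python callables into OpenAI-style schemas; the helper name `get_schemas_openai` and its import path are assumptions, and the docstring format that gets picked up for descriptions may vary.

```python
# Helper name and import path assumed; adjust to your installation.
from semantic_router.llms.openai import get_schemas_openai


def get_time(timezone: str) -> str:
    """Get the current time for a timezone.

    :param timezone: An IANA timezone name, e.g. "Europe/Rome".
    """
    from datetime import datetime
    from zoneinfo import ZoneInfo

    return datetime.now(ZoneInfo(timezone)).strftime("%H:%M")


schemas = get_schemas_openai([get_time])  # -> List[Dict[str, Any]] of schemas
```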