LlamaCppLLM Objects
__init__
Arguments:

- llm (Any): The LLM to use.
- name (str): The name of the LLM.
- temperature (float): The temperature of the LLM.
- max_tokens (Optional[int]): The maximum number of tokens to generate.
- grammar (Optional[Any]): The grammar to use.
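A minimal sketch of how these constructor parameters fit together. The class name `LlamaCppLLMSketch` and the default values are hypothetical stand-ins, not the library's actual implementation; the real class wraps a loaded llama.cpp model object.

```python
from typing import Any, Optional


class LlamaCppLLMSketch:
    """Hypothetical stand-in showing the documented __init__ parameters."""

    def __init__(
        self,
        llm: Any,
        name: str,
        temperature: float = 0.2,       # assumed default, for illustration
        max_tokens: Optional[int] = None,
        grammar: Optional[Any] = None,
    ):
        self.llm = llm                  # the underlying llama.cpp model
        self.name = name
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.grammar = grammar


# Usage with a placeholder in place of a real model instance:
sketch = LlamaCppLLMSketch(llm=object(), name="my-llama", max_tokens=64)
```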
__call__
Arguments:

- messages (List[Message]): The messages to pass to the LlamaCppLLM.

Returns:

- str: The response from the LlamaCppLLM.
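A sketch of what such a call typically does: forward the messages in OpenAI-style chat format to the underlying model and return the text of the first choice. The `Message` dataclass, the `call_llm` helper, and `StubLlama` are hypothetical stand-ins so the example runs without loading weights; `create_chat_completion` mirrors the method of that name in llama-cpp-python.

```python
from dataclasses import dataclass
from typing import Any, List


@dataclass
class Message:
    """Minimal stand-in for the library's Message type."""
    role: str
    content: str


def call_llm(llm: Any, messages: List[Message],
             temperature: float, max_tokens: int) -> str:
    # Convert messages to the chat-completion dict format and
    # return the assistant text from the first choice.
    completion = llm.create_chat_completion(
        messages=[{"role": m.role, "content": m.content} for m in messages],
        temperature=temperature,
        max_tokens=max_tokens,
    )
    return completion["choices"][0]["message"]["content"]


class StubLlama:
    """Stub model so the sketch runs without a real llama.cpp binary."""

    def create_chat_completion(self, messages, temperature, max_tokens):
        return {"choices": [{"message": {
            "role": "assistant",
            "content": f"echo: {messages[-1]['content']}",
        }}]}


print(call_llm(StubLlama(), [Message("user", "hi")], 0.2, 32))
# -> echo: hi
```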
extract_function_inputs
Arguments:

- query (str): The query to extract the function inputs from.
- function_schemas (List[Dict[str, Any]]): The function schemas to extract the function inputs against.

Returns:

- List[Dict[str, Any]]: The extracted function inputs.
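One common way to implement this step is to prompt the model to emit JSON arguments matching the schemas and then parse its output. The helper below is an illustrative sketch, not the library's actual prompt or parsing logic; the `generate` callable and the output shape (`function_name` / `arguments` keys) are assumptions.

```python
import json
from typing import Any, Callable, Dict, List


def extract_function_inputs(
    generate: Callable[[str], str],       # hypothetical text-generation callable
    query: str,
    function_schemas: List[Dict[str, Any]],
) -> List[Dict[str, Any]]:
    # Ask the model for a JSON list of {function_name, arguments}
    # objects, then parse the raw text it returns.
    prompt = (
        "Extract arguments for these functions as a JSON list:\n"
        f"{json.dumps(function_schemas)}\n"
        f"Query: {query}"
    )
    return json.loads(generate(prompt))


def stub_generate(prompt: str) -> str:
    # Stub standing in for a real model call.
    return '[{"function_name": "get_time", "arguments": {"timezone": "UTC"}}]'


inputs = extract_function_inputs(
    stub_generate, "what time is it in UTC?", [{"name": "get_time"}]
)
print(inputs[0]["arguments"]["timezone"])
# -> UTC
```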
