LlamaCppLLM Objects

class LlamaCppLLM(BaseLLM)

LLM wrapper for llama.cpp. Enables fully local LLM inference, which is helpful for running dynamic routes without a remote API.

__init__

def __init__(llm: Any,
             name: str = "llama.cpp",
             temperature: float = 0.2,
             max_tokens: Optional[int] = 200,
             grammar: Optional[Any] = None)

Initialize the LlamaCppLLM.

Arguments:

  • llm (Any): The llama.cpp model to use, typically a llama_cpp.Llama instance.
  • name (str): The name of the LLM.
  • temperature (float): The sampling temperature of the LLM.
  • max_tokens (Optional[int]): The maximum number of tokens to generate.
  • grammar (Optional[Any]): The grammar used to constrain the model's output, if any.
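
A minimal instantiation sketch, assuming llama-cpp-python is installed and a local GGUF model file is available (the model path below is hypothetical):

```python
from llama_cpp import Llama

from semantic_router.llms import LlamaCppLLM

# Load a local GGUF model with llama-cpp-python. Substitute your own
# downloaded weights for the hypothetical path below.
_model = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",
    n_ctx=2048,       # context window size
    n_gpu_layers=-1,  # offload all layers to the GPU where available
)

llm = LlamaCppLLM(llm=_model, temperature=0.2, max_tokens=200)
```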

__call__

def __call__(messages: List[Message]) -> str

Call the LlamaCppLLM.

Arguments:

  • messages (List[Message]): The messages to pass to the LlamaCppLLM.

Returns:

str: The response from the LlamaCppLLM.
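
A short usage sketch, assuming `Message` is the chat message model from `semantic_router.schema` and `llm` is the instance created above:

```python
from semantic_router.schema import Message

messages = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="Name the capital of France."),
]

# Runs a chat completion on the local llama.cpp model and returns the
# assistant's reply as a plain string.
reply = llm(messages)
print(reply)
```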

extract_function_inputs

def extract_function_inputs(
        query: str,
        function_schemas: List[Dict[str, Any]]) -> List[Dict[str, Any]]

Extract the function inputs from the query.

Arguments:

  • query (str): The query to extract the function inputs from.
  • function_schemas (List[Dict[str, Any]]): The schemas describing the functions whose inputs should be extracted.

Returns:

List[Dict[str, Any]]: The extracted function inputs.
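
A hedged sketch of extracting function inputs with a local model. The schema below is illustrative only; in practice semantic-router generates schemas from plain Python functions, and the exact schema layout may differ between versions:

```python
# An illustrative function schema. Real schemas are typically generated
# from a Python function's signature and docstring.
function_schemas = [
    {
        "name": "get_time",
        "description": "Get the current time for a timezone.",
        "signature": "(timezone: str) -> str",
        "output": "<class 'str'>",
    }
]

# Asks the local model to map the free-form query onto the schema's
# parameters. The result is a list of dictionaries of extracted inputs,
# e.g. arguments such as {"timezone": "Europe/Paris"}.
inputs = llm.extract_function_inputs(
    query="What time is it in Paris right now?",
    function_schemas=function_schemas,
)
```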