llms
semantic_router.llms.llamacpp
LlamaCppLLM Objects
LLM for LlamaCPP. Enables fully local LLM use, which is useful for running dynamic routes without calling a remote API.
__init__
Initialize the LlamaCppLLM.
Arguments:

- `llm` (Any): The LLM to use.
- `name` (str): The name of the LLM.
- `temperature` (float): The temperature of the LLM.
- `max_tokens` (Optional[int]): The maximum number of tokens to generate.
- `grammar` (Optional[Any]): The grammar to use.
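To make the signature concrete, here is a minimal stand-in sketch of the constructor (a simplification for illustration, not the library's actual implementation; the default values below are assumptions, and in real use `llm` would presumably be a `llama_cpp.Llama` instance):

```python
from typing import Any, Optional

class LlamaCppLLMSketch:
    """Stand-in mirroring the documented constructor parameters."""

    def __init__(
        self,
        llm: Any,                         # e.g. a llama_cpp.Llama instance
        name: str = "llamacpp",           # default name is an assumption
        temperature: float = 0.2,         # default value is an assumption
        max_tokens: Optional[int] = 200,  # default value is an assumption
        grammar: Optional[Any] = None,    # optional llama.cpp grammar object
    ):
        self.llm = llm
        self.name = name
        self.temperature = temperature
        self.max_tokens = max_tokens
        self.grammar = grammar

# Any object can stand in for the model here; in real use this would be
# something like llama_cpp.Llama(model_path="./model.gguf").
sketch = LlamaCppLLMSketch(llm=object(), name="local-llm", temperature=0.2)
```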
__call__
Call the LlamaCppLLM.
Arguments:

- `messages` (List[Message]): The messages to pass to the LlamaCppLLM.
Returns:

- `str`: The response from the LlamaCppLLM.
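As a sketch of the call pattern: build a list of `Message` objects and receive a single string back. The `Message` stand-in below assumes `role`/`content` fields (the real class lives elsewhere in semantic-router), and the conversion helper is hypothetical, shown only to illustrate the chat-dict format llama.cpp-style APIs typically consume:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Message:
    """Stand-in for semantic-router's Message (field names are assumed)."""
    role: str
    content: str

def to_chat_dicts(messages: List[Message]) -> List[Dict[str, str]]:
    """Hypothetical helper: convert Message objects to plain chat dicts."""
    return [{"role": m.role, "content": m.content} for m in messages]

messages = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="What time is it in Rome?"),
]

chat = to_chat_dicts(messages)
# In real use the wrapper is simply called: response: str = llamacpp_llm(messages)
```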
extract_function_inputs
Extract the function inputs from the query.
Arguments:

- `query` (str): The query to extract the function inputs from.
- `function_schemas` (List[Dict[str, Any]]): The schemas of the functions whose inputs should be extracted.
Returns:

- `List[Dict[str, Any]]`: The extracted function inputs.
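To make the data shapes concrete, here is a hypothetical OpenAI-style function schema together with the kind of value the method returns (the schema and the extracted arguments below are invented for illustration, not the output of an actual run):

```python
from typing import Any, Dict, List

# Hypothetical function schema (invented for illustration).
function_schemas: List[Dict[str, Any]] = [
    {
        "name": "get_time",
        "description": "Get the current time for a timezone.",
        "parameters": {
            "type": "object",
            "properties": {"timezone": {"type": "string"}},
            "required": ["timezone"],
        },
    }
]

query = "What time is it in Rome?"

# extract_function_inputs(query, function_schemas) returns one dict of
# arguments per function; an illustrative result might look like:
example_inputs: List[Dict[str, Any]] = [{"timezone": "Europe/Rome"}]
```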