semantic_router.llms.ollama
OllamaLLM Objects
LLM wrapper for Ollama. Enables fully local LLM use, which is helpful when implementing dynamic routes locally.
__init__
Initialize the OllamaLLM.
Arguments:

- name (str): The name of the Ollama model to use.
- temperature (float): The sampling temperature to use for generation.
- max_tokens (Optional[int]): The maximum number of tokens to generate.
- stream (bool): Whether to stream the response.
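For illustration, a minimal initialization sketch follows. It assumes a local Ollama server is running and that the model named below ("llama3") has already been pulled with `ollama pull`; the model name and argument values are illustrative, not the library's defaults.

```python
from semantic_router.llms.ollama import OllamaLLM

# Construct the LLM wrapper. The arguments besides `name` tune generation:
# a lower temperature gives more deterministic output, and `max_tokens`
# caps the length of each completion.
llm = OllamaLLM(
    name="llama3",    # assumed model name; any locally pulled model works
    temperature=0.2,
    max_tokens=200,
    stream=False,     # return the full response at once
)
```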
__call__
Call the OllamaLLM.
Arguments:

- messages (List[Message]): The messages to pass to the OllamaLLM.
- temperature (Optional[float]): The sampling temperature; when provided, overrides the value set at initialization.
- name (Optional[str]): The name of the Ollama model to use; when provided, overrides the value set at initialization.
- max_tokens (Optional[int]): The maximum number of tokens to generate.
- stream (Optional[bool]): Whether to stream the response.
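A minimal calling sketch follows. It assumes Message lives in `semantic_router.schema` with `role` and `content` fields, and reuses an assumed model name as in the sketch above; the per-call overrides shown are optional, and omitted arguments fall back to the values set at initialization.

```python
from semantic_router.llms.ollama import OllamaLLM
from semantic_router.schema import Message

llm = OllamaLLM(name="llama3")  # assumed model name

messages = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="Briefly explain semantic routing."),
]

# Per-call overrides (temperature, name, max_tokens, stream) are optional;
# here we only tighten the temperature for a more deterministic answer.
output = llm(messages, temperature=0.0)
print(output)
```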