Quickstart
To get started with semantic-router, install it with `pip install -qU semantic-router`. If you want a fully local version of semantic router, you can use the `HuggingFaceEncoder` and `LlamaCppLLM` by installing the local extras: `pip install -qU "semantic-router[local]"`. To use the `HybridRouteLayer`, you must install the hybrid extras: `pip install -qU "semantic-router[hybrid]"`.
We begin by defining a set of `Route` objects. These are the decision paths that the semantic router can decide to use. Let's try two simple routes for now: one for talk on politics and another for chitchat:
We have our routes ready; now we initialize an embedding / encoder model. We currently support a `CohereEncoder` and an `OpenAIEncoder`; more encoders will be added soon. To initialize them we do:
With our `routes` and `encoder` defined, we now create a `RouteLayer`. The route layer handles our semantic decision making.
We can now use our route layer to make super fast decisions based on user queries. Let’s try with two queries that should trigger our route decisions:
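For example, calling the `rl` layer built above with a politics-flavoured query (the query text is our own example):

```python
# a politics query should match the politics route
out = rl("don't you love politics?")
print(out.name)  # should route to "politics"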
Correct decision, let’s try another:
We get both decisions correct! Now let's try sending an unrelated query:
In this case, no decision could be made as we had no matches, so our route layer returned `None`!