Control Flow

Semantic Conditions

AI-powered routing with embeddings, LLM classifiers, and condition combinators.

Noetic provides a set of condition helpers that let you route pipeline inputs using embeddings, LLM classifiers, and boolean combinators. These helpers produce async route functions compatible with branch.

Condition Type

Every condition is an async boolean predicate:

type Condition<I> = (input: I, ctx: Context) => Promise<boolean>;

Conditions are the building blocks for semanticRoute, anyCondition, and allCondition.

Route Builders

semanticSwitch

Routes input to the most semantically similar case using cosine similarity on embeddings.

Simple form -- string keys map directly to steps:

import { branch, semanticSwitch, step } from '@noetic/core';
import type { EmbedFn } from '@noetic/core';

declare const embed: EmbedFn; // your embedding function

const router = branch({
  id: 'semantic-router',
  route: semanticSwitch({
    embed,
    cases: {
      'greeting or salutation': step.llm({ id: 'greet', model: 'gpt-4o-mini' }),
      'technical question': step.llm({ id: 'tech', model: 'gpt-4o' }),
      'complaint or feedback': step.llm({ id: 'feedback', model: 'gpt-4o' }),
    },
    threshold: 0.7,
  }),
});

Advanced form -- multiple labels per case:

const router = branch({
  id: 'semantic-router',
  route: semanticSwitch({
    embed,
    cases: [
      {
        labels: ['greeting', 'hello', 'hi there'],
        step: step.llm({ id: 'greet', model: 'gpt-4o-mini' }),
      },
      {
        labels: 'technical question',
        step: step.llm({ id: 'tech', model: 'gpt-4o' }),
      },
    ],
    default: step.llm({ id: 'fallback', model: 'gpt-4o-mini' }),
    threshold: 0.7,
  }),
});

| Property  | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| embed     | EmbedFn | Yes | Batch embedding function |
| cases     | Record<string, Step> or { labels: string \| string[]; step: Step }[] | Yes | Label-to-step mapping |
| default   | Step | No | Fallback step when no case exceeds the threshold |
| threshold | number | No | Minimum cosine similarity (default 0.7) |
| cache     | StorageAdapter | No | Persists label embeddings across invocations |

When no case exceeds the threshold and no default is provided, the route returns null (branch skips).
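Internally this amounts to a cosine-similarity argmax over the case labels. Here is a minimal sketch of the selection logic, assuming embeddings are plain number arrays; it is illustrative only, not the library's actual implementation:

```typescript
// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the best-matching label, or null when nothing clears the
// threshold (in which case branch skips).
function pickCase(
  inputVec: number[],
  labelVecs: Map<string, number[]>,
  threshold: number,
): string | null {
  let best: string | null = null;
  let bestScore = -Infinity;
  for (const [label, vec] of labelVecs) {
    const score = cosineSimilarity(inputVec, vec);
    if (score > bestScore) {
      best = label;
      bestScore = score;
    }
  }
  return bestScore >= threshold ? best : null;
}
```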

semanticRoute

Evaluates condition clauses in order and returns the first match. Built from when and otherwise clauses.

import { branch, semanticRoute, when, otherwise, embeddingMatch, aiCondition } from '@noetic/core';

const router = branch({
  id: 'smart-router',
  route: semanticRoute(
    when(embeddingMatch(embed, 'greeting', 0.8), greetingStep),
    when(aiCondition({ callModel, model: 'gpt-4o-mini', prompt: 'Is this urgent?' }), urgentStep),
    otherwise(defaultStep),
  ),
});

Returns null if no when matches and no otherwise is present.
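Conceptually, semanticRoute folds its clauses into one route function that walks them in order. A rough sketch, with steps reduced to string ids for illustration (not the library's internals):

```typescript
// Illustrative stand-ins: steps reduced to string ids, ctx untyped.
type Condition<I> = (input: I, ctx: unknown) => Promise<boolean>;
type Clause<I> =
  | { kind: 'when'; condition: Condition<I>; step: string }
  | { kind: 'otherwise'; step: string };

// Walk the clauses in order; return the first matching step, or null
// when no `when` matches and no `otherwise` clause is present.
async function evaluateClauses<I>(
  clauses: Clause<I>[],
  input: I,
  ctx: unknown,
): Promise<string | null> {
  for (const clause of clauses) {
    if (clause.kind === 'otherwise') return clause.step;
    if (await clause.condition(input, ctx)) return clause.step;
  }
  return null;
}
```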

Clause Builders

when

Creates a conditional clause. The condition is evaluated; if true, the associated step is selected.

import { when } from '@noetic/core';

when(condition, step);

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| condition | Condition<I> | Async predicate |
| step      | Step<I, O> | Step to execute when the condition is true |

otherwise

Creates a fallback clause. Used as the last argument to semanticRoute.

import { otherwise } from '@noetic/core';

otherwise(fallbackStep);

Condition Helpers

embeddingMatch

Returns true when the input is semantically similar to a label (or set of labels) above a threshold.

Simple form:

import { embeddingMatch } from '@noetic/core';

const isGreeting = embeddingMatch(embed, 'greeting or salutation', 0.8);

Advanced form -- multiple labels with match mode:

const isTechSupport = embeddingMatch({
  embed,
  labels: ['technical question', 'bug report', 'how do I'],
  threshold: 0.75,
  match: 'any', // 'any' (default) or 'all'
  cache: myStorageAdapter,
});

| Property  | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| embed     | EmbedFn | Yes | Batch embedding function |
| labels    | string[] | Yes | Labels to compare against |
| threshold | number | Yes | Minimum cosine similarity |
| match     | 'any' \| 'all' | No | Match mode (default 'any') |
| cache     | StorageAdapter | No | Persists label embeddings |
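The match mode reduces to a some/every check over per-label similarities. A sketch under the same illustrative assumption that embeddings are plain number arrays (not the library's actual code):

```typescript
function cosine(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// 'any': at least one label clears the threshold; 'all': every label must.
function labelsMatch(
  inputVec: number[],
  labelVecs: number[][],
  threshold: number,
  mode: 'any' | 'all' = 'any',
): boolean {
  const hits = labelVecs.map((vec) => cosine(inputVec, vec) >= threshold);
  return mode === 'any' ? hits.some(Boolean) : hits.every(Boolean);
}
```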

aiCondition

Uses an LLM as a boolean classifier. The model receives the input and answers a yes/no question with structured JSON output.

import { aiCondition } from '@noetic/core';

const isUrgent = aiCondition({
  callModel,
  model: 'gpt-4o-mini',
  prompt: 'Is this message urgent and requiring immediate attention?',
});

| Property  | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| callModel | CallModelFn | Yes | Model invocation function |
| model     | string | Yes | Model identifier |
| prompt    | string | Yes | Yes/no question to classify against |

The model is called with temperature: 0 and a JSON response schema ({ answer: boolean }).
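To make the contract concrete, here is a sketch of such a classifier. The CallModelFn shape used here is an assumption for illustration; see the Adapters docs for the real type:

```typescript
// Assumed shape for illustration only.
type CallModelFn = (req: {
  model: string;
  temperature: number;
  prompt: string;
}) => Promise<string>;

// Wrap a yes/no question into an async boolean predicate.
function makeClassifier(callModel: CallModelFn, model: string, question: string) {
  return async (input: string): Promise<boolean> => {
    const raw = await callModel({
      model,
      temperature: 0, // deterministic classification
      prompt: `${question}\n\nInput: ${input}\nAnswer as JSON: { "answer": boolean }`,
    });
    const parsed = JSON.parse(raw) as { answer: boolean };
    return parsed.answer === true;
  };
}
```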

Combinators

anyCondition

Logical OR -- returns true if any condition is true. Short-circuits on the first true.

import { anyCondition, embeddingMatch, aiCondition } from '@noetic/core';

const isGreetingOrUrgent = anyCondition(
  embeddingMatch(embed, 'greeting', 0.8),
  aiCondition({ callModel, model: 'gpt-4o-mini', prompt: 'Is this urgent?' }),
);

allCondition

Logical AND -- returns true if all conditions are true. Short-circuits on the first false.

import { allCondition, embeddingMatch, aiCondition } from '@noetic/core';

const isUrgentGreeting = allCondition(
  embeddingMatch(embed, 'greeting', 0.8),
  aiCondition({ callModel, model: 'gpt-4o-mini', prompt: 'Is this urgent?' }),
);
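Both combinators boil down to short-circuiting loops over async predicates, which matters when the conditions are paid LLM calls. A minimal sketch (illustrative names, not the library's internals):

```typescript
type Cond<I> = (input: I, ctx: unknown) => Promise<boolean>;

// Logical OR: stop at the first condition that resolves true.
function anyOf<I>(...conds: Cond<I>[]): Cond<I> {
  return async (input, ctx) => {
    for (const cond of conds) {
      if (await cond(input, ctx)) return true;
    }
    return false;
  };
}

// Logical AND: stop at the first condition that resolves false.
function allOf<I>(...conds: Cond<I>[]): Cond<I> {
  return async (input, ctx) => {
    for (const cond of conds) {
      if (!(await cond(input, ctx))) return false;
    }
    return true;
  };
}
```

Because evaluation is sequential, listing cheap embedding checks before LLM-backed aiCondition calls lets the short-circuit skip model invocations entirely when the cheap check already decides the result.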

Custom Conditions

Any function matching Condition<I> works with when, anyCondition, and allCondition:

import type { Condition } from '@noetic/core';

const containsKeyword: Condition<string> = async (input) => {
  return input.toLowerCase().includes('help');
};

const router = branch({
  id: 'custom-router',
  route: semanticRoute(
    when(containsKeyword, helpStep),
    otherwise(defaultStep),
  ),
});

Input Serialization

When a condition receives non-string input, it is serialized to JSON before embedding or classification. String inputs are passed as-is. This means your embeddings and LLM classifiers see a consistent text representation regardless of the input type.
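That normalization can be sketched in one line:

```typescript
// Strings pass through untouched; everything else becomes JSON text.
function toText(input: unknown): string {
  return typeof input === 'string' ? input : JSON.stringify(input);
}
```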

Caching

Embedding-based conditions (semanticSwitch, embeddingMatch) accept an optional cache: StorageAdapter to persist label embeddings. This is useful in ephemeral environments like Cloudflare Workers where label vectors would otherwise be recomputed on every invocation.

const route = semanticSwitch({
  embed,
  cases: { /* ... */ },
  cache: myStorageAdapter,
});

Label vectors are cached under deterministic keys (embed:<encoded-label>). In-memory caching is always active -- the StorageAdapter adds persistence across process restarts.
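A get-or-compute wrapper over such a cache might look like this. The StorageAdapter shape and the encodeURIComponent encoding are assumptions for illustration; the docs only specify the embed:<encoded-label> key shape:

```typescript
// Assumed adapter shape, for illustration only.
interface StorageAdapter {
  get(key: string): Promise<number[] | null>;
  set(key: string, value: number[]): Promise<void>;
}

function cacheKey(label: string): string {
  // encodeURIComponent is an assumed encoding for <encoded-label>.
  return `embed:${encodeURIComponent(label)}`;
}

// Return the cached vector when present; otherwise embed and persist.
async function embedWithCache(
  label: string,
  embed: (texts: string[]) => Promise<number[][]>,
  cache: StorageAdapter,
): Promise<number[]> {
  const key = cacheKey(label);
  const hit = await cache.get(key);
  if (hit) return hit;
  const [vec] = await embed([label]);
  await cache.set(key, vec);
  return vec;
}
```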

See also

  • branch -- the control flow primitive that uses route functions.
  • fork -- parallel execution.
  • Adapters -- EmbedFn and CallModelFn types.
