Endpoint Registration

Configure your LLM endpoint in the Now Assist admin console: URL, authentication, and model capabilities. Common choices include Azure OpenAI (enterprise contract, regional deployment), a self-hosted Llama or similar model on your own infrastructure, or specialty providers chosen per use case.
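The shape of an endpoint registration can be sketched as a small config record. This is purely illustrative: the field names, URL, and model identifier below are assumptions, not the actual Now Assist admin schema.

```python
# Hypothetical endpoint registration record; field names are illustrative,
# not the real Now Assist schema.
endpoint_config = {
    "name": "azure-openai-prod",
    "base_url": "https://myorg.openai.azure.com/",  # hypothetical URL
    "auth": {"type": "api_key", "credential_ref": "cred_store/azure_llm"},
    "model": "gpt-4o",
    "capabilities": ["summarization", "generation"],
    "region": "westeurope",
    "timeout_seconds": 30,
}

def validate(config: dict) -> list[str]:
    """Return the required fields missing from a registration record."""
    required = {"name", "base_url", "auth", "model"}
    return sorted(required - config.keys())

print(validate(endpoint_config))  # -> []
```

Validating the record before saving it catches the common failure mode of a registration that looks complete but silently lacks auth details.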

Auth and Secrets

Credentials belong in ServiceNow's credential store. Rotate them on a schedule, never hardcode them, and audit access: BYO LLM credentials unlock expensive inference, and a compromised credential can create runaway bills.
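The "never hardcode" rule can be made concrete with a fetch-at-call-time pattern. This sketch uses an environment variable as a stand-in for the credential store; the variable name is an assumption for illustration.

```python
import os

def get_llm_api_key() -> str:
    """Fetch the key at call time from the environment (a stand-in here
    for the credential store); never bake it into code or config files.
    LLM_API_KEY is a hypothetical variable name."""
    key = os.environ.get("LLM_API_KEY")
    if not key:
        # Failing loudly beats silently falling back to a stale or shared key.
        raise RuntimeError("LLM_API_KEY not provisioned; check credential store")
    return key
```

Because the key is resolved per call rather than cached at import, scheduled rotation takes effect without a redeploy.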

Fallback Strategy

A BYO endpoint is your availability problem. Configure fallback to ServiceNow's default model for when yours is unavailable or too slow. Fallback degrades the feature (fine-tuning specifics may be lost) but maintains continuity.
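The fallback policy above can be sketched as a wrapper around two callables. The function names, latency budget, and stub endpoints are assumptions for illustration, not a ServiceNow API.

```python
import time

def call_with_fallback(prompt, primary, fallback, timeout_s=5.0):
    """Try the BYO endpoint; on error or a blown latency budget, use the
    platform default. All names here are hypothetical."""
    start = time.monotonic()
    try:
        result = primary(prompt)
        if time.monotonic() - start > timeout_s:
            # Too slow counts as a failure: discard and degrade gracefully.
            raise TimeoutError("primary exceeded latency budget")
        return result, "primary"
    except Exception:
        # Degraded but available: the default model lacks fine-tuning specifics.
        return fallback(prompt), "fallback"

def primary_llm(prompt):   # stand-in for an unreachable BYO endpoint
    raise ConnectionError("endpoint unavailable")

def default_llm(prompt):   # stand-in for the platform default model
    return f"summary of: {prompt}"

text, source = call_with_fallback("incident description", primary_llm, default_llm)
print(source)  # -> fallback
```

The key design choice is that the caller learns which path served the request, so degraded responses can be flagged and monitored rather than silently blended in.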

Monitoring

Track latency, error rate, and cost per call. BYO LLM costs often surprise: calls that looked cheap in sampled testing end up 10× more expensive at production volume. Set cost alerts before deploying broadly.
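The volume surprise is simple arithmetic, worth working through once. The per-call cost, call volume, and budget figures below are invented for illustration.

```python
def projected_monthly_cost(cost_per_call, calls_per_day, days=30):
    """Extrapolate a sampled per-call cost to production volume."""
    return cost_per_call * calls_per_day * days

def should_alert(month_to_date_spend, monthly_budget, threshold=0.8):
    """Fire when spend crosses a fraction of budget, before the bill lands."""
    return month_to_date_spend >= threshold * monthly_budget

# A pilot of 200 sampled calls at $0.01 each costs $2 and looks negligible;
# the same rate at 50,000 calls/day is a different order of magnitude.
print(projected_monthly_cost(0.01, 50_000))  # -> 15000.0
```

Running the projection before rollout, and wiring `should_alert` to a budget check, turns "costs often surprise" into a number you saw coming.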
