What BYO LLM Enables

Since the Washington release, customers can use pre-approved LLMs (Azure OpenAI, hosted models, others) with ServiceNow’s GenAI layer instead of ServiceNow’s default model. Compliance teams who need specific model versions now have a path.

When It Fits

BYO LLM fits three scenarios: regulated industries that require specific model residency or versioning; organizations with strong existing LLM investments, such as Azure OpenAI contracts or bespoke fine-tunes; and cost-sensitive deployments where self-hosted inference is cheaper at scale.

Governance

BYO LLM does not automatically inherit all Now Assist protections. ServiceNow's default model covers PII masking, logging, and audit trails out of the box; with a custom endpoint, you must configure each of these explicitly. The responsibility shifts to you.
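As a rough illustration of what "configure explicitly" means, here is a minimal Python sketch of masking PII and writing an audit record before a prompt leaves your boundary. The patterns, logger name, and `send_to_custom_llm` stand-in are all assumptions for illustration, not ServiceNow APIs; a real deployment would use a vetted masking service and your own observability stack.

```python
import logging
import re

# Hypothetical PII patterns for illustration only; production masking
# needs a vetted, maintained rule set or service.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

audit_log = logging.getLogger("byo_llm_audit")  # assumed logger name

def mask_pii(prompt: str) -> str:
    """Replace PII matches with typed placeholders before the prompt leaves you."""
    for label, pattern in PII_PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()}_MASKED]", prompt)
    return prompt

def send_to_custom_llm(prompt: str) -> str:
    masked = mask_pii(prompt)
    # Audit trail: record exactly what was sent to the external endpoint.
    audit_log.info("outbound prompt: %s", masked)
    return masked  # stand-in for the real endpoint call
```

The point is the ordering: masking happens before the audit write, so the log itself never stores raw PII.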

Setup Path

Register the LLM endpoint, then configure authentication. Test with non-sensitive queries first. Monitor latency and error rates against the default model, and plan a fallback to the default for availability.
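The fallback-plus-monitoring step can be sketched as follows. This is a generic pattern, not ServiceNow code: `call_custom_llm`, `call_default_llm`, and `LLMUnavailable` are hypothetical placeholders for the registered BYO endpoint, the default model, and your endpoint's failure mode.

```python
import time

class LLMUnavailable(Exception):
    """Hypothetical error raised when the custom endpoint fails."""

def call_custom_llm(prompt: str) -> str:
    # Placeholder for the registered BYO endpoint; here it simulates an outage.
    raise LLMUnavailable("endpoint down")

def call_default_llm(prompt: str) -> str:
    # Placeholder for the platform's default model.
    return f"default: {prompt}"

def generate(prompt: str):
    """Try the custom endpoint; on failure, fall back and record latency."""
    start = time.monotonic()
    try:
        reply = call_custom_llm(prompt)
        source = "custom"
    except LLMUnavailable:
        reply = call_default_llm(prompt)
        source = "default"
    latency = time.monotonic() - start
    # In production, emit source and latency to monitoring rather than returning them.
    return reply, source, latency
```

Tracking which source answered each request gives you the error-rate comparison the setup step calls for, and the fallback keeps users served during custom-endpoint outages.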
