LLM / AI Backend
By default OpsWorker uses its own managed AI backend. If your organization requires data sovereignty or has specific model preferences, you can bring your own LLM.
🟠 AWS Bedrock
Use Amazon Bedrock — Claude, Titan, or other hosted foundation models — as the AI backend for OpsWorker.
Read more →

🔵 Azure OpenAI
Connect OpsWorker to your Azure OpenAI deployment for GPT-based investigation and chat capabilities.
Read more →

🧠 Bring Your Own LLM
Configure a custom OpenAI-compatible LLM endpoint to use any self-hosted or third-party language model.
Read more →
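All three options above expose the same integration surface: an OpenAI-compatible HTTP API. As a rough illustration of what "OpenAI-compatible" means in practice, the sketch below builds a request against the standard `/chat/completions` route using only the Python standard library. The base URL, environment variable names, and model name here are placeholders, not OpsWorker configuration keys; consult the linked pages for the actual settings.

```python
import json
import os
import urllib.request

# Placeholder endpoint and key; substitute your deployment's values.
BASE_URL = os.environ.get("LLM_BASE_URL", "http://localhost:8000/v1")
API_KEY = os.environ.get("LLM_API_KEY", "not-needed-for-local")

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the OpenAI-compatible /chat/completions route."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("my-local-model", "Summarize the last incident.")
print(req.full_url)
```

Any backend that accepts this request shape, whether a Bedrock or Azure proxy layer or a self-hosted server such as vLLM or Ollama, can in principle sit behind an OpenAI-compatible configuration.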