Azure OpenAI

Overview

Azure OpenAI is available as an alternative LLM provider for OpsWorker, primarily for organizations with Azure-only policies or specific compliance requirements.

Availability

Deployment Type   Available
SaaS              No; SaaS uses AWS Bedrock by default
Dedicated AWS     On request; contact the OpsWorker team
Private Cloud     Yes; configure your own Azure OpenAI endpoint

Configuration

For dedicated and private cloud deployments, the Azure OpenAI integration is configured as follows:

  1. Azure OpenAI resource: Create an Azure OpenAI resource in your Azure subscription
  2. Deploy models: Deploy GPT-4 or GPT-4 Turbo models
  3. Provide credentials: Share the endpoint URL, API key, model deployment name, and API version with the OpsWorker team during deployment setup

Required Information

Field                  Description
Endpoint               Your Azure OpenAI endpoint URL (https://<resource>.openai.azure.com)
API Key                Authentication key for the Azure OpenAI resource
Model deployment name  Name of your deployed model
API version            Azure OpenAI API version string
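
The fields above combine into the request URL and headers that the Azure OpenAI API expects. A minimal sketch in Python; the resource name, deployment name, and API version below are placeholder values, not values from this document:

```python
# Placeholder values: substitute your own resource details.
endpoint = "https://my-resource.openai.azure.com"  # Endpoint
api_key = "YOUR_API_KEY"                           # API Key
deployment = "gpt-4-turbo"                         # Model deployment name
api_version = "2024-02-01"                         # API version

# Azure OpenAI routes requests by deployment name, and the API
# version is passed as a query parameter rather than in the path.
url = (
    f"{endpoint}/openai/deployments/{deployment}"
    f"/chat/completions?api-version={api_version}"
)

# Azure OpenAI authenticates with an "api-key" header,
# not the "Authorization: Bearer" header the openai.com API uses.
headers = {"api-key": api_key, "Content-Type": "application/json"}

print(url)
```

In a private cloud deployment you hand these values to the OpsWorker team rather than calling the API yourself, but the same four fields fully determine the request.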

Supported Models

  • GPT-4
  • GPT-4 Turbo

Use Cases

  • Organizations with Azure-only cloud policies
  • Environments where data must stay within Azure
  • Compliance requirements that mandate specific cloud providers

Getting Started

Contact the OpsWorker team to configure Azure OpenAI for your deployment. This is a deployment-time configuration, not a self-service setup.

Next Steps