# Azure OpenAI

## Overview
Azure OpenAI is available as an alternative LLM provider for OpsWorker, primarily for organizations with Azure-only policies or specific compliance requirements.
## Availability
| Deployment Type | Availability |
|---|---|
| SaaS | Uses AWS Bedrock by default |
| Dedicated AWS | On request — contact the OpsWorker team |
| Private Cloud | Yes — configure your own Azure OpenAI endpoint |
## Configuration
Azure OpenAI integration is configured for dedicated and private cloud deployments in three steps:

1. Create an Azure OpenAI resource in your Azure subscription.
2. Deploy models: deploy GPT-4 or GPT-4 Turbo.
3. Provide credentials: share the endpoint URL, API key, model deployment name, and API version with the OpsWorker team during deployment setup.
### Required Information
| Field | Description |
|---|---|
| Endpoint | Your Azure OpenAI endpoint URL (e.g. `https://<resource>.openai.azure.com`) |
| API Key | Authentication key for the Azure OpenAI resource |
| Model deployment name | The name you gave your model deployment (distinct from the base model name) |
| API version | Azure OpenAI API version string |
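To show how these four values fit together, here is a minimal sketch of assembling an Azure OpenAI chat-completions request. Azure OpenAI routes requests by deployment name (not model name) and authenticates with an `api-key` header rather than a `Bearer` token. The resource name, deployment name, and API version below are hypothetical placeholders, not values tied to OpsWorker.

```python
from urllib.parse import urlencode

def build_chat_completions_request(endpoint: str, api_key: str,
                                   deployment: str, api_version: str):
    """Assemble the URL and headers for an Azure OpenAI chat-completions call.

    The path embeds the *deployment name* you chose when deploying the model,
    and the API version is passed as a query parameter.
    """
    url = (f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
           f"/chat/completions?{urlencode({'api-version': api_version})}")
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    return url, headers

# Hypothetical values for illustration only.
url, headers = build_chat_completions_request(
    endpoint="https://my-resource.openai.azure.com",
    api_key="<redacted>",
    deployment="gpt-4-opsworker",   # your model deployment name
    api_version="2024-02-01",       # an Azure OpenAI API version string
)
print(url)
# → https://my-resource.openai.azure.com/openai/deployments/gpt-4-opsworker/chat/completions?api-version=2024-02-01
```

In a private cloud deployment, these are exactly the values you hand to the OpsWorker team; you would not normally issue this request yourself.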
## Supported Models
- GPT-4
- GPT-4 Turbo
## Use Cases
- Organizations with Azure-only cloud policies
- Environments where data must stay within Azure
- Compliance requirements that mandate specific cloud providers
## Getting Started
Contact the OpsWorker team to configure Azure OpenAI for your deployment. This is a deployment-time configuration, not a self-service setup.
## Next Steps
- Architecture Overview — Understand the system design
- Private Cloud Deployment — Deploy in your own cloud