Azure AI Foundry Endpoint Construction Guide
Overview
The Azure AI Foundry Project endpoint cannot be retrieved directly from Azure resource properties. It must be constructed from the Hub and Project resource names.
The Problem
Azure ML Workspace resources (which back AI Foundry Hub/Project) expose these properties:
- ✅ `discoveryUrl` - Internal workspace URL (format: `https://<workspace-guid>.workspace.<region>.api.azureml.ms/...`)
- ✅ `workspaceId` - Workspace GUID
- ✅ `mlFlowTrackingUri` - MLflow tracking endpoint
- ❌ No property for the AI Project API endpoint
Required Endpoint Format
For the Azure AI Agent Framework SDK to work, we need an endpoint of the form:
`https://<hub-name>.services.ai.azure.com/api/projects/<project-name>`
Example:
`https://ldfdev2-dev-aihub.services.ai.azure.com/api/projects/ldfdev2-dev-aihub-project`
How We Construct It
In `infrastructure/bicep/modules/ai-services.bicep`:

```bicep
// Hub resource
resource aiFoundryHub 'Microsoft.MachineLearningServices/workspaces@2024-04-01' = {
  name: aiFoundryHubName // e.g., 'ldfdev2-dev-aihub'
  kind: 'Hub'
  // ...
}

// Project resource
resource aiFoundryProject 'Microsoft.MachineLearningServices/workspaces@2024-04-01' = {
  name: '${aiFoundryHubName}-project' // e.g., 'ldfdev2-dev-aihub-project'
  kind: 'Project'
  properties: {
    hubResourceId: aiFoundryHub.id
  }
}

// Construct the endpoint
output azureAiProjectEndpoint string = 'https://${aiFoundryHubName}.services.ai.azure.com/api/projects/${aiFoundryProject.name}'
```
Construction Logic
- Hub name → Used as subdomain: `<hub-name>.services.ai.azure.com`
- Project name → Used in path: `/api/projects/<project-name>`
- Domain → Constant: `.services.ai.azure.com`
- No region → Unlike `.api.azureml.ms`, this endpoint is region-agnostic
Verification Steps
Automatic Verification (Recommended)
The Substrate deployment script automatically runs endpoint verification after a successful deployment.

What happens:
1. ✅ Substrate layer deploys (ACR, AI Foundry Hub/Project)
2. ✅ Deployment outputs are retrieved
3. ✅ Verification script runs automatically
4. ✅ Endpoint construction is validated
5. ✅ DNS resolution is tested
6. ✅ Deployment exits with success (0) or failure (1)

If verification fails:
- ❌ Deployment script exits with an error
- 🛑 Apps deployment is blocked
- 📋 Error details are displayed for debugging
- 🔧 A fix is required before proceeding
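Steps 4-6 of the flow above boil down to a pass/fail check on the constructed endpoint. A minimal sketch of such a check, assuming only hyphenated alphanumeric resource names; the regex and function name are illustrative, not the actual contents of the verification script:

```python
import re

# Expected shape: https://<hub>.services.ai.azure.com/api/projects/<project>
ENDPOINT_PATTERN = re.compile(
    r"^https://(?P<hub>[A-Za-z0-9-]+)\.services\.ai\.azure\.com"
    r"/api/projects/(?P<project>[A-Za-z0-9-]+)$"
)


def endpoint_is_valid(endpoint: str, hub_name: str, project_name: str) -> bool:
    """Return True if the endpoint matches the expected format and names."""
    match = ENDPOINT_PATTERN.match(endpoint)
    return bool(match) and match["hub"] == hub_name and match["project"] == project_name
```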
Manual Verification (Optional)
1. Run the Verification Script
If you need to re-verify endpoints without re-deploying, run the `verify-ai-endpoints.sh` script.

This will:
- ✅ Show all endpoint values from the deployment outputs
- ✅ Verify the construction logic is correct
- ✅ Test DNS resolution
- ✅ Provide debugging guidance
2. Check Deployment Outputs
Check the deployment outputs:
```bash
az deployment group show \
  --resource-group <rg> \
  --name <deployment-name> \
  --query "properties.outputs" -o json
```
Verify these values:
- `aiFoundryHubName` - Should match your Hub resource name
- `aiFoundryProjectName` - Should match your Project resource name
- `azureAiProjectEndpoint` - Should follow the format above
- `aiFoundryProjectDiscoveryUrl` - For comparison only (this is NOT the right endpoint)
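Once the outputs JSON is in hand, the cross-checks above can be automated. A hedged sketch, assuming the standard ARM outputs shape (`{"name": {"value": ...}}`) and the output names listed above; the function name is illustrative:

```python
def check_deployment_outputs(outputs: dict) -> list[str]:
    """Cross-check deployment outputs; return a list of problems (empty means OK)."""
    hub = outputs["aiFoundryHubName"]["value"]
    project = outputs["aiFoundryProjectName"]["value"]
    endpoint = outputs["azureAiProjectEndpoint"]["value"]

    expected = f"https://{hub}.services.ai.azure.com/api/projects/{project}"
    problems = []
    if endpoint != expected:
        problems.append(f"endpoint mismatch: got {endpoint!r}, expected {expected!r}")
    if "api.azureml.ms" in endpoint:
        # The discoveryUrl domain must never appear in the Project endpoint
        problems.append("endpoint looks like a discoveryUrl, not the Project endpoint")
    return problems
```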
3. Test from Application
```bash
export AZURE_AI_PROJECT_ENDPOINT="https://<hub>.services.ai.azure.com/api/projects/<project>"
# Run your agent code
```
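Before running agent code, it can help to fail fast on a missing or malformed variable rather than debugging a confusing SDK error later. A small sketch; the helper name and regex are illustrative assumptions:

```python
import os
import re


def get_project_endpoint() -> str:
    """Read AZURE_AI_PROJECT_ENDPOINT and fail fast if it looks wrong."""
    endpoint = os.environ.get("AZURE_AI_PROJECT_ENDPOINT", "")
    if not re.fullmatch(
        r"https://[A-Za-z0-9-]+\.services\.ai\.azure\.com/api/projects/[A-Za-z0-9-]+",
        endpoint,
    ):
        raise RuntimeError(f"AZURE_AI_PROJECT_ENDPOINT missing or malformed: {endpoint!r}")
    return endpoint
```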
Common Issues and Debugging
Issue 1: DNS Resolution Fails
Symptom: Cannot resolve `<hub-name>.services.ai.azure.com`

Causes:
- Private endpoints enabled but testing from outside the VNet
- Private DNS zone not linked to the VNet
- DNS propagation delay (if just deployed)
Debug:
```bash
# From within VNet (via Bastion/Jump Box):
nslookup <hub-name>.services.ai.azure.com
# Should return a private IP (10.x.x.x) if using private endpoints
```
Issue 2: Authentication Fails
Symptom: 401/403 errors when calling endpoint
Causes:
- Managed identity missing RBAC permissions
- Wrong identity being used
- Project not properly linked to Hub
Debug:
```bash
# Check RBAC assignments on AI Foundry resources
az role assignment list --scope <project-resource-id>

# Verify managed identity has:
# - "Azure AI Administrator" on Hub and Project
# - "Cognitive Services OpenAI User" on AI Services
```
Issue 3: Endpoint Not Found (404)
Symptom: 404 when calling endpoint
Causes:
- Hub or Project name mismatch
- Resources not fully provisioned
- Wrong endpoint format
Debug:
```bash
# Verify actual resource names
az ml workspace list --resource-group <rg> -o table

# Compare with constructed endpoint
echo "Expected Hub: <hub-from-resource-list>"
echo "Endpoint uses: <hub-from-endpoint>"
```
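To compare the names mechanically, the endpoint can be split back into its Hub and Project components. A sketch; the helper name is illustrative:

```python
from urllib.parse import urlparse


def split_project_endpoint(endpoint: str) -> tuple[str, str]:
    """Extract (hub_name, project_name) from a constructed Project endpoint."""
    parsed = urlparse(endpoint)
    if not parsed.hostname or not parsed.hostname.endswith(".services.ai.azure.com"):
        raise ValueError(f"not a services.ai.azure.com endpoint: {endpoint}")
    # Hub name is the first DNS label of the host
    hub = parsed.hostname.split(".")[0]
    # Path must be /api/projects/<project-name>
    parts = [p for p in parsed.path.split("/") if p]
    if len(parts) != 3 or parts[:2] != ["api", "projects"]:
        raise ValueError(f"unexpected path in endpoint: {endpoint}")
    return hub, parts[2]
```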
Validation Checklist
Before deploying Apps layer:
- Substrate deployment succeeded
- `verify-ai-endpoints.sh` script passes
- Hub and Project resources exist in the Azure portal
- DNS resolution works (from within the VNet if using private endpoints)
- Managed identity has RBAC permissions
- Endpoint stored in Key Vault correctly
- AI models deployed successfully
Why Construction is Necessary
Azure doesn't expose the `.services.ai.azure.com` endpoint as a resource property because:
- Historical reasons - AI Foundry is built on Azure ML infrastructure
- Portal vs SDK - the portal uses `discoveryUrl`, while the SDK uses a different endpoint
- Abstraction - the endpoint format is standardized and predictable
- Microsoft's approach - their documentation shows constructing this URL
References
- Azure AI Foundry Documentation: https://learn.microsoft.com/azure/ai-studio/
- Azure ML Workspaces API: https://learn.microsoft.com/rest/api/azureml/
- Private DNS for AI Services: https://learn.microsoft.com/azure/ai-services/cognitive-services-virtual-networks
Troubleshooting Resources
If the constructed endpoint doesn't work:
- Run the verification script - `verify-ai-endpoints.sh`
- Check the Azure portal - verify that resource names in the portal match the construction
- Test from the VNet - if using private endpoints, test from a Bastion/Jump Box
- Review logs - check Application Insights for connection errors
- Open an issue - document the mismatch for team review
Last Updated: 2025-10-25
Status: ⚠️ Construction approach verified against Microsoft patterns but requires post-deployment validation