
Azure Monitor Integration Example

This example demonstrates how to integrate the OpenTelemetry Collector with Azure Monitor and Application Insights.

Prerequisites

  • Azure subscription
  • Azure Monitor workspace or Application Insights resource
  • Connection string from Azure Portal
  • Docker Desktop (for local testing)

Step 1: Create Azure Monitor Resources

Option A: Application Insights

  1. Go to Azure Portal
  2. Create Application Insights resource
  3. Copy the Connection String from Overview page

Option B: Log Analytics Workspace

  1. Go to Azure Portal
  2. Create Log Analytics Workspace
  3. Note the workspace ID and key

Step 2: Configure Collector

Update your OpenTelemetry Collector configuration file (e.g., otel-collector-config.azure-monitor.yaml):

exporters:
  azuremonitor:
    connection_string: ${env:AZURE_MONITOR_CONNECTION_STRING}

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [azuremonitor]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [azuremonitor]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [azuremonitor]
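
The pipelines above also reference an otlp receiver and a batch processor, which must be defined in the same file. A minimal sketch of those sections (the ports are the OTLP defaults; the batch values are illustrative, not tuned):

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317   # default OTLP/gRPC port
      http:
        endpoint: 0.0.0.0:4318   # default OTLP/HTTP port

processors:
  batch:
    timeout: 5s           # flush at least every 5 seconds
    send_batch_size: 512  # or when 512 items have accumulated
```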

Step 3: Set Connection String

Local Development (Docker Compose)

Add to docker-compose.yml:

services:
  otel-collector:
    environment:
      - AZURE_MONITOR_CONNECTION_STRING=InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/

Or use .env file:

AZURE_MONITOR_CONNECTION_STRING=InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/
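
Either way, it is easy to paste a string with a missing segment. A quick shell sanity check of the expected InstrumentationKey=...;IngestionEndpoint=... shape (placeholder value shown; substitute your own) can catch that before the collector starts:

```shell
#!/bin/sh
# Placeholder value; substitute your real connection string.
AZURE_MONITOR_CONNECTION_STRING='InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/'

# Check that both expected segments are present, in order
case "$AZURE_MONITOR_CONNECTION_STRING" in
  InstrumentationKey=*";IngestionEndpoint=https://"*)
    echo "connection string format looks valid" ;;
  *)
    echo "unexpected connection string format" >&2
    exit 1 ;;
esac
```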

Kubernetes

Create secret:

kubectl create secret generic observability-secrets \
  --from-literal=azure-monitor-connection-string='InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/' \
  -n observability

Update collector deployment to use secret:

env:
  - name: AZURE_MONITOR_CONNECTION_STRING
    valueFrom:
      secretKeyRef:
        name: observability-secrets
        key: azure-monitor-connection-string
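
If you prefer declarative manifests over kubectl create secret, an equivalent Secret can be sketched as below. Using stringData lets Kubernetes base64-encode the value for you:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: observability-secrets
  namespace: observability
type: Opaque
stringData:
  # Kubernetes encodes stringData values to base64 on write
  azure-monitor-connection-string: InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/
```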

Step 4: Start Collector

# Local
docker-compose up -d otel-collector

# Kubernetes
kubectl apply -f Deployment/kubernetes/observability/otel-collector-deployment.yaml

Step 5: Verify Data Flow

Check Collector Logs

docker-compose logs otel-collector | grep azuremonitor

Look for successful export messages.

Check Azure Portal

  1. Go to Application Insights resource
  2. Navigate to Logs
  3. Run a query:

traces
| take 10

Or for metrics:

customMetrics
| take 10

Step 6: Create Dashboards

Application Map

  1. Go to Application Insights > Application Map
  2. View service dependencies and health

Metrics Explorer

  1. Go to Metrics
  2. Select metric namespace (e.g., customMetrics)
  3. Add metrics to chart
  4. Save as dashboard

Log Analytics Queries

Example queries:

// Request rate
requests
| where timestamp > ago(1h)
| summarize count() by bin(timestamp, 5m)
| render timechart

// Failed request rate
requests
| where timestamp > ago(1h)
| where success == false
| summarize count() by bin(timestamp, 5m)
| render timechart

// Top slow requests (duration is in milliseconds)
requests
| where timestamp > ago(1h)
| where duration > 1000
| project timestamp, name, duration
| order by duration desc
| take 10

Configuration Examples

Resource Attributes

Add custom resource attributes:

processors:
  resource:
    attributes:
      - key: cloud.provider
        value: azure
        action: upsert
      - key: cloud.region
        value: ${env:AZURE_REGION:-westus2}
        action: upsert
      - key: deployment.environment
        value: ${env:ENVIRONMENT:-production}
        action: upsert

Sampling

Reduce data volume with sampling:

processors:
  probabilistic_sampler:
    sampling_percentage: 10.0
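
Defining a processor is not enough on its own; it must also be added to a pipeline. A sketch of the traces pipeline with the sampler (and the resource processor from above) wired in:

```yaml
service:
  pipelines:
    traces:
      receivers: [otlp]
      # order matters: enrich, then sample, then batch for export
      processors: [resource, probabilistic_sampler, batch]
      exporters: [azuremonitor]
```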

Custom Metrics

Export custom metrics:

// In your application (C#, System.Diagnostics.Metrics, .NET 6+)
using System.Diagnostics.Metrics;

var meter = new Meter("MyApp.Metrics");
var requestCounter = meter.CreateCounter<long>("myapp.requests.total");

// Record one request, tagged with the endpoint that handled it
requestCounter.Add(1, new KeyValuePair<string, object?>("endpoint", "/api/users"));

Troubleshooting

No Data in Azure

Problem: Data not appearing in Application Insights

Solution:

  1. Verify connection string is correct
  2. Check collector logs for errors
  3. Verify network connectivity to Azure
  4. Check Azure resource is active
  5. Verify instrumentation key matches
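
Steps 1 and 3 can be partly scripted: extracting the IngestionEndpoint from the connection string shows which host the collector will send to (placeholder value shown; substitute your own):

```shell
#!/bin/sh
# Placeholder connection string; substitute your real one.
conn='InstrumentationKey=xxx;IngestionEndpoint=https://xxx.in.applicationinsights.azure.com/'

# Split on ';' and keep the IngestionEndpoint value
endpoint=$(printf '%s' "$conn" | tr ';' '\n' | sed -n 's/^IngestionEndpoint=//p')
echo "ingestion endpoint: $endpoint"

# Probe connectivity from the collector host (requires outbound HTTPS):
#   curl -sS -o /dev/null -w '%{http_code}\n' "$endpoint"
```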

High Costs

Problem: Azure Monitor costs are high

Solution:

  • Enable sampling to reduce ingested data volume
  • Set data retention limits on the workspace
  • Move high-volume tables to the Log Analytics Basic Logs tier
  • Filter unnecessary telemetry in the collector before export

Authentication Errors

Problem: Authentication failures

Solution:

  • Verify connection string format
  • Check if resource is deleted/recreated
  • Verify Azure subscription is active
  • Check for IP restrictions

Use Cases

Application Performance Monitoring

  • Track request performance
  • Monitor dependencies
  • Identify bottlenecks

Error Tracking

  • View exceptions and errors
  • Track error rates
  • Debug production issues

Business Analytics

  • Track user actions
  • Monitor business metrics
  • Create custom dashboards

Further Reading