# Set up Azure EventHub to send logs to HLA

## Requirements

HLA v37 or later is required.

Important: This doc assumes you have existing Azure logs and permission to send those logs to an Event Hub that will be created for forwarding log data to ServiceNow HLA.

## Solution Architecture

- An Event Hub serves as a message bus for the logs.
- A Container App runs an OpenTelemetry collector that pulls logs from the Event Hub, translating them to OTLP before forwarding them to the Gateway Collector via gRPC. It runs a public OpenTelemetry image pulled from Docker Hub.
- Configuration is passed via environment variables.

## Setup

The following setup uses the Azure CLI to deploy a template that forwards logs.
First, set these environment variables:

```shell
export RESOURCE_GROUP="collector-resource-group"  # arbitrary name
export LOCATION="eastus"
export EVENTHUB_NAMESPACE="your-namespace-1234"   # arbitrary name, but must be globally unique
export EVENTHUB_NAME="logs"                       # arbitrary name
export LOGS_ENDPOINT=""
export INTEGRATION_ID=""
export ACCESS_TOKEN=""
```

Create the following OpenTelemetry collector config, saved as `config.yaml`:

```yaml
receivers:
  azureeventhub:
    connection: ${EVENTHUB_CONNECTION}
    format: azure

exporters:
  otlp:
    endpoint: ${LOGS_ENDPOINT}
    headers:
      servicenow-access-token: ${ACCESS_TOKEN}
      servicenow-integration-id: ${INTEGRATION_ID}

service:
  pipelines:
    logs:
      receivers: [azureeventhub]
      exporters: [otlp]
```

Create a resource group (or use an existing one):

```shell
az group create \
  --name "${RESOURCE_GROUP}" \
  --location "${LOCATION}"
```

Create a new Bicep deployment file: either download it from the Azure EventHub (MID-less) Integration Launchpad tile or modify the example attached to this KB. Next, create a deployment with the Bicep file as the template:

```shell
az deployment group create \
  --resource-group "${RESOURCE_GROUP}" \
  --template-file path/to/file.bicep \
  --parameters eventHubNamespaceName="${EVENTHUB_NAMESPACE}" \
    eventHubName="${EVENTHUB_NAME}" \
    logsEndpoint="${LOGS_ENDPOINT}" \
    otelConfig="$(cat config.yaml)" \
    servicenowAccessToken="${ACCESS_TOKEN}" \
    servicenowIntegrationID="${INTEGRATION_ID}"
```

This deploys an Azure Container App that forwards logs to HLA. Check the deployment logs and container logs for errors.
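The `${...}` placeholders in `config.yaml` are resolved by the collector itself from environment variables that the Bicep template sets on the container. As a rough illustration of what that substitution does (this sketch is not the collector's actual implementation, and the endpoint value is a made-up placeholder):

```python
import os
import re

# Illustrative only: the OpenTelemetry collector expands ${NAME}
# references in its config from environment variables at startup.
os.environ["LOGS_ENDPOINT"] = "ingest.example.com:443"  # hypothetical value

def expand_env(text: str) -> str:
    # Replace each ${NAME} with the value of that environment
    # variable, or an empty string if it is unset.
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

print(expand_env("endpoint: ${LOGS_ENDPOINT}"))
```

If a placeholder resolves to an empty string (for example, `LOGS_ENDPOINT` was never exported), the collector will fail to export, so verify all variables are set before deploying.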
## Testing

First, create a Shared Access Policy that can write logs to the Event Hub for testing purposes:

```shell
az eventhubs eventhub authorization-rule create \
  --resource-group "${RESOURCE_GROUP}" \
  --namespace-name "${EVENTHUB_NAMESPACE}" \
  --eventhub-name "${EVENTHUB_NAME}" \
  --name TestWritePolicy \
  --rights Send
```

Then retrieve the connection string:

```shell
EVENTHUB_CONNECTION="$(az eventhubs eventhub authorization-rule keys list \
  --resource-group "${RESOURCE_GROUP}" \
  --namespace-name "${EVENTHUB_NAMESPACE}" \
  --eventhub-name "${EVENTHUB_NAME}" \
  --name TestWritePolicy \
  --query primaryConnectionString \
  --output tsv)"
```

Use the connection string to write events to the Event Hub and verify they appear in the target instance's SOW Log Viewer. Events can be written in many ways, but the logs must be formatted as valid Azure logs. More information about the collector's expectations around this format, and how it maps the records to OTLP, can be found here.

Here is an example Python script that writes a given log five times:

```python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#   "azure-eventhub",
# ]
# ///
import asyncio

from azure.eventhub import EventData
from azure.eventhub.aio import EventHubProducerClient

con_string = ""  # the TestWritePolicy connection string retrieved above

event_message = """{
  "records": [
    {
      "time": "2025-04-24T13:14:28.0000000Z",
      "resourceId": "/SUBSCRIPTIONS/OPENTELEMETRY-AZURE-SUB/RESOURCEGROUPS/OPENTELEMETRY-FRONTDOOR/PROVIDERS/MICROSOFT.CDN/PROFILES/OPENTELEMETRY-FRONTDOOR-PROFILE",
      "category": "FrontDoorAccessLog",
      "operationName": "Microsoft.Cdn/Profiles/AccessLog/Write",
      "properties": {
        "trackingReference": "20250424T131428Z-17587c8c466d76czhC1PARprs40000000q8g00000000d67w",
        "httpMethod": "GET",
        "httpVersion": "2.0.0.0",
        "requestUri": "https://opentelemetry-test-fmagg0exgdcfhefq.z01.azurefd.net:443/",
        "sni": "opentelemetry-test-fmagg0exgdcfhefq.z01.azurefd.net",
        "requestBytes": "60",
        "responseBytes": "1624",
        "userAgent": "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:137.0) Gecko/20100101 Firefox/137.0",
        "clientIp": "2001:1c00:3280:6700:fbfa:bf04:1296:ebfc",
        "clientPort": "55262",
        "socketIp": "2001:1c00:3280:6700:fbfa:bf04:1296:ebfc",
        "timeToFirstByte": "0.035",
        "timeTaken": "0.035",
        "requestProtocol": "HTTPS",
        "securityProtocol": "TLS 1.3",
        "rulesEngineMatchNames": [],
        "httpStatusCode": "200",
        "httpStatusDetails": "200",
        "pop": "PAR",
        "cacheStatus": "CONFIG_NOCACHE",
        "errorInfo": "NoError",
        "ErrorInfo": "NoError",
        "result": "N/A",
        "endpoint": "opentelemetry-test-fmagg0exgdcfhefq.z01.azurefd.net",
        "routingRuleName": "opentelemetry-frontdoor-route",
        "hostName": "opentelemetry-test-fmagg0exgdcfhefq.z01.azurefd.net",
        "originUrl": "https://opentelemetry-app.azurewebsites.net:443/",
        "originIp": "23.100.1.29:443",
        "originName": "opentelemetry-app.azurewebsites.net:443",
        "originCryptProtocol": "N/A",
        "referer": "",
        "clientCountry": "Netherlands",
        "domain": "6d63ff6a-6a29-4702-bcc0-533a432cc7fa:443",
        "securityCipher": "TLS_AES_256_GCM_SHA384",
        "securityCurves": "0x11ec:X25519:prime256v1:secp384r1:secp521r1:0x0100:0x0101"
      }
    }
  ]
}"""


async def send_event_as_list(producer):
    # Send the sample log as a single-event batch.
    await producer.send_batch([EventData(event_message)])


async def run():
    producer = EventHubProducerClient.from_connection_string(
        conn_str=con_string,
        eventhub_name="logs",  # must match EVENTHUB_NAME
    )
    # The async context manager closes the producer on exit.
    async with producer:
        for _ in range(5):
            await send_event_as_list(producer)


asyncio.run(run())
```