New metrics capabilities for OpenTelemetry in Azure Monitor

Microsoft has released a series of preview updates to its Azure Monitor OpenTelemetry Exporter packages for .NET, Node.js, and Python applications. New features include the export of OpenTelemetry metrics to Azure Monitor Application Insights (AMAI), improved control of trace sampling, and the local caching and retried delivery of telemetry data during temporary disconnections from Azure Monitor Application Insights.

Azure Monitor is a set of tools for collecting, analyzing, and responding to application and infrastructure telemetry data from cloud and on-premises environments. AMAI is one of the Azure Monitor tools and provides Application Performance Monitoring (APM) to its users. In addition, Azure Monitor Application Insights supports distributed tracing, one of the pillars of the observability paradigm, across multiple applications.

OpenTelemetry is a framework that provides APIs, SDKs, and vendor-agnostic tools for consuming, transforming, and exporting telemetry data to observability backends. In a blog post in 2021, Microsoft outlined its roadmap for integrating OpenTelemetry with its broader Azure Monitor ecosystem. The immediate goal was to create direct exporters from OpenTelemetry-based applications to AMAI, instead of the de facto OpenTelemetry route from an OTLP exporter to Azure Monitor via the OpenTelemetry Collector.

A sample direct exporter in a Node.js application with OpenTelemetry tracing in place would be:

const { AzureMonitorTraceExporter } = require("@azure/monitor-opentelemetry-exporter");
const { NodeTracerProvider } = require("@opentelemetry/sdk-trace-node");
const { BatchSpanProcessor } = require("@opentelemetry/sdk-trace-base");
const { Resource } = require("@opentelemetry/resources");
const { SemanticResourceAttributes } = require("@opentelemetry/semantic-conventions");

const provider = new NodeTracerProvider({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: "basic service",
  }),
});
provider.register();

// Create an exporter instance
const exporter = new AzureMonitorTraceExporter({
  connectionString: process.env["APPLICATIONINSIGHTS_CONNECTION_STRING"] || ""
});

// Add the exporter to the provider
provider.addSpanProcessor(
  new BatchSpanProcessor(exporter, { bufferTimeout: 15000, bufferSize: 1000 })
);
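The BatchSpanProcessor above flushes spans either when the buffer fills up (bufferSize) or when a timeout elapses (bufferTimeout). The following is a simplified, pure-JavaScript sketch of that batching behavior; it is an illustrative stand-in, not the actual @opentelemetry/sdk-trace-base implementation:

```javascript
// Simplified stand-in for a batch span processor: collects finished
// spans and hands them to an exporter either when the buffer is full
// or after a timeout, whichever comes first.
class SimpleBatchProcessor {
  constructor(exporter, { bufferSize = 1000, bufferTimeout = 15000 } = {}) {
    this.exporter = exporter;
    this.bufferSize = bufferSize;
    this.bufferTimeout = bufferTimeout;
    this.buffer = [];
    this.timer = null;
  }

  onEnd(span) {
    this.buffer.push(span);
    if (this.buffer.length >= this.bufferSize) {
      this.flush(); // full buffer triggers an immediate export
    } else if (!this.timer) {
      // otherwise export whatever has accumulated after bufferTimeout ms
      this.timer = setTimeout(() => this.flush(), this.bufferTimeout);
    }
  }

  flush() {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.buffer.length > 0) {
      this.exporter.export(this.buffer.splice(0));
    }
  }
}
```

Batching like this trades a little latency for far fewer network calls to the backend, which is why the real processor defaults to a large buffer and a 15-second timeout.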

With the release of new updates to the Azure Monitor OpenTelemetry Exporter packages, it is now possible to export metrics to AMAI as shown below:

const { MeterProvider, PeriodicExportingMetricReader } = require("@opentelemetry/sdk-metrics");
const { Resource } = require("@opentelemetry/resources");
const { AzureMonitorMetricExporter } = require("@azure/monitor-opentelemetry-exporter");

// Create an exporter instance
const exporter = new AzureMonitorMetricExporter({
  connectionString: process.env["APPLICATIONINSIGHTS_CONNECTION_STRING"] || "",
});

// Add the exporter to the MetricReader and register it with the MeterProvider
const provider = new MeterProvider();
const metricReaderOptions = {
  exporter: exporter,
};
const metricReader = new PeriodicExportingMetricReader(metricReaderOptions);
provider.addMetricReader(metricReader);
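Conceptually, a periodic metric reader snapshots each instrument's aggregated value on an interval and hands the batch to the exporter. The sketch below illustrates that collection cycle in plain JavaScript; SimpleCounter and SimplePeriodicReader are hypothetical stand-ins, not the real SDK classes:

```javascript
// Hypothetical stand-in for a periodic metric pipeline: a counter
// accumulates increments, and a reader snapshots the sums on each
// collection cycle and passes the batch to the exporter.
class SimpleCounter {
  constructor(name) {
    this.name = name;
    this.sum = 0;
  }
  add(value) {
    this.sum += value; // counters are monotonic: they only go up
  }
}

class SimplePeriodicReader {
  constructor(exporter, counters) {
    this.exporter = exporter;
    this.counters = counters;
  }
  collect() {
    // snapshot every instrument and export the batch, as the SDK's
    // PeriodicExportingMetricReader does on its export interval
    const batch = this.counters.map((c) => ({ name: c.name, value: c.sum }));
    this.exporter.export(batch);
  }
}
```

In the real SDK, collect() would be driven by a timer at the reader's configured export interval rather than called by hand.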

To manage the amount of telemetry data sent to Application Insights, the packages now include a sampler that controls the percentage of traces that are sent. For the Node.js trace example above, this would be:

import { ApplicationInsightsSampler, AzureMonitorTraceExporter } from "@azure/monitor-opentelemetry-exporter";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { SemanticResourceAttributes } from "@opentelemetry/semantic-conventions";

// Sampler expects a sample rate between 0 and 1 inclusive
// A rate of 0.75 means about 75% of traces are sent
const aiSampler = new ApplicationInsightsSampler(0.75);

const provider = new NodeTracerProvider({
  resource: new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: "basic service",
  }),
  sampler: aiSampler
});
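A fixed-rate sampler typically derives its keep/drop decision deterministically from the trace ID, so that every span belonging to the same trace is sampled consistently. The sketch below illustrates the idea with a toy hash; it is a hypothetical example, not the actual ApplicationInsightsSampler algorithm:

```javascript
// Hypothetical fixed-rate sampling sketch: map the trace ID to a
// number in [0, 1) and keep the trace when that number falls below
// the configured rate. Deterministic, so all spans of one trace get
// the same decision.
function hashTraceId(traceId) {
  let h = 0;
  for (const ch of traceId) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return h / 0x100000000; // normalize to [0, 1)
}

function shouldSample(traceId, rate) {
  if (rate >= 1) return true;  // rate 1: keep everything
  if (rate <= 0) return false; // rate 0: drop everything
  return hashTraceId(traceId) < rate;
}
```

With a rate of 0.75, roughly 75% of trace IDs hash below the threshold, matching the "about 75% of traces are sent" behavior described in the snippet's comment.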

Finally, in case of connection errors with AMAI, direct exporters write their payloads to local storage and periodically retry delivery within a 48-hour period. These parameters can be set when instantiating an exporter, as shown below:

const exporter = new AzureMonitorTraceExporter({
  connectionString: process.env["APPLICATIONINSIGHTS_CONNECTION_STRING"],
  storageDirectory: "C:\\SomeDirectory", // your desired location
  disableOfflineStorage: false // offline storage is enabled by default; set to true to disable
});
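The retry behavior described above can be pictured as a cache of timestamped payloads: failed deliveries are stored locally, retried periodically, and discarded once they age past the 48-hour window. The following is a simplified in-memory sketch of that logic (OfflineCache is a hypothetical illustration, not the exporter's actual storage code, which writes to the configured storageDirectory on disk):

```javascript
// Hypothetical sketch of offline caching: failed payloads are stored
// with a timestamp and retried later; anything older than 48 hours is
// discarded instead of being redelivered.
const RETENTION_MS = 48 * 60 * 60 * 1000; // 48-hour retry window

class OfflineCache {
  constructor() {
    this.pending = [];
  }

  store(payload, now = Date.now()) {
    this.pending.push({ payload, storedAt: now });
  }

  retry(send, now = Date.now()) {
    const keep = [];
    for (const item of this.pending) {
      if (now - item.storedAt > RETENTION_MS) continue; // expired: drop silently
      if (!send(item.payload)) keep.push(item);         // still failing: retry later
    }
    this.pending = keep;
  }
}
```

The 48-hour cap bounds local disk usage while still covering most transient outages.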
