Out with the REST: Azure Monitor Ingestion libraries appear

Scott Addie

In November 2022, the Azure Monitor Logs service team announced the general availability (GA) of a redesigned, data collection rules (DCR)-based Logs Ingestion REST API. For more context on that announcement, see Announcing GA of revamped Custom Logs features. While this reimagined REST API was in a gated Public Preview phase, the Azure SDK team began contemplating client library API designs.

Analysis and the path forward

To start, the SDK team researched the developer experience of what was soon to become the legacy solution: the Data Collector REST API. The team learned the following about uploading custom logs:

  1. Authentication with Azure Active Directory (Azure AD) was unsupported; an HMAC-SHA256 shared key scheme was used instead.
  2. No client libraries existed to support this scenario.
  3. Customers were calling the REST API directly via an HTTP POST in one of three ways:
    1. Using community-authored PowerShell cmdlets.
    2. Writing their own code.
    3. Using C#, Java, PowerShell, or Python snippets provided in the official documentation. For example, this fragment of the C# snippet:
      // customerId, LogName, and TimeStampField are fields defined
      // elsewhere in the documentation's sample.
      public static void PostData(string signature, string date, string json)
      {
          try
          {
              string url = "https://" + customerId + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01";
      
              System.Net.Http.HttpClient client = new System.Net.Http.HttpClient();
              client.DefaultRequestHeaders.Add("Accept", "application/json");
              client.DefaultRequestHeaders.Add("Log-Type", LogName);
              client.DefaultRequestHeaders.Add("Authorization", signature);
              client.DefaultRequestHeaders.Add("x-ms-date", date);
              client.DefaultRequestHeaders.Add("time-generated-field", TimeStampField);
      
              System.Net.Http.HttpContent httpContent = new StringContent(json, Encoding.UTF8);
              httpContent.Headers.ContentType = new MediaTypeHeaderValue("application/json");
              Task<System.Net.Http.HttpResponseMessage> response = client.PostAsync(new Uri(url), httpContent);
      
              System.Net.Http.HttpContent responseContent = response.Result.Content;
              string result = responseContent.ReadAsStringAsync().Result;
              Console.WriteLine("Return Result: " + result);
          }
          catch (Exception excep)
          {
              Console.WriteLine("API Post Exception: " + excep.Message);
          }
      }
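
Notice that the signature parameter passed to PostData had to be computed by hand. A minimal sketch of that legacy HMAC-SHA256 signing scheme, written here in Python for brevity (the workspace ID and shared key are placeholders, and the string-to-sign format follows the legacy Data Collector API documentation):

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def build_signature(workspace_id: str, shared_key: str,
                    content_length: int, date: str) -> str:
    """Builds the legacy Data Collector API Authorization header value."""
    # The string to sign is fixed by the legacy API contract.
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    # The workspace's shared key is base64-encoded; decode it before signing.
    decoded_key = base64.b64decode(shared_key)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Example usage with placeholder values
rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
sig = build_signature(
    "my-workspace-id",
    base64.b64encode(b"dummy-key").decode(),  # placeholder shared key
    100,
    rfc1123_date)
```

Every request required recomputing this signature, and rotating the shared key meant redeploying secrets: exactly the kind of ceremony token-based Azure AD authentication removes.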

To improve the developer experience, the SDK team created client libraries for .NET, Java, JavaScript, and Python. In the spirit of being customer-driven, the team began thinking about the perception and utility of the libraries. More specifically, what would a prospective customer think about their overall design? If an unlimited budget were provided and time constraints were removed, what’s one thing a prospective customer would change about the libraries? A moderated UX study would undoubtedly yield these answers. Equipped with that knowledge, we could pivot and press forward with a higher degree of confidence that the correct solution was built.

The early Beta releases of the Java and Python libraries were identified as good candidates for the UX study. We hypothesized how prospective participants might use the libraries’ APIs to solve real-world problems. Participants were asked to complete a series of tasks using the library in their preferred language. Unsurprisingly, we learned a lot in each of those two-hour sessions! Hypotheses were both validated and invalidated. Participants’ feedback was distilled into actionable work items. The API surface philosophy evolved in lockstep across languages. Language-focused architects continued to scrutinize the APIs. Feature engineers continued to refine the API surface into a blissful state of idiomaticity. The result is a product worthy of bearing the Azure SDK name.

Today, we announce the stable release of the Azure Monitor Ingestion libraries—an idiomatic, approachable, diagnosable collection of libraries for uploading custom logs to Log Analytics. How do you get started? Let’s dig in!

Get started

If you’re currently using the legacy Data Collector API, first migrate to the new DCR-based API. For migration guidance, see Migrate from Data Collector API and custom fields-enabled tables to DCR-based custom log collection.

Language-specific Monitor Ingestion library instructions for getting started are in each library’s README:

  • .NET: Azure.Monitor.Ingestion
  • Java: azure-monitor-ingestion
  • JavaScript: @azure/monitor-ingestion
  • Python: azure-monitor-ingestion

End-to-end example

To illustrate an end-to-end example, imagine you need to upload two logs to a custom table, named CustomTable_CL, in a Log Analytics workspace. After the upload operation completes, you’d like to verify both logs are in the table. Here are the basic steps to follow using the .NET library in an ASP.NET Core Razor Pages app.

  1. Install the necessary dependencies in your Razor Pages project:
    dotnet add package Azure.Identity
    dotnet add package Azure.Monitor.Ingestion
    dotnet add package Azure.Monitor.Query
    dotnet add package Microsoft.Extensions.Azure

    Aside from the obvious Azure.Monitor.Ingestion package, use the Azure SDK packages listed below.

    • Azure.Identity: Provides token-based credential types used for authenticating the client to Azure AD.
    • Azure.Monitor.Query: Provides a client for querying Log Analytics workspaces with Kusto Query Language.
    • Microsoft.Extensions.Azure: Provides an extension method for registering Azure SDK clients with .NET’s dependency injection container.
  2. In Program.cs, invoke the Microsoft.Extensions.Azure package’s AddAzureClients extension method. With that method, you can register:
    • The logs ingestion client via a call to the Monitor Ingestion library’s AddLogsIngestionClient extension method. The method accepts the data collection endpoint’s logs ingestion URI. For example, “https://mycollectionendpoint-abcd.eastus-1.ingest.monitor.azure.com”.
    • The credential type to use when constructing an instance of the logs ingestion client.
    using Azure.Identity;
    using Microsoft.Extensions.Azure;
    
    var builder = WebApplication.CreateBuilder(args);
    builder.Services.AddRazorPages();
    builder.Services.AddAzureClients(clientBuilder =>
    {
        var uri = builder.Configuration.GetValue<string>(
            "LogsIngestion:DataCollectionEndpoint");
        clientBuilder.AddLogsIngestionClient(new Uri(uri));
        clientBuilder.UseCredential(new DefaultAzureCredential());
    });
    
    // code omitted for brevity
  3. In a Razor Pages page model class:
    • Retrieve an instance of the logs ingestion client class via constructor injection.
    • Serialize the log objects to JSON format.
    • Upload two logs via a call to the Monitor Ingestion library’s UploadAsync method. Note the DCR ID argument this method accepts. The DCR is an Azure Resource Manager (ARM) object that defines how incoming data is transformed and routed to its destination.
    • Verify both logs were successfully uploaded to the CustomTable_CL table via a call to the Monitor Query library’s QueryWorkspaceAsync method. The Kusto query retrieves the row count for the last 5 minutes of data in the table. Assuming no other logs were uploaded during that time period, a row count of 2 confirms a successful upload.
    using Azure;
    using Azure.Core;
    using Azure.Identity;
    using Azure.Monitor.Ingestion;
    using Azure.Monitor.Query;
    using Microsoft.AspNetCore.Mvc.RazorPages;
    
    // code omitted for brevity
    
    public class IndexModel : PageModel
    {
        private readonly IConfiguration _configuration;
        private readonly LogsIngestionClient _client;
    
        public IndexModel(
            IConfiguration configuration,
            LogsIngestionClient client)
        {
            _configuration = configuration;
            _client = client;
        }
    
        public async Task OnGet()
        {
            BinaryData logsToUpload = GetLogsToUpload();
    
            Response response = await _client.UploadAsync(
                _configuration["LogsIngestion:DataCollectionRuleId"],
                _configuration["LogsIngestion:StreamName"],
                RequestContent.Create(logsToUpload));
    
            int logsCount = await VerifyLogsUploaded();
    
            // code omitted for brevity
        }
    
        private BinaryData GetLogsToUpload() =>
            // Use BinaryData to serialize instances of an anonymous type into JSON
            BinaryData.FromObjectAsJson(
                new[] {
                    new
                    {
                        Time = DateTimeOffset.UtcNow,
                        Computer = "Computer1",
                        AdditionalContext = new
                        {
                            InstanceName = "user1",
                            TimeZone = "Pacific Time",
                            Level = 4,
                            CounterName = "AppMetric1",
                            CounterValue = 15.3
                        }
                    },
                    new
                    {
                        Time = DateTimeOffset.UtcNow,
                        Computer = "Computer2",
                        AdditionalContext = new
                        {
                            InstanceName = "user2",
                            TimeZone = "Central Time",
                            Level = 3,
                            CounterName = "AppMetric1",
                            CounterValue = 23.5
                        }
                    },
                });
    
        private async Task<int> VerifyLogsUploaded()
        {
            var client = new LogsQueryClient(
                new DefaultAzureCredential());
    
            Response<IReadOnlyList<int>> response =
                await client.QueryWorkspaceAsync<int>(
                    _configuration["LogsQuery:LogAnalyticsWorkspaceId"],
                    "CustomTable_CL | count",
                    new QueryTimeRange(TimeSpan.FromMinutes(5)));
    
            return response.Value.SingleOrDefault();
        }
    }
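
The configuration keys read in the snippets above could be supplied through appsettings.json. A sketch with placeholder values (the Custom- stream name prefix follows the common convention for custom tables, but your DCR defines the actual stream name):

```json
{
  "LogsIngestion": {
    "DataCollectionEndpoint": "https://mycollectionendpoint-abcd.eastus-1.ingest.monitor.azure.com",
    "DataCollectionRuleId": "dcr-00000000000000000000000000000000",
    "StreamName": "Custom-CustomTable_CL"
  },
  "LogsQuery": {
    "LogAnalyticsWorkspaceId": "00000000-0000-0000-0000-000000000000"
  }
}
```

In production, prefer storing these values outside source control, for example in environment variables or Azure App Configuration.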

Summary

The DCR-based logs ingestion API has modernized the process of uploading custom data to Log Analytics workspaces. Send data to custom tables you create or to a handful of built-in tables. Authenticate to Azure AD with a token-based credential from the same Azure Identity libraries you use with other modern Azure SDK libraries. And when coupled with the Monitor Ingestion libraries, developer experience takes a front seat. Drop down to the REST API on your own terms.
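
If you do drop down to the REST API, the request the libraries send on your behalf is a token-authenticated POST to the DCR stream endpoint. A minimal sketch in Python that only builds the request, with placeholder endpoint, DCR ID, and stream name (the api-version shown is the one current at the time of writing; check the REST API reference for the latest value):

```python
from urllib.parse import quote

def build_upload_request(endpoint: str, dcr_immutable_id: str,
                         stream_name: str, bearer_token: str):
    """Returns the URL and headers for a direct logs ingestion POST."""
    url = (f"{endpoint}/dataCollectionRules/{dcr_immutable_id}"
           f"/streams/{quote(stream_name)}?api-version=2023-01-01")
    headers = {
        # Token acquired from Azure AD, e.g. via an Azure Identity credential.
        "Authorization": f"Bearer {bearer_token}",
        "Content-Type": "application/json",
    }
    return url, headers

# Example usage with placeholder values; the request body would be a
# JSON array of log objects matching the DCR's declared stream schema.
url, headers = build_upload_request(
    "https://mycollectionendpoint-abcd.eastus-1.ingest.monitor.azure.com",
    "dcr-00000000000000000000000000000000",
    "Custom-CustomTable_CL",
    "<token>")
```

Contrast this with the legacy API: the only secret involved is a short-lived Azure AD token, not a long-lived workspace shared key.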

We’re excited to see how you use the Monitor Ingestion libraries to satisfy your log ingestion needs. Do you have a feature request, question, or bug to report? Let’s have those conversations on GitHub in the Azure SDK repositories: azure-sdk-for-net, azure-sdk-for-java, azure-sdk-for-js, and azure-sdk-for-python.
