Create a .NET Azure Cosmos DB Function Trigger using Visual Studio Code in 2 minutes!

Theo van Kraay

 

Creating event sourcing solutions with Azure Cosmos DB is easy with Azure Functions triggers, which let you leverage the Change Feed Processor's powerful scaling and reliable event detection, without the need to maintain any worker infrastructure. You can focus on your Azure Function's logic without worrying about the rest of the event-sourcing pipeline. In this blog, we have some quick how-to videos to get you up and running with .NET Azure Cosmos DB Function triggers!

If you want to follow along, there are some pre-requisites you should have in place:

  • Ideally a Windows 10 machine.
  • Access to an Azure subscription, and an Azure Cosmos DB account – instructions here.
  • .NET Core installed – instructions here.
  • Visual Studio Code installed – instructions here.
  • Azure Storage Emulator installed (make sure this is running before trying to follow the demo) – instructions here.
  • Azure Functions Core Tools installed – instructions here (we recommend v3.x – note that this requires installing Node.js).
  • Azure Functions Extension installed – you should have Azure Functions Core Tools and VS Code already installed, and access to an Azure subscription for sign-in, before attempting to install this extension – instructions here.
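
If you would rather work from the terminal than use the VS Code extension shown in the video below, Azure Functions Core Tools can scaffold the same project. A rough sketch, assuming a project name of CosmosTriggerDemo (the exact trigger template name can vary by Core Tools version; run func templates list to check):

func init CosmosTriggerDemo --worker-runtime dotnet
cd CosmosTriggerDemo
func new --name CosmosDBTriggerCSharp1 --template "CosmosDBTrigger"
func start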

Create an Azure Cosmos DB Functions Trigger in .NET with VS Code… in 2 minutes!

 

Next, you can turn your function into a simple event sourcing solution, which streams updates and inserts from one collection into another, by replacing the boilerplate code with the code below:

 

using System;
using System.Threading.Tasks;
using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

using Microsoft.Azure.Cosmos;

namespace Company.Function
{
    public static class CosmosDBTriggerCSharp1
    {
        // Connection details for the target collection, read from local.settings.json
        // when running locally, or from application settings when deployed to Azure.
        private static readonly string _endpointUrl = Environment.GetEnvironmentVariable("endpointUrl");
        private static readonly string _primaryKey = Environment.GetEnvironmentVariable("primaryKey");
        private static readonly string _databaseId = "database";
        private static readonly string _containerId = "collection2";

        // A single CosmosClient instance is reused across function invocations.
        private static CosmosClient cosmosClient = new CosmosClient(_endpointUrl, _primaryKey);

        [FunctionName("CosmosDBTriggerCSharp1")]
        public static async Task Run([CosmosDBTrigger(
            databaseName: "database",
            collectionName: "collection1",
            ConnectionStringSetting = "cosmosdbtvk_DOCUMENTDB",
            LeaseCollectionName = "leases",
            CreateLeaseCollectionIfNotExists = true)]IReadOnlyList<Document> input, ILogger log)
        {
            // Target container that changed documents from collection1 are copied into.
            var container2 = cosmosClient.GetContainer(_databaseId, _containerId);

            foreach (Document doc in input)
            {
                log.LogInformation("doc: " + doc);
                try
                {
                    await container2.CreateItemAsync<Document>(doc);
                    log.LogInformation("pushed doc into container 2");
                }
                catch (Exception e)
                {
                    log.LogInformation("Exception pushing doc into container 2: " + e);
                }
            }
        }
    }
}

 

Make sure you install the latest Azure Cosmos DB .NET SDK. In the terminal, run the following:

 

dotnet add package Microsoft.Azure.Cosmos
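
The deployment pre-requisites later in this post call for version 3.6.0 or higher of the SDK; if you prefer to pin the version explicitly rather than take the latest, you can pass it on the same command:

dotnet add package Microsoft.Azure.Cosmos --version 3.6.0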

 

Your local.settings.json should also be updated to look something like this (note the “endpointUrl” and “primaryKey” added for the target collection):

 

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "cosmosdbtvk_DOCUMENTDB": "<PRIMARY CONNECTION STRING OF SOURCE COLLECTION>",
    "endpointUrl":"<URI OF TARGET COLLECTION>",
    "primaryKey": "<PRIMARY KEY OF TARGET COLLECTION>"
  }
}
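
With the settings in place, you can try the pipeline locally: start the function host with func start, insert a document into collection1, and the trigger should copy it into collection2. The snippet below is a minimal sketch of such a test insert, written as a separate console app – the endpoint, key, and partition key handling are placeholders you would adapt to your own account:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

namespace TestClient
{
    // Hypothetical one-off console app that writes a test document into the
    // source collection so the change feed trigger fires.
    public static class Program
    {
        public static async Task Main()
        {
            var client = new CosmosClient("<URI OF SOURCE ACCOUNT>", "<PRIMARY KEY OF SOURCE ACCOUNT>");
            var container = client.GetContainer("database", "collection1");

            // The item must carry an "id", and should include a value for whatever
            // partition key path collection1 was created with.
            var item = new { id = Guid.NewGuid().ToString(), message = "hello change feed" };
            await container.CreateItemAsync(item);

            Console.WriteLine("Inserted test document into collection1");
        }
    }
}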

 

Watch the video below to see how to deploy to Azure! To follow along with this video, you should have the pre-requisites from the first video already installed, plus the following (a command-line sketch of the publish step follows the list):

  • Azure CLI – instructions here.
  • The Azure Cosmos DB Core (SQL) API .NET SDK at version 3.6.0 or higher
  • NuGet Package Manager for VS Code – instructions here (you would need this to install the latest version of the Azure Cosmos DB .NET SDK – this is illustrated but not installed during the video).
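
Once you are signed in with the Azure CLI, the publish step shown in the video can also be run from the terminal with Core Tools. A rough sketch with placeholder names – and remember that local.settings.json is not published, so the endpointUrl, primaryKey, and connection string values need to be added as application settings on the Function App:

az login
az functionapp config appsettings set --name <FUNCTION APP NAME> --resource-group <RESOURCE GROUP> --settings "endpointUrl=<URI OF TARGET COLLECTION>" "primaryKey=<PRIMARY KEY OF TARGET COLLECTION>" "cosmosdbtvk_DOCUMENTDB=<PRIMARY CONNECTION STRING OF SOURCE COLLECTION>"
func azure functionapp publish <FUNCTION APP NAME>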

 

For more information about Azure Cosmos DB’s change feed and its use cases, go here!

For the official documentation on creating an Azure Function triggered by Azure Cosmos DB, go here!

For more depth on Azure Cosmos DB Change Feed, try our labs here!

Get started

Create a new account using the Azure Portal, ARM template or Azure CLI and connect to it using your favourite tools. Stay up-to-date on the latest Azure #CosmosDB news and features by following us on Twitter @AzureCosmosDB. We are really excited to see what you will build with Azure Cosmos DB!

About Azure Cosmos DB

Azure Cosmos DB is a globally distributed, multi-model database service that enables you to read and write data from any Azure region. It offers turnkey global distribution, single-digit millisecond latency guarantees at the 99th percentile, 99.999 percent high availability, and elastic scaling of throughput and storage.

 

3 comments

Discussion is closed.

    • Theo van Kraay (Microsoft)

      Thanks Manuel!

  • Thomas Vandenbon

    This article has the same problem as the Java article (https://devblogs.microsoft.com/cosmosdb/create-a-java-azure-cosmos-db-function-trigger-using-visual-studio-code-in-2-minutes/):
    It mentions “Event Sourcing” a lot, but it doesn’t do anything related to Event Sourcing.
    What you’re doing here is copying data from one container to another when it’s being added.
    In an article about event sourcing, I would expect the following topics to be handled:
    – How to handle exceptions in a stream?
    – How to handle upcasting of events?
    – How to maintain projections or materialized views?

    In my experience, the hardest problem with using Azure Functions for event sourcing, is the lack of control you have when things go wrong.
    Since things will always go wrong, it makes Azure Functions a poor fit for any production ready system.
