All in One with OData $Batch

Hassan Habib

Introduction

We talked in the past about some of the most powerful features OData has to offer, such as shaping, filtering, and ordering your data, all within your API request.

But with an API GET request you can only do so much before you reach the maximum length of a URL, which is a common limitation.

For instance, let’s assume we are looking for information about a particular set of students. If the set is as small as 5 or 10 students, the request might be feasible with a simple API call such as this:

https://localhost:44369/api/students?$filter=name in ('John Lee', 'Susanna Aldo')

But the problem with that approach is that it doesn’t scale very well. If you are looking for 100 students, for instance, you might exceed the commonly enforced 2048-character limit for a URL, especially if you are searching by student Id rather than student name.
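To make that concrete, here is a quick sketch (the route and the ids are illustrative) of how fast a filter over 100 student ids outgrows a 2048-character URL:

using System;
using System.Linq;

class UrlLengthCheck
{
    static void Main()
    {
        // Each Guid is 36 characters plus quotes and a comma, so 100 ids
        // alone account for nearly 3,900 characters of query string.
        var ids = Enumerable.Range(0, 100).Select(_ => $"'{Guid.NewGuid()}'");

        string url =
            "https://localhost:44369/api/students?$filter=id in (" +
            string.Join(",", ids) + ")";

        Console.WriteLine(url.Length); // well beyond the 2048-character limit
    }
}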

 

Batching Overview

In this case, a better solution is OData Batching: a feature that allows API consumers to combine multiple requests into a single POST request and then receive a full report back from an OData-enabled API with the status and result of each individual request, as follows:

POST https://localhost/api/$batch

with the following request body:

{
    "requests": [
        {
            "id": "1",
            "method": "GET",
            "url": "https://localhost/api/students/735b6ae6-485e-4ad8-a993-36227ac82851"
        },
        {
            "id": "2",
            "method": "GET",
            "url": "https://localhost/api/students/735b6ae6-485e-4ad8-a993-36227ac82853"
        },
        {
            "id": "3",
            "method": "GET",
            "url": "https://localhost/api/students/735b6ae6-485e-4ad8-a993-36227ac82854"
        }
    ]
}

An OData $batch response to the above request would be as follows:

{
    "responses": [
        {
            "id": "1",
            "status": 200,
            "headers": {
                "content-type": "application/json; odata.metadata=minimal; odata.streaming=true",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://localhost/api/$metadata#Students/$entity",
                "Id": "735b6ae6-485e-4ad8-a993-36227ac82851",
                "Name": "Susanna Aldo"
            }
        },
        {
            "id": "2",
            "status": 200,
            "headers": {
                "content-type": "application/json; odata.metadata=minimal; odata.streaming=true",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://localhost/api/$metadata#Students/$entity",
                "Id": "735b6ae6-485e-4ad8-a993-36227ac82853",
                "Name": "Michael John"
            }
        },
        {
            "id": "3",
            "status": 200,
            "headers": {
                "content-type": "application/json; odata.metadata=minimal; odata.streaming=true",
                "odata-version": "4.0"
            },
            "body": {
                "@odata.context": "https://localhost/api/$metadata#Students/$entity",
                "Id": "735b6ae6-485e-4ad8-a993-36227ac82854",
                "Name": "John Lee"
            }
        }
    ]
}

 

The batching capability allows an API consumer to send as many requests, with any HTTP verb, as the API maintainers will allow. For instance, you can combine a POST call with DELETE, GET, or PUT all in one request, including requests that still support OData querying.
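As a rough sketch of what that looks like from the consumer side (assuming the same localhost Students routes shown above; the GUIDs and the Student fields are purely illustrative), the following HttpClient call combines a GET that still uses an OData $filter, a POST, and a DELETE in one batch:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class BatchClientSketch
{
    private static readonly HttpClient httpClient = new HttpClient
    {
        BaseAddress = new Uri("https://localhost/api/")
    };

    public static async Task SendMixedBatchAsync()
    {
        // One JSON batch body combining three different HTTP verbs;
        // the GUIDs and the Student shape (Id, Name) are illustrative.
        string batchBody = @"{
            ""requests"": [
                {
                    ""id"": ""1"",
                    ""method"": ""GET"",
                    ""url"": ""https://localhost/api/students?$filter=contains(Name,'John')""
                },
                {
                    ""id"": ""2"",
                    ""method"": ""POST"",
                    ""url"": ""https://localhost/api/students"",
                    ""headers"": { ""content-type"": ""application/json"" },
                    ""body"": { ""Id"": ""9f4d47b1-64a2-4f0a-9a3c-2b1a6f3d8c11"", ""Name"": ""New Student"" }
                },
                {
                    ""id"": ""3"",
                    ""method"": ""DELETE"",
                    ""url"": ""https://localhost/api/students/735b6ae6-485e-4ad8-a993-36227ac82854""
                }
            ]
        }";

        // The JSON batch body itself is posted as application/json to the $batch endpoint.
        using var content = new StringContent(batchBody, Encoding.UTF8, "application/json");
        HttpResponseMessage response = await httpClient.PostAsync("$batch", content);

        // The report comes back with one response entry per request id, as shown earlier.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}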

 

 

Less is More

From an optimization standpoint, Batching reduces the overall number of requests from an API consumer from n to 1, simply because it allows multiple requests to be combined into one, and the failure of one request doesn’t necessarily impact the rest in any way, as shown in the following illustration:

 

[Image: ODataBatching]

 

With Batching, the cost of network latency per request can be avoided, which also minimizes the risk of lost signals on weak networks, especially for mobile applications and offline web applications that may need to post all of their changes at once when the application comes back online.

 

Setting things up

Enabling Batching in an ASP.NET Core 3.1 application takes just two changes, assuming the application is leveraging the EDM approach for communicating OData queries.

I’m going to use the SchoolEM repo and modify the existing code to support batching as follows:

In the Startup.cs file, in the Configure method, you should have the following code in place:

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            app.UseHttpsRedirection();
            app.UseRouting();
            app.UseAuthorization();

            app.UseEndpoints(endpoints =>
            {
                endpoints.MapControllers();
                endpoints.Select().Filter().Expand().OrderBy();
                endpoints.MapODataRoute("api", "api", GetEdmModel());
            });
        }
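
The GetEdmModel method referenced in that snippet isn’t shown here. As a minimal sketch (not necessarily the exact SchoolEM implementation), a convention-based model that registers the Students entity set used in the examples above could look like this, with ODataConventionModelBuilder coming from Microsoft.AspNet.OData.Builder and IEdmModel from Microsoft.OData.Edm:

        private static IEdmModel GetEdmModel()
        {
            // Convention-based EDM model; Student is the entity class
            // (Id, Name) used by the Students examples above.
            var builder = new ODataConventionModelBuilder();
            builder.EntitySet<Student>("Students");

            return builder.GetEdmModel();
        }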

We are going to change that code to support Batching by simply adding the OData batching middleware and then passing a default OData batch handler to the OData route, as follows:

        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            app.UseODataBatching();
            app.UseHttpsRedirection();
            app.UseRouting();
            app.UseAuthorization();

            app.UseEndpoints(endpoints =>
            {
                endpoints.MapControllers();
                endpoints.Select().Filter().Expand().OrderBy();
                endpoints.MapODataRoute(
                    routeName: "api",
                    routePrefix: "api",
                    model: GetEdmModel(),
                    batchHandler: new DefaultODataBatchHandler());
            });
        }

By adding these two changes, your API is now fully able to support batch requests like the one shown above.

Note: it’s very important to register the OData batching middleware (app.UseODataBatching()) before the routing middleware (app.UseRouting()).

 

Great Power, Greater Responsibility

Allowing batching gives your API consumers, and your API, great power to orchestrate and communicate multiple requests at the same time. But with that power comes a great responsibility to keep your API secure against abusive usage of batching, especially when it comes to overwhelming your API service with an unbounded number of requests.

For that, I highly recommend understanding your business needs and determining the maximum number of allowed requests per batch post before deploying your service. This can be done by setting some of the properties on the default batch handler as follows:

 

                var defaultBatchHandler = new DefaultODataBatchHandler();
                // Limit how deeply requests can be nested inside a batch payload.
                defaultBatchHandler.MessageQuotas.MaxNestingDepth = 2;
                // Limit the number of operations allowed in a single changeset.
                defaultBatchHandler.MessageQuotas.MaxOperationsPerChangeset = 10;
                // Limit the size of the batch message the service will accept, in bytes.
                defaultBatchHandler.MessageQuotas.MaxReceivedMessageSize = 100;

                endpoints.MapControllers();
                endpoints.Select().Filter().Expand().OrderBy();
                endpoints.MapODataRoute(
                    routeName: "api",
                    routePrefix: "api",
                    model: GetEdmModel(),
                    batchHandler: defaultBatchHandler);

The default batch handler gives you the ability to protect your API from abusive usage: vertically, by controlling the depth of nesting within a payload, and horizontally, by controlling the number of operations and the overall size your API will accept per batch request.

Important: set these limits before deploying, to protect your API against denial-of-service attacks.

 

Final Notes:

  1. You can find the entire source code for SchoolEM with Batching enabled on GitHub at this link.
  2. Huge thanks to Vishwa Goli for educating me on this topic.
  3. Huge thanks to Sam Xu for addressing .NET Core 3.1 issues with Batching and making it possible for the rest of us to utilize the feature.
  4. In the second part of this article, we are going to dive deeper into batching, understanding some of the most powerful capabilities of the feature with OData.

2 comments


  • Shimmy Weitzhandler

    This is awesome!
    Thanks for sharing your wonderful articles, Hassan!

  • ivan zinov

    This is awesome, the only problem that I am seeing is that you expose only the scenario of the EDM. Can you update the post using the EndPointRouting approach without EDM?
    How you can filter a collection with the batch approach? https://localhost:44369/api/students?$filter=name in (‘John Lee’, ‘Susanna Aldo’)
    The only way I am seeing it is to get the list of students from the batch response and make the client responsible for the filtering.
    Any suggestion?
