{"id":742,"date":"2020-10-22T10:40:24","date_gmt":"2020-10-22T17:40:24","guid":{"rendered":"https:\/\/devblogs.microsoft.com\/azure-sdk\/?p=742"},"modified":"2020-10-22T10:41:21","modified_gmt":"2020-10-22T17:41:21","slug":"visualizing-customer-sentiment","status":"publish","type":"post","link":"https:\/\/devblogs.microsoft.com\/azure-sdk\/visualizing-customer-sentiment\/","title":{"rendered":"Analyzing Call Center Conversations with the new Azure SDK Cognitive Services Libraries"},"content":{"rendered":"<p>Analyzing the recorded audio, video, or chat history, as well as the average conversation time can be very helpful. We can find this kind of data in the call centers which are company hubs for customer communication. This article presents how new Azure .NET SDK client libraries (Cosmos DB, Text Analytics and Azure Storage Blobs) were used in the real project &#8211; call center conversations analyzer.<\/p>\n<h2>Business case<\/h2>\n<p>Together with my team at <a href=\"https:\/\/www.predicagroup.com\/\">Predica<\/a>, we needed a solution to get insights related to the recorded audio, video, and chat history from the call center owned by one of our clients. We wanted to know how many customers are satisfied and what are the most popular topics. 
This is why we decided to build a call center conversation analysis solution on the Azure cloud, using services such as Azure Storage, Azure Functions, Azure Cognitive Services, Azure Media Services Video Indexer, and Azure Cosmos DB.<\/p>\n<h2>Solution architecture<\/h2>\n<p>The diagram below presents the implemented solution, which enables uploading PDF files with conversation history or audio\/video files of recorded conversations.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2020\/10\/call-center-talks-analysis.png\" alt=\"call-center-talks-analysis.png\" \/><\/p>\n<p>These services were used:<\/p>\n<ol>\n<li><strong><a href=\"https:\/\/azure.microsoft.com\/services\/storage\/\">Azure Storage Account (Blob Storage)<\/a><\/strong> &#8211; to store text, audio, and video files for further analysis<\/li>\n<li><strong><a href=\"https:\/\/docs.microsoft.com\/azure\/azure-functions\/durable\/durable-functions-overview?tabs=csharp\">Azure Durable Functions<\/a><\/strong> &#8211; to orchestrate the analysis flow<\/li>\n<li><strong><a href=\"https:\/\/azure.microsoft.com\/services\/cognitive-services\/text-analytics\/\">Azure Cognitive Services Text Analytics<\/a><\/strong> &#8211; to analyze text and get sentiment together with topics<\/li>\n<li><strong><a href=\"https:\/\/azure.microsoft.com\/services\/cognitive-services\/form-recognizer\/\">Azure Cognitive Services Form Recognizer<\/a><\/strong> &#8211; to scan PDF documents with OCR<\/li>\n<li><strong><a href=\"https:\/\/azure.microsoft.com\/services\/media-services\/video-indexer\/\">Azure Video Indexer<\/a><\/strong> &#8211; to analyze audio and video content<\/li>\n<li><strong><a href=\"https:\/\/azure.microsoft.com\/services\/cosmos-db\/\">Azure Cosmos DB<\/a><\/strong> &#8211; to store analysis results as JSON documents<\/li>\n<li><strong><a href=\"https:\/\/powerbi.microsoft.com\/\">Power BI<\/a><\/strong> &#8211; to visualize the collected data in the form of a report<\/li>\n<li><strong><a href=\"https:\/\/azure.microsoft.com\/services\/monitor\/\">Azure Application Insights<\/a><\/strong> &#8211; to monitor the solution and discover issues<\/li>\n<\/ol>\n<h3>Why we chose these services<\/h3>\n<p>Before we discuss the solution implementation in detail, it is worth explaining why we decided to use the services listed above.<\/p>\n<h4>Azure Storage Account<\/h4>\n<p>Azure Storage Account (Blob Storage in this case) was a natural choice for storing source documents. Azure Blob Storage is a cost-efficient and secure service for storing different types of documents. In our solution, we used not only text documents, but also audio and video.<\/p>\n<h4>Azure Durable Functions<\/h4>\n<p>Initially, we thought about using an Azure Web App to host an API responsible for orchestration. After some discussion, we decided to go with Azure Durable Functions. The reasons were simple. First, we wanted to run the analysis flow only when a new file is uploaded to Azure Blob Storage, and Azure Functions are ideal for use as event handlers. The second important reason was cost: with Azure Functions, the first million executions are free.<\/p>\n<h4>Azure Cognitive Services Text Analytics<\/h4>\n<p>There are many Azure Cognitive Services available. We needed a service to analyze the content of text documents and provide information about sentiment. It is not easy to write your own processor to detect emotions, for instance. We wanted to analyze the conversation transcript and discover insights related to customer satisfaction.<\/p>\n<h4>Azure Cognitive Services Form Recognizer<\/h4>\n<p>Form Recognizer is a great service that provides an easy way to extract text, key\/value pairs, and tables from documents, forms, receipts, and business cards.
Initially, we wanted to use the Azure Computer Vision API to scan documents with OCR, but in the end we went with Form Recognizer. The reason was simple: in the future, we also plan to scan customer satisfaction survey forms, and with Form Recognizer it will be much easier to extract the content and validate specific values.<\/p>\n<h4>Azure Video Indexer<\/h4>\n<p>When it comes to video analysis, Azure Video Indexer is powerful. With this service, we were able to process audio and video files and extract information about customer satisfaction using the sentiment analysis feature. Of course, this is only a small part of Video Indexer&#8217;s capabilities. It can also create translations, automatically detect the spoken language, and detect the faces of people in the video.<\/p>\n<h4>Azure Cosmos DB<\/h4>\n<p>There are a few different databases available in the Azure cloud. The most popular, Azure SQL, was our first choice when designing the architecture of the system: clear structure, easy and secure access. However, there was one thing we realized later: our data model was not fixed. We planned to extend it in the future and wanted to remain flexible with the data schema. This is why we decided to switch to Azure Cosmos DB. As a non-relational database, it was a perfect match. It also provides geo-replication and an easy way to manage the data using the Azure Cosmos DB SDK.<\/p>\n<h4>Power BI<\/h4>\n<p>To visualize the results, we used Power BI. With this easy-to-use tool, we were able to create dashboards with information about customer satisfaction.<\/p>\n<h4>Azure Application Insights<\/h4>\n<p>Monitoring the solution and resolving issues are very important. We wanted to be sure that if an issue occurred, we would be able to identify and resolve it quickly.
This is why we integrated Azure Application Insights in our solution.<\/p>\n<h3>Data flow<\/h3>\n<p>The diagram below presents the data flow in our solution:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2020\/10\/call-center-talks-analysis-3.png\" alt=\"call-center-talks-analysis-3.png\" \/><\/p>\n<p>Now let&#8217;s talk about some implementation details.<\/p>\n<h2>Azure Cosmos DB for .NET in action<\/h2>\n<p>Once text, audio, and video files are analyzed by the different Azure cloud services, we store the analysis results in Azure Cosmos DB. This allows us to easily connect to our data from Power BI and to extend the data model in the future. The code snippet below shows how we implemented a service class that uses a <code>CosmosClient<\/code> instance from the new <a href=\"https:\/\/www.nuget.org\/packages\/Azure.Cosmos\">Azure Cosmos DB client library for .NET<\/a>:<\/p>\n<pre><code class=\"csharp\">    public sealed class CosmosDbDataService&lt;T&gt; : IDataService&lt;T&gt; where T : class, IEntity\n    {\n        private readonly ICosmosDbDataServiceConfiguration _dataServiceConfiguration;\n        private readonly CosmosClient _client;\n\n        \/\/Shortened for brevity\n\n        private CosmosContainer GetContainer()\n        {\n            var database = _client.GetDatabase(_dataServiceConfiguration.DatabaseName);\n            var container = database.GetContainer(_dataServiceConfiguration.ContainerName);\n            return container;\n        }\n\n        public async Task&lt;T&gt; AddAsync(T newEntity)\n        {\n            try\n            {\n                CosmosContainer container = GetContainer();\n                ItemResponse&lt;T&gt; createResponse = await container.CreateItemAsync(newEntity);\n                return createResponse.Value;\n            }\n            catch (CosmosException ex)\n            {\n                _log.LogError($\"New entity with ID: {newEntity.Id} was not added successfully - error details: {ex.Message}\");\n                \/\/Rethrow unless the service reported not found (HTTP 404)\n                if (ex.Status != 404)\n                {\n                    throw;\n                }\n                return null;\n            }\n        }\n    }\n<\/code><\/pre>\n<p>To follow <a href=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/lifetime-management-and-thread-safety-guarantees-of-azure-sdk-net-clients\/\">best practices related to lifetime management<\/a>, the <em>CosmosClient<\/em> type is registered as a singleton in the IoC container.<\/p>\n<h2>Azure Storage Blobs for .NET SDK in action<\/h2>\n<p>As mentioned at the beginning of the post, we used the Azure Media Services Video Indexer to analyze audio and video file content. To make it possible for the Video Indexer to access files stored in Azure Storage, we had to add a Shared Access Signature (SAS) token to the URL that points to the blob. We created our own <code>StorageService<\/code> using a <code>BlobServiceClient<\/code> instance from the new <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-net\/blob\/master\/sdk\/storage\/Azure.Storage.Blobs\/README.md\">Azure Storage Blobs .NET client library<\/a>.
Please note that the service also exposes a <code>GenerateSasTokenForContainer<\/code> method (omitted below for brevity) that is responsible for generating SAS tokens:<\/p>\n<pre><code class=\"csharp\">    public class StorageService : IStorageService\n    {\n        private readonly IStorageServiceConfiguration _storageServiceConfiguration;\n        private readonly BlobServiceClient _blobServiceClient;\n\n        \/\/Shortened for brevity\n\n        public async Task DownloadBlobIfExistsAsync(Stream stream, string blobName)\n        {\n            try\n            {\n                var container = await GetBlobContainer();\n                var blockBlob = container.GetBlobClient(blobName);\n\n                await blockBlob.DownloadToAsync(stream);\n            }\n            catch (RequestFailedException ex)\n            {\n                _log.LogError($\"Cannot download document {blobName} - error details: {ex.Message}\");\n                \/\/Rethrow unless the blob was simply not found (HTTP 404)\n                if (ex.Status != 404)\n                {\n                    throw;\n                }\n            }\n        }\n    }\n<\/code><\/pre>\n<p>As I mentioned above when talking about lifetime management, we followed best practices and registered our <code>BlobServiceClient<\/code> instance as a singleton in the IoC container.<\/p>\n<h2>Azure AI Text Analytics and Form Recognizer .NET SDK in action<\/h2>\n<p>Having the chat history in the form of text files, we used Azure Cognitive Services Text Analytics to analyze the content and Form Recognizer to apply OCR scanning to the documents and extract the text content. One of the great features of Text Analytics is the ability to analyze sentiment. After scanning the chat conversation, we can get sentiment analysis results that will be displayed on our Power BI dashboard.
To process text files, we created the <em>TextFileProcessingService<\/em> class, which uses a <em>TextAnalyticsClient<\/em> instance from the new <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-net\/blob\/master\/sdk\/textanalytics\/Azure.AI.TextAnalytics\/README.md\">Text Analytics .NET client library<\/a>:<\/p>\n<pre><code class=\"csharp\">    public class TextFileProcessingService : ITextFileProcessingService\n    {\n        private readonly TextAnalyticsClient _textAnalyticsClient;\n        private readonly IStorageService _storageService;\n        private readonly IOcrScannerService _ocrScannerService;\n\n        \/\/Shortened for brevity\n\n        public async Task&lt;FileAnalysisResult&gt; AnalyzeFileContentAsync(InputFileData inputFileData)\n        {\n            if (inputFileData.FileContentType == FileContentType.PDF)\n            {\n                \/\/Append a SAS token so the OCR service can access the blob\n                var sasToken = _storageService.GenerateSasTokenForContainer();\n                inputFileData.FilePath = $\"{inputFileData.FilePath}?{sasToken}\";\n\n                var textFromTheInputDocument = await _ocrScannerService.ScanDocumentAndGetResultsAsync(inputFileData.FilePath);\n                try\n                {\n                    DocumentSentiment sentimentAnalysisResult = await _textAnalyticsClient.AnalyzeSentimentAsync(textFromTheInputDocument);\n                    var fileAnalysisResult = new FileAnalysisResult();\n                    fileAnalysisResult.SentimentValues.Add(sentimentAnalysisResult.Sentiment.ToString());\n                    return fileAnalysisResult;\n                }\n                catch (RequestFailedException ex)\n                {\n                    _log.LogError(ex, $\"An error occurred when analyzing sentiment with the {nameof(TextAnalyticsClient)} service\");\n                }\n            }\n\n            throw new ArgumentException(\"Input file should be either a TXT or a PDF file.\");\n        }\n    }\n<\/code><\/pre>\n<p>Please note that, again, we use an instance of
the <em>StorageService<\/em> class to get a Shared Access Signature and make it possible for the Text Analytics API to access the file stored in Blob Storage. Again, following lifetime management best practices, we registered the <em>TextAnalyticsClient<\/em> instance as a singleton in the IoC container.<\/p>\n<p>The Form Recognizer SDK was used to apply OCR scanning to the PDF documents and extract the text content:<\/p>\n<pre><code class=\"csharp\">    public class OcrScannerService : IOcrScannerService\n    {\n        private readonly FormRecognizerClient _formRecognizerClient;\n        private readonly ILogger&lt;OcrScannerService&gt; _log;\n\n        \/\/Shortened for brevity\n\n        public async Task&lt;string&gt; ScanDocumentAndGetResultsAsync(string documentUrl)\n        {\n            \/\/Start the content recognition operation and wait for it to complete\n            RecognizeContentOperation operation = await _formRecognizerClient.StartRecognizeContentFromUriAsync(new Uri(documentUrl));\n            FormPageCollection formPages = await operation.WaitForCompletionAsync();\n            StringBuilder sb = new StringBuilder();\n\n            foreach (FormPage page in formPages)\n            {\n                foreach (FormLine line in page.Lines)\n                {\n                    sb.AppendLine(line.Text);\n                }\n            }\n\n            return sb.ToString();\n        }\n    }\n<\/code><\/pre>\n<p>We followed best practices and registered the <em>FormRecognizerClient<\/em> instance as a singleton in the IoC container.<\/p>\n<h2>Reporting collected data<\/h2>\n<p>Below is a fragment of the Power BI dashboard with the collected data:<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-content\/uploads\/sites\/58\/2020\/10\/call-center-talks-analysis-2.png\" alt=\"call-center-talks-analysis-2.png\" \/><\/p>\n<h2>Full project available on GitHub<\/h2>\n<p>The project described in this article is available on my GitHub under <a href=\"https:\/\/github.com\/Daniel-Krzyczkowski\/MicrosoftAzureAI\/tree\/feature\/add-form-recognizer-azure-sdk\/src\/call-center-talks-analysis\">this link<\/a>. If you would like to see the implementation details or play with the project yourself, feel free to fork the repository. If you want to read more about the Text Analytics SDK, please read the <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-net\/tree\/master\/sdk\/textanalytics\/Azure.AI.TextAnalytics\">Text Analytics Service documentation<\/a>, and if you are interested in reading more about lifetime management, please refer to the <a href=\"https:\/\/devblogs.microsoft.com\/azure-sdk\/lifetime-management-and-thread-safety-guarantees-of-azure-sdk-net-clients\/\">Lifetime management for Azure SDK .NET clients<\/a> blog post.<\/p>\n<p><!-- FOOTER: DO NOT EDIT OR REMOVE --><\/p>\n<h2>Azure SDK Blog Contributions<\/h2>\n<p>Thank you for reading this Azure SDK blog post! We hope that you learned something new and welcome you to share this post. We are open to Azure SDK blog contributions.
Please contact us at <a href=\"&#109;&#x61;&#105;&#x6c;&#116;&#x6f;&#58;&#x61;z&#115;&#x64;&#107;&#x62;&#108;&#x6f;&#103;&#x40;&#109;&#105;&#x63;&#114;&#x6f;&#115;&#x6f;&#102;&#x74;&#46;&#x63;o&#109;\">&#x61;z&#115;&#x64;&#107;&#x62;&#108;&#x6f;&#103;&#x40;&#109;&#105;&#x63;&#114;&#x6f;&#115;&#x6f;&#102;&#x74;&#46;&#x63;o&#109;<\/a> with your topic and we&#8217;ll get you setup as a guest blogger.<\/p>\n<h2>Azure SDK Links<\/h2>\n<ul>\n<li>Azure SDK Website: <a href=\"https:\/\/aka.ms\/azsdk\">aka.ms\/azsdk<\/a><\/li>\n<li>Azure SDK Intro (3 minute video): <a href=\"https:\/\/aka.ms\/azsdk\/intro\">aka.ms\/azsdk\/intro<\/a><\/li>\n<li>Azure SDK Intro Deck (PowerPoint deck): <a href=\"https:\/\/aka.ms\/azsdk\/intro\/deck\">aka.ms\/azsdk\/intro\/deck<\/a><\/li>\n<li>Azure SDK Releases: <a href=\"https:\/\/aka.ms\/azsdk\/releases\">aka.ms\/azsdk\/releases<\/a><\/li>\n<li>Azure SDK Blog: <a href=\"https:\/\/aka.ms\/azsdk\/blog\">aka.ms\/azsdk\/blog<\/a><\/li>\n<li>Azure SDK Twitter: <a href=\"https:\/\/twitter.com\/AzureSDK\">twitter.com\/AzureSDK<\/a><\/li>\n<li>Azure SDK Design Guidelines: <a href=\"https:\/\/aka.ms\/azsdk\/guide\">aka.ms\/azsdk\/guide<\/a><\/li>\n<li>Azure SDKs &amp; Tools: <a href=\"https:\/\/azure.microsoft.com\/downloads\">azure.microsoft.com\/downloads<\/a><\/li>\n<li>Azure SDK Central Repository: <a href=\"https:\/\/github.com\/azure\/azure-sdk#azure-sdk\">github.com\/azure\/azure-sdk<\/a><\/li>\n<li>Azure SDK for .NET: <a href=\"https:\/\/github.com\/azure\/azure-sdk-for-net\">github.com\/azure\/azure-sdk-for-net<\/a><\/li>\n<li>Azure SDK for Java: <a href=\"https:\/\/github.com\/azure\/azure-sdk-for-java\">github.com\/azure\/azure-sdk-for-java<\/a><\/li>\n<li>Azure SDK for Python: <a href=\"https:\/\/github.com\/azure\/azure-sdk-for-python\">github.com\/azure\/azure-sdk-for-python<\/a><\/li>\n<li>Azure SDK for JavaScript\/TypeScript: <a 
href=\"https:\/\/github.com\/azure\/azure-sdk-for-js\">github.com\/azure\/azure-sdk-for-js<\/a><\/li>\n<li>Azure SDK for Android: <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-android\">github.com\/Azure\/azure-sdk-for-android<\/a><\/li>\n<li>Azure SDK for iOS: <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-ios\">github.com\/Azure\/azure-sdk-for-ios<\/a><\/li>\n<li>Azure SDK for Go: <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-go\">github.com\/Azure\/azure-sdk-for-go<\/a><\/li>\n<li>Azure SDK for C: <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-c\">github.com\/Azure\/azure-sdk-for-c<\/a><\/li>\n<li>Azure SDK for C++: <a href=\"https:\/\/github.com\/Azure\/azure-sdk-for-cpp\">github.com\/Azure\/azure-sdk-for-cpp<\/a><\/li>\n<\/ul>\n<p><!-- FOOTER: DO NOT EDIT OR REMOVE --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In this post, you&#8217;ll find how the new Azure SDK for .NET was used in a real-world call center conversations analysis project.<\/p>\n","protected":false},"author":36500,"featured_media":745,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[773,705,751],"class_list":["post-742","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-azure-sdk","tag-cosmosdb","tag-sdk","tag-textanalytics"],"acf":[],"blog_post_summary":"<p>In this post, you&#8217;ll find how the new Azure SDK for .NET was used in a real-world call center conversations analysis 
project.<\/p>\n","_links":{"self":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/posts\/742","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/users\/36500"}],"replies":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/comments?post=742"}],"version-history":[{"count":0,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/posts\/742\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/media\/745"}],"wp:attachment":[{"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/media?parent=742"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/categories?post=742"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/devblogs.microsoft.com\/azure-sdk\/wp-json\/wp\/v2\/tags?post=742"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}