March 11th, 2020

In-Memory OLTP Best Practices – Part 1

Developer Support
Cloud Solution Architects

In this multi-part series, App Dev Manager Jon Guerin and Premier Consultant Daniel Taylor (@dbabulldog) lay out some best practices around setting up and using In-Memory OLTP.


Recently a co-worker and I were approached by a team requesting assistance with In-Memory OLTP for SQL Server 2016. The team was interested in best practices around setting up and using In-Memory OLTP. I started my journey of searching the archives of the internet, my notes, and training materials, and surprisingly no single source pulled this information together. As adoption of the In-Memory OLTP solution continues to increase, we thought it would be fitting to put together a best practices cheat sheet to help drive conversations with our clients.

What we are presenting here is a living document and one we believe will grow over time with additions to the solution and input from the community. We will be breaking up this post into a two-part series.

The best practices / cheat sheet has been broken up into the following areas:

  • Part one of the series (focus of this post)
    • Understanding the requirements for In-Memory OLTP features
    • Usage Scenarios
    • Planning
    • Schema Durability
    • Natively compiled Stored Procedures
  • Part two of the series (focus of the next post)
    • Indexing
    • Memory Management
    • Performance Considerations
    • Backup and Recovery
    • Monitoring

To set expectations up front: this best practices / cheat sheet is by no means an exhaustive treatment of In-Memory OLTP. What is covered are the most common questions we have received, to give you a better understanding of where to start when working with the In-Memory OLTP solution. An additional goal was not to repeat verbatim what the documentation already covers, but to highlight some of the top considerations and then provide links for you to dive deeper into each topic.

Understanding the requirements for In-Memory OLTP features:

  • Follow the documented hardware and software requirements for installing SQL Server.
  • Understand the additional requirements for memory-optimized tables (a quick setup sketch follows this list)
    • Available on 64-bit versions of SQL Server
      • SQL Server 2016 SP1 (or later), any edition
      • SQL Server 2016 pre-SP1 (Enterprise or Developer)
      • SQL Server 2014 (Enterprise or Developer)
    • Enough memory to hold
      • data in memory-optimized tables and indexes
      • additional memory to support the online workload
      • Ensure enough memory resources are available when running SQL Server in a VM
        • Memory reservation
        • Dynamic Memory, Minimum RAM
    • Enough free disk space (2 to 4 times the size of durable memory-optimized tables)
    • A processor that supports the cmpxchg16b instruction
    • See Requirements for using memory-optimized tables for full details
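
As a quick way to check prerequisites and prepare a database, here is a minimal T-SQL sketch. SERVERPROPERTY('IsXTPSupported') reports whether the instance supports In-Memory OLTP, and a memory-optimized filegroup with a container must be added to a database before memory-optimized tables can be created. The database name and file path below are placeholders for illustration only.

    -- Returns 1 if this instance supports In-Memory OLTP
    SELECT SERVERPROPERTY('IsXTPSupported') AS IsXTPSupported;

    -- Add a memory-optimized filegroup and container to a database
    -- (database name and path are placeholders)
    ALTER DATABASE SalesDemo
        ADD FILEGROUP SalesDemo_mod CONTAINS MEMORY_OPTIMIZED_DATA;

    ALTER DATABASE SalesDemo
        ADD FILE (NAME = 'SalesDemo_mod_container1',
                  FILENAME = 'C:\Data\SalesDemo_mod_container1')
        TO FILEGROUP SalesDemo_mod;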

Usage Scenarios:

Determine what issue you are solving. Does it fit one of the examples provided below? If yes, the recommended next step is to move to the planning phase.

  • High-throughput and low-latency transaction processing
    • Performance-critical transactions use memory-optimized tables
    • Move as much business logic as possible into natively compiled stored procedures for best performance
    • Financial and gaming case studies
  • Data ingestion, including IoT
  • Caching and session state
    • Very attractive for session state (ASP.NET applications) and caching
    • Gaming site and mid-tier use cases for in-memory OLTP
  • TempDB Object replacement
  • ETL
    • Use non-durable memory-optimized tables for data staging (see the staging table sketch after this list)
    • Use natively compiled stored procedures to do transformations on your staging tables
      • Natively compiled stored procedures benefit most for processes requiring more clock cycles
      • Natively compiled stored procedures have a reduced T-SQL surface area
    • A note of caution: loading data into memory-optimized tables is single-threaded, so you will need to get creative with your load process, SSIS packages, or BCP files
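
To illustrate the ETL staging scenario, here is a minimal sketch of a non-durable (SCHEMA_ONLY) memory-optimized staging table. All object names are hypothetical, and the database is assumed to already have a memory-optimized filegroup (see the setup sketch above).

    -- Non-durable staging table: the schema survives restarts, the data does not
    CREATE TABLE dbo.StageOrders
    (
        OrderID     INT           NOT NULL,
        CustomerID  INT           NOT NULL,
        OrderAmount DECIMAL(18,2) NOT NULL,
        LoadedAt    DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),

        -- Memory-optimized tables require at least one index
        INDEX ix_StageOrders_OrderID NONCLUSTERED HASH (OrderID)
            WITH (BUCKET_COUNT = 1048576)
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);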

Planning:

Utilize the transaction performance analysis reports to determine whether a table and/or stored procedure should be ported to In-Memory OLTP.

  • Run the Transaction Performance Analysis report against a representative production workload to:
    • Analyze the workload and determine hot spots where In-Memory OLTP could potentially help
    • Help plan and execute the migration to In-Memory OLTP by identifying incompatibilities and potential impact
  • Review the Transaction Performance Analysis report results
    • Tables
      • Scan statistics
        • Look for a high percent of total accesses. A higher percentage indicates heavier table utilization, which could yield greater benefit from In-Memory OLTP.
        • Examining lookup and range scan statistics reveals possible gains from converting to In-Memory OLTP due to its optimistic concurrency design.
      • Contention Statistics
        • Examine percent of total waits, latch statistics, and lock statistics (a simplified query sketch follows this list)
        • Tables with high percentage in any of these categories could yield significant performance gains by migrating to In-Memory OLTP tables.
      • Migration Difficulties
        • Provides a difficulty rating
        • Compare these ratings to the objects identified in the scan and contention statistics to help rank which tables should be ported to In-Memory.
      • Should Table be Ported
    • Stored Procedures
    • Once candidate tables and stored procedures are identified from the reports, generate an In-Memory OLTP migration checklist.
  • Additional In-Memory Planning considerations
  • For those who prefer the T-SQL approach, Ned Otter, a strong advocate for In-Memory OLTP, has created a Migrating Table to In-Memory OLTP script
  • When considering In-Memory OLTP, examine your workload and identify how it could benefit from memory-optimized temp tables, table variables, or functions, and not just your core tables and procedures.
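
If you want a quick first pass before running the reports, the following is a simplified, illustrative query (not the report's exact logic) that surfaces tables in the current database with heavy latch and lock waits and high scan activity, the same signals the scan and contention statistics highlight.

    -- Illustrative only: rank user tables by latch/lock waits and scan counts
    SELECT TOP (20)
           OBJECT_SCHEMA_NAME(ios.object_id) AS schema_name,
           OBJECT_NAME(ios.object_id)        AS table_name,
           SUM(ios.range_scan_count)         AS range_scans,
           SUM(ios.singleton_lookup_count)   AS singleton_lookups,
           SUM(ios.page_latch_wait_count)    AS page_latch_waits,
           SUM(ios.row_lock_wait_in_ms)      AS row_lock_wait_ms
    FROM sys.dm_db_index_operational_stats(DB_ID(), NULL, NULL, NULL) AS ios
    JOIN sys.tables AS t
        ON t.object_id = ios.object_id
    GROUP BY ios.object_id
    ORDER BY SUM(ios.page_latch_wait_count) DESC;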

Schema Durability:

  • Schema Only
    • Durability SCHEMA_ONLY
    • Deploy if you want to maintain only the schema of the in-memory table
    • Key to remember: this data is not maintained after a service restart or if the database goes offline
    • Use case examples
      • Get around the use of temp tables
      • Store temporary data for ETL processes / staging tables for data warehouses
      • Logging tables
  • Schema and Data
    • Durability SCHEMA_AND_DATA
    • A primary key is required for a durable memory-optimized table (see the table sketch after this list)
    • Deploy if your requirement is to maintain the schema and data of the in-memory table
    • Note this will impact/increase recovery time of the database (recovery recommendations found in backup and recovery section)
    • Use case examples
      • Tables that require high throughput and low latency
  • Starting with SQL Server 2016, automatic update of statistics is supported for memory-optimized tables
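
As a minimal sketch of the durable option, the following table (hypothetical names) declares the required primary key and keeps both schema and data; contrast it with the SCHEMA_ONLY staging table shown earlier.

    -- Durable memory-optimized table: schema and data are persisted
    CREATE TABLE dbo.Orders
    (
        OrderID     INT           NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 2097152),
        CustomerID  INT           NOT NULL,
        OrderAmount DECIMAL(18,2) NOT NULL,
        OrderDate   DATETIME2     NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);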

Natively compiled Stored Procedures:

  • Use for business-critical, frequently executed parts of the application (a minimal sketch follows this list)
  • Compiled into machine code and stored as a DLL
  • All parameters passed are considered to have unknown values (uses statistical data for optimization)
    • Parameter sniffing is not used
    • Natively compiled stored procedures do support the OPTIMIZE FOR hint
    • The recommendation is to start with the default behavior of optimizing for unknown
  • Natively compiled stored procedures are not recompiled when statistics change
  • The table should contain a representative set of data and statistics before the procedure is created. See the guidelines for deploying tables and procedures.
  • In general, one can expect better performance from natively compiled stored procedures
    • Will shine for high concurrent workloads
    • As the number of rows increase
    • Use of the following should exhibit better performance
      • Aggregation, Nested-loop Joins
      • Multi-statement selects or DML operations
      • Complex expressions, procedural logic
  • See the full list of supported constructs in natively compiled stored procedures
  • See the list of constructs not supported by In-Memory OLTP
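
To show the overall shape, here is a minimal sketch of a natively compiled stored procedure against the hypothetical dbo.Orders table above. Natively compiled procedures require SCHEMABINDING and an ATOMIC block that specifies the transaction isolation level and language.

    CREATE PROCEDURE dbo.usp_InsertOrder
        @OrderID     INT,
        @CustomerID  INT,
        @OrderAmount DECIMAL(18,2)
    WITH NATIVE_COMPILATION, SCHEMABINDING
    AS
    BEGIN ATOMIC WITH
        (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')

        -- Compiled to machine code at CREATE time and stored as a DLL
        INSERT INTO dbo.Orders (OrderID, CustomerID, OrderAmount, OrderDate)
        VALUES (@OrderID, @CustomerID, @OrderAmount, SYSUTCDATETIME());
    END;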

If you find value in this post and feel there are items that could be added or modified, please let us know. Microsoft embraces a growth mindset, and we are always learning and open to feedback. We want to send a special thanks to In-Memory OLTP community advocate @nedotter for reviewing this information.

Please come back for part 2 of this best practices / cheat sheet where we will provide thoughts and links around the following areas:

  • Indexing
  • Memory Management
  • Performance Considerations
  • Backup and Recovery
  • Monitoring

Author

Developer Support
Cloud Solution Architects

Microsoft Developer Support helps software developers rapidly build and deploy quality applications for Microsoft platforms.
