In-Memory Optimization in SQL Server: A Simple Experiment

In one of my previous articles I briefly introduced In-Memory Optimization in SQL Server 2014. In this article we will conduct a simple experiment utilizing this new technology and discuss the benefits. The experiment will be organized in the following steps:

  1. Create the In-Memory processing enabled database
  2. Create the tables (disk-based, memory-enabled)
  3. Populate the tables with sample data
  4. Prepare the processing code for three scenarios (execution modes):
    • Disk-based processing
    • In-Memory optimization
    • In-Memory optimization with natively compiled stored procedure
  5. Run all three execution modes and measure execution times
  6. Discuss findings

Enough words, let’s proceed with some T-SQL code!

1. Create the In-Memory processing enabled database

--Creates a memory-enabled database by adding a memory-optimized filegroup with a specified directory.
--Note: This assumes that directory c:\tmp exists (the subdirectory is created by SQL Server).
CREATE DATABASE InMemDB;
GO
ALTER DATABASE InMemDB
ADD FILEGROUP [inMemOLTP_fg] CONTAINS MEMORY_OPTIMIZED_DATA;
GO
ALTER DATABASE InMemDB
ADD FILE ( name = [inMemOLTP_fg_dir],
   filename= 'c:\tmp\InMemOLTPDir')
   TO FILEGROUP [inMemOLTP_fg];
GO
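If you want to verify that the memory-optimized filegroup and its container were created, a minimal sketch using the system catalog views (type 'FX' denotes a memory-optimized data filegroup) could look like this:

--Verify the memory-optimized filegroup and its container
USE InMemDB;
GO
SELECT fg.name AS filegroup_name, fg.type_desc, f.physical_name
FROM sys.filegroups fg
LEFT JOIN sys.database_files f ON f.data_space_id = fg.data_space_id
WHERE fg.type = 'FX';
GO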
2. Create the tables (disk-based, memory-enabled)
--Create sample tables representing a scenario involving 200000 Products
USE InMemDB;
GO

--Create traditional table (disk-based)
CREATE TABLE tblDiskBased(
id INT PRIMARY KEY,
code VARCHAR(50),
descr VARCHAR(250),
price float
);
GO

--Create a durable (data will be persisted) memory-optimized table
--Alternative choice: SCHEMA_ONLY = Data will be lost if the server turns off unexpectedly
CREATE TABLE tblMemOptimized(
id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT=400000),
code VARCHAR(50),
descr VARCHAR(250),
price float
)WITH (MEMORY_OPTIMIZED=ON, DURABILITY = SCHEMA_AND_DATA);
GO


--Create a durable (data will be persisted) memory-optimized table
--Alternative choice: SCHEMA_ONLY = Data will be lost if the server turns off unexpectedly
--Note: This table will be used by the natively-compiled stored procedure
CREATE TABLE tblMemOptimized2(
id INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT=400000),
code VARCHAR(50),
descr VARCHAR(250),
price float
)WITH (MEMORY_OPTIMIZED=ON, DURABILITY = SCHEMA_AND_DATA);
GO
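At this point, an optional sanity check: the sys.tables catalog view exposes the is_memory_optimized and durability_desc columns (available since SQL Server 2014), so a minimal sketch to confirm the table setup could be:

--Confirm which tables are memory-optimized and their durability setting
SELECT name, is_memory_optimized, durability_desc
FROM sys.tables
ORDER BY name;
GO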
So far: We have created a memory-enabled database and the data structures. Now it's time to populate the tables with sample data.
3. Populate the tables with sample data
--Populate the tables with sample data (200K records)
--tblMemOptimized
DECLARE @i INT;
SET @i = 0;
WHILE (@i < 200000)
BEGIN
INSERT INTO tblMemOptimized
SELECT @i AS id, ('Code ' + CAST(@i AS varchar(6))) AS code, ('Description ' + CAST(@i AS varchar(6))) AS descr, (10 + (500-10)*RAND()) AS price;
SET @i = @i + 1;
END
GO

--Populate the tables with sample data (200K records)
--tblDiskBased
--tblMemOptimized2
INSERT INTO tblDiskBased
SELECT * FROM tblMemOptimized;
GO

INSERT INTO tblMemOptimized2
SELECT * FROM tblMemOptimized;
GO
All three tables now contain the same data (200K rows each). You can easily check it out!
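For example, a quick row-count check (a minimal sketch; all three queries should return 200000) could look like this:

--Quick consistency check: all three tables should report 200000 rows
SELECT 'tblDiskBased' AS tbl, COUNT(*) AS cnt FROM tblDiskBased
UNION ALL
SELECT 'tblMemOptimized', COUNT(*) FROM tblMemOptimized
UNION ALL
SELECT 'tblMemOptimized2', COUNT(*) FROM tblMemOptimized2;
GO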
4. Prepare the processing code for three scenarios (execution modes):
We will wrap each code block into a stored procedure for easier execution.
The scenario is to decrease each product’s price by 5%.
a. Disk-based processing
--Disk-based processing
CREATE PROCEDURE updateDiskBased as
BEGIN
UPDATE dbo.tblDiskBased
SET price=price-(price*0.05)
END
GO
b. In-Memory optimization
--Memory-Optimized
CREATE PROCEDURE updateMemoryOptimized as
BEGIN
UPDATE dbo.tblMemOptimized
SET price=price-(price*0.05)
END
GO
c. In-Memory optimization with natively compiled stored procedure
--Natively-Compiled Stored Procedure
CREATE PROCEDURE updateMemoryOptimizedNatively
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'English')
UPDATE dbo.tblMemOptimized2
SET price = price - (price * 0.05)
END
GO
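As an optional check, sys.sql_modules exposes the uses_native_compilation flag (available since SQL Server 2014), so a minimal sketch to confirm that only the third procedure was natively compiled could be:

--Confirm which of the three procedures is natively compiled
SELECT OBJECT_NAME(object_id) AS proc_name, uses_native_compilation
FROM sys.sql_modules
WHERE OBJECT_NAME(object_id) LIKE 'update%';
GO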

So far: We have created a memory-enabled database and the data structures, populated the tables with sample data (all three tables hold identical data for experiment-consistency purposes), and prepared the stored procedures that target each of the three tables and perform the processing, that is, the update of the 'price' column by decreasing it by 5%.
Now it’s time to run each stored procedure and measure the execution times.
But before executing the stored procedures, let's check a sample (TOP 5) from each table:
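The sampling queries can be as simple as the following (ORDER BY is included because, especially for the memory-optimized tables, no ordering is guaranteed otherwise):

--Sample the first 5 products (by id) from each table
SELECT TOP 5 * FROM tblDiskBased ORDER BY id;
SELECT TOP 5 * FROM tblMemOptimized ORDER BY id;
SELECT TOP 5 * FROM tblMemOptimized2 ORDER BY id;
GO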
Figure 1: Top 5 Records from each table before update.
5. Run all three execution modes and measure execution times
OK, now it's time to execute the three stored procedures, thereby updating all three tables, and measure the execution times.
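One way to capture the timings is SET STATISTICS TIME, which reports the CPU and elapsed time for each statement in the Messages tab; a minimal sketch (the figures below were measured on my test machine, so your numbers will differ):

--Measure CPU and elapsed time for each execution mode
SET STATISTICS TIME ON;
GO
EXEC updateDiskBased;
GO
EXEC updateMemoryOptimized;
GO
EXEC updateMemoryOptimizedNatively;
GO
SET STATISTICS TIME OFF;
GO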

Figure 2: Execution times for three modes.

Additionally, let’s check once more some records just to confirm that the output is the same in all three tables:

Figure 3: Top 5 Records from each table after update.

6. Discussion
First of all, from Figure 3 we can see that the output of the three update operations is exactly the same, which validates the correctness of the three different execution modes.
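Beyond eyeballing the TOP 5 samples, the equivalence can also be confirmed programmatically; a minimal sketch using EXCEPT (both queries should return an empty result set if the three tables hold identical data):

--Both queries return zero rows if the three tables hold identical data
SELECT id, code, descr, price FROM tblDiskBased
EXCEPT
SELECT id, code, descr, price FROM tblMemOptimized;

SELECT id, code, descr, price FROM tblMemOptimized
EXCEPT
SELECT id, code, descr, price FROM tblMemOptimized2;
GO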

As you can see from the screenshot with the execution times (Figure 2), the slowest execution was the one that involved the disk-based table (elapsed time: 1924 ms), which also had the highest CPU time (328 ms).

The fastest execution was the one against the memory-optimized table with the natively compiled stored procedure, where the elapsed time was just 329 ms and the CPU time 93 ms!

The memory-optimized execution, with 404 ms elapsed time and 250 ms CPU time, was faster than the disk-based one, but slower than the memory-optimized scenario combined with the natively compiled stored procedure.

In-Memory optimization in SQL Server is a great new technology which can significantly improve performance in many cases. Especially for certain workloads, such as staging processes in Data Warehouse systems, high data-insert rates (e.g. smart metering), and low-latency execution scenarios (e.g. investment trading), In-Memory optimization can boost performance. Of course, before applying it to Production systems, a proper analysis is required, along with testing.

During the last six months I have been working on developing a simulator for In-Memory optimization in SQL Server. I expect to make it publicly available by the end of September 2014, so make sure to check my official website often for updates!

