Perform DBCC CHECKDB Data Integrity checks on Log Shipping Secondary server

As a professional who maintains and manages databases for clients, I am well aware that data integrity in relational databases is absolutely crucial.

As DBAs, the data we maintain and support is, in many cases, the organization's most important asset.

Data corruption is a topic in itself that I will not cover in this short post, but to emphasize its importance I will point out that there are systems that, when faced with a data corruption issue, must be shut down until the issue is resolved, whatever that may mean for the business.

One practice I find useful is to use a Log Shipping Secondary server to carry out the DBCC data integrity checks, offloading the intensive IO operations from the production server. It is true that this can also be done with Always On, but I prefer to keep things simple using Log Shipping, which is available for all versions/editions, and no license is required as long as business applications do not read that copy of the data.

One of the reasons I like Log Shipping is that its simplicity is its strength: it is so simple that there is almost nothing that can go wrong. It is almost set-and-forget.

To accomplish this task I use a PowerShell script executed from a SQL Server Agent job with a job step type of "Operating System (CmdExec)". I typically place this job on the Secondary server itself, but it can be executed on any server with access to both the Primary and Secondary servers.

The job step command looks like this: powershell.exe “C:\Scripts\Powershell\dbcc_log_shipping\dbcc_log_shipping.ps1”
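If you prefer to be explicit about how the script is invoked, the same step can be written with the -File parameter of powershell.exe. The path is the one from the step above; the -NoProfile and -ExecutionPolicy Bypass switches are optional additions I am suggesting here, not part of the original job definition:

```
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Powershell\dbcc_log_shipping\dbcc_log_shipping.ps1"
```

Using -File also makes the Agent job step report a failure exit code if the script terminates with one, which the bare-string form does not always do.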

You can get the PowerShell script I use from my GitHub dbcc_log_shipping project.

A few points on this PowerShell script

  • Note that the PowerShell script reads its values from a configuration file named config.json and assigns them to local variables, instead of hard coding those values in the script itself (I use a script rather than a function, so input parameters are not an option). This is a method I use to improve the portability of my scripts: you edit the configuration file rather than the script. You can edit the config file to meet your needs, or simply assign those values directly in the script and bypass the config file.
  • Another point to mention is that I use Apache log4net as the logger for this script's log file. For more details on using log4net with PowerShell see this post: https://sqlserverutilities.com/using-log4net-with-powershell
  • To get the log4net.dll file referenced by the PowerShell script, you can either download the source code from the Apache log4net site and compile it yourself, or download the log4net.dll file that I am using on my Windows 10 machine with PowerShell version 5: version 2.0.8.0, compiled for .NET 4.5.
  • The script breaks down, at a high level, as follows:
    • Get the list of databases to process
    • For each database perform the following:
      • UPDATE msdb.dbo.log_shipping_secondary_databases SET restore_mode = 1, disconnect_users = 0, restore_delay = 0. This ensures that the upcoming transaction log restore step can be performed
      • Start the LSBackup job on the Primary server
      • Start the LSCopy job on the Secondary server
      • Start the LSRestore job on the Secondary server
      • Insert a "Start" message into the log table DBA.dbo.dbcc_history
      • Execute the stored procedure sp_CheckDataIntegrity, which issues the DBCC command to check data integrity
      • Insert an "End" message into the log table DBA.dbo.dbcc_history
      • Revert the earlier update to msdb.dbo.log_shipping_secondary_databases
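The per-database loop above can be sketched roughly as follows. This is a simplified outline and not the actual script from the GitHub project: the server names, database list, dbcc_history columns, sp_CheckDataIntegrity parameter, and the LSBackup_/LSCopy_/LSRestore_ job-name pattern are all illustrative assumptions, Invoke-Sqlcmd (from the SqlServer module) is assumed to be available, and the real script would also need to wait for each asynchronous sp_start_job call to complete before moving on:

```powershell
# Assumed values; the real script reads these from config.json.
$primary   = 'PRIMARY01'
$secondary = 'SECONDARY01'
$databases = @('SalesDB', 'HRDB')   # hypothetical database names

foreach ($db in $databases) {
  # Switch the secondary copy to STANDBY so it is readable for DBCC.
  Invoke-Sqlcmd -ServerInstance $secondary -Database msdb -Query "
    UPDATE dbo.log_shipping_secondary_databases
    SET restore_mode = 1, disconnect_users = 0, restore_delay = 0
    WHERE secondary_database = N'$db';"

  # Run the log shipping jobs in order: backup on the Primary, copy and restore on the Secondary.
  # NOTE: sp_start_job returns immediately; polling for job completion is omitted here for brevity.
  Invoke-Sqlcmd -ServerInstance $primary   -Database msdb -Query "EXEC dbo.sp_start_job N'LSBackup_$db';"
  Invoke-Sqlcmd -ServerInstance $secondary -Database msdb -Query "EXEC dbo.sp_start_job N'LSCopy_$db';"
  Invoke-Sqlcmd -ServerInstance $secondary -Database msdb -Query "EXEC dbo.sp_start_job N'LSRestore_$db';"

  # Log start, run the integrity check, log end.
  Invoke-Sqlcmd -ServerInstance $secondary -Database DBA -Query "INSERT dbo.dbcc_history(database_name, message) VALUES (N'$db', N'Start');"
  Invoke-Sqlcmd -ServerInstance $secondary -Database DBA -Query "EXEC dbo.sp_CheckDataIntegrity @database = N'$db';" -QueryTimeout 0
  Invoke-Sqlcmd -ServerInstance $secondary -Database DBA -Query "INSERT dbo.dbcc_history(database_name, message) VALUES (N'$db', N'End');"

  # Revert the restore settings (values assumed; the real script restores the original values it captured).
  Invoke-Sqlcmd -ServerInstance $secondary -Database msdb -Query "
    UPDATE dbo.log_shipping_secondary_databases
    SET restore_mode = 0, disconnect_users = 1
    WHERE secondary_database = N'$db';"
}
```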

A few points on this approach

  1. Using a Log Shipping Secondary server to carry out DBCC data integrity commands is a great way to guard yourself against data corruption.
  2. The Secondary server can be a “small” machine in terms of hardware/resources
  3. No license is required for the Secondary server sql instance
  4. Using this method, you can carry out data integrity checks more often than the weekly schedule that is common in the industry. By the time you find out a system suffers from data corruption, you have already lost precious time, possibly making the issue worse.
Yaniv Etrogi is an SQL Server consultant. He loves SQL Server and he is passionate about Performance Tuning, Automation and Monitoring.
