How to Restore a SQL Server database to a lower version

Recently, I was asked to restore a database from a SQL Server 2019 production environment to a SQL Server 2017 development environment. Both databases have the same collation, and both have their compatibility level set to SQL Server 2016.
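A quick way to confirm the collation and compatibility level on each instance is to query sys.databases; the database name below is a placeholder:

-- Run on both instances and compare the results.
-- Replace MyAppDb with the actual database name.
SELECT name, collation_name, compatibility_level
FROM sys.databases
WHERE name = N'MyAppDb';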

This can’t be done. At least, not using the traditional backup and restore process.
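For context, this is roughly what the traditional restore would look like; the backup path, database name, and file names below are placeholders. Running it on the 2017 instance fails, because SQL Server can restore backups taken on older versions but never ones taken on a newer version:

-- Fails on the 2017 instance: the backup was taken on SQL Server 2019.
RESTORE DATABASE [MyAppDb]
FROM DISK = N'C:\Temp\MyAppDb_prod.bak'
WITH MOVE N'MyAppDb' TO N'C:\SQLData\MyAppDb.mdf',
     MOVE N'MyAppDb_log' TO N'C:\SQLData\MyAppDb_log.ldf',
     REPLACE;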

However, it had to be done, because the application that queried this database had the server and database names hardcoded throughout its code.
Migrating the entire test server to 2019 was not an option at this time, because the 2017 instance also hosted databases for other applications. I don't doubt that their credentials were hardcoded too -_-

Doing a cold backup was also not an option because management did not want to detach the database from the production instance.

One option is to export the database to a .bacpac file:
Right-click the database => Tasks => Export Data-tier Application
In this case, however, that resulted in an error.

Another option is to take a logical export (a script of the schema and data) of the entire database:
Right-click the database => Tasks => Generate Scripts (scripting both schema and data)

This method is fine for smaller databases. The logical export of an 800 MB database turned out to be 2.3 GB, so this approach may not be feasible for a database over 50 GB in size.

Opening this file in SSMS produced an error, because SSMS cannot open files larger than 2 GB and our script is 2.3 GB.

To open the script in SSMS, we must first split it into smaller 500 MB files. I found this PowerShell script on StackOverflow that does the job in less than a minute.
Note: depending on your hardware, you may need to split it into even smaller files, around 100 MB in size.
The code is reproduced here in case the question is ever removed from StackOverflow:

$from = "C:\Temp\db-script.sql"    # large script to split
$rootName = "C:\Temp\export_files" # prefix for the output chunks
$ext = "sql"
$upperBound = 500MB                # maximum size of each chunk

$fromFile = [io.file]::OpenRead($from)
$buff = new-object byte[] $upperBound
$count = $idx = 0
try {
    do {
        "Reading $upperBound"
        # Read up to $upperBound bytes from the source file
        $count = $fromFile.Read($buff, 0, $buff.Length)
        if ($count -gt 0) {
            # Write the chunk to <rootName>.<index>.<ext>, e.g. export_files.0.sql
            $to = "{0}.{1}.{2}" -f ($rootName, $idx, $ext)
            $toFile = [io.file]::OpenWrite($to)
            try {
                "Writing $count to $to"
                $toFile.Write($buff, 0, $count)
            } finally {
                $toFile.Close()
            }
        }
        $idx++
    } while ($count -gt 0)
}
finally {
    $fromFile.Close()
}

The split files (export_files.0.sql, export_files.1.sql, and so on) will be written to the destination folder.

But we are not done yet. The PowerShell script splits the file by bytes, not by lines, so statements at the file boundaries may be cut in half, leaving incomplete INSERT statements and object definitions. These have to be fixed. Go to the end of each file and repair the code. For example, if an INSERT statement is cut off like this:
File 1

Insert into dbo.some_table(id, name, 

File 2

country) values (1, 'John', 'Trinidad & Tobago') 

Move the fragment from the start of File 2 to the end of File 1 to complete the statement, and remove it from File 2. Then repeat for File 3 and so on until every file begins and ends with a complete statement.
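After the move, the end of File 1 reads as one complete statement:

Insert into dbo.some_table(id, name, country) values (1, 'John', 'Trinidad & Tobago')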

Another fix is to set IDENTITY_INSERT to ON, at the start of each file, for any table whose identity column is being inserted into. For example, if the first statement of a file looks like this:

Insert into dbo.some_table...

Set IDENTITY_INSERT to ON so that the identity (primary key) values can be inserted explicitly:

SET IDENTITY_INSERT [dbo].[some_table] ON;
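Keep in mind that IDENTITY_INSERT can only be ON for one table at a time per session, so it is safest to turn it back OFF once the inserts for that table are done. A rough sketch of the pattern (table and column names are placeholders):

SET IDENTITY_INSERT [dbo].[some_table] ON;

-- An explicit column list is required while IDENTITY_INSERT is ON.
INSERT INTO [dbo].[some_table] (id, name, country)
VALUES (1, 'John', 'Trinidad & Tobago');

SET IDENTITY_INSERT [dbo].[some_table] OFF;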

The last fix is to change the data file locations at the start of the script to the data file locations on the lower-version instance. Also, rename the data and log files if they conflict with any existing files.
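In the generated script this is part of the CREATE DATABASE statement near the top; something along these lines needs to be edited (the database name, logical file names, and paths below are placeholders):

CREATE DATABASE [MyAppDb]
ON PRIMARY (
    NAME = N'MyAppDb',
    FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\DATA\MyAppDb_dev.mdf'
)
LOG ON (
    NAME = N'MyAppDb_log',
    FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\DATA\MyAppDb_dev_log.ldf'
);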

We can then open the files in SSMS and execute the scripts in order.

Note:
The login creation statements may fail if the same logins do not exist on the test instance, but the logins can be created afterwards.
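If needed, a login and its matching database user can be recreated on the test instance with something like the following; the login name, password, and database name are placeholders:

-- Create the login at the instance level, then the matching user in the database.
CREATE LOGIN [app_user] WITH PASSWORD = N'ChangeMe123!';
GO
USE [MyAppDb];
GO
CREATE USER [app_user] FOR LOGIN [app_user];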
