IT Depends

a database administration blog


SQL Server

Working with DACPACs

First, let's outline what a DACPAC is.  According to Microsoft:

“A data-tier application (DAC) is a logical database management entity that defines all of the SQL Server objects – like tables, views, and instance objects, including logins – associated with a user’s database. A DAC is a self-contained unit of SQL Server database deployment that enables data-tier developers and database administrators to package SQL Server objects into a portable artifact called a DAC package, also known as a DACPAC.”

So, it’s essentially a package of changes that your development team will bundle up and send over to you to apply to a production server.  I think that’s great, because all the changes are encapsulated in a single file.  As such it can be source controlled, shared, emailed etc.

The DACPAC is a binary file, so you’d think that ascertaining its contents would be a tricky affair but, very much like your Azure DB backups (.bacpac files), it’s just a ZIP archive.  Renaming the dacpac with a .zip extension will allow Windows to open the archive and display the packaged deployment.  Nice.
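If you prefer to do it from a prompt, a quick sketch in PowerShell (the package name MyDb.dacpac and the paths are placeholders):

```powershell
# Copy the package under a .zip extension so archive tooling will open it
Copy-Item -Path .\MyDb.dacpac -Destination .\MyDb.zip

# Expand it and list the packaged deployment files inside
Expand-Archive -Path .\MyDb.zip -DestinationPath .\MyDb_contents
Get-ChildItem .\MyDb_contents
```

You’ll find XML files such as model.xml in there describing the objects being deployed.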

Not everything is peachy, however.  Firstly, you only really want to be deploying dacpacs to SQL Server 2012 or above.  There’s generally a shoehorn within grasp for most things and dacpacs aren’t an exception to this rule, but you lose a lot of the advantages if the dacpac has to be pulled apart and executed manually by your production DBA.

You also need to match the version of SqlPackage.exe to the instance you’re deploying to; it’s a world of hurt otherwise.

If your developers are ‘click and go’ developers, you may be taking data and security from the development environment and applying it to production.

If you or your development team are looking at using dacpacs for data tier deployments, I would recommend working closely with your dev team to set this up so that you can avoid the pitfalls and reap all the benefits from their use.


mssqlsystemresource – The database you may have never seen.

When I blog, it tends to be because I’ve done something at work and thought it may be useful, primarily for me to make a note of what I did somewhere, but also because it might help someone else.  A while back, I posted regarding moving system databases to another location.  It was a pretty simple and already well-documented process, but there is another database, a secret one which some may not know about and which I didn’t think about when moving system databases.  This database is called mssqlsystemresource.

This database cannot be backed up like a visible system database or a user database, but when moving system databases you may need to consider it for the move to be successful.

The mssqlsystemresource database has two main functions.  Firstly, it contains all the physical system objects that ship with SQL Server.  When you query sys.objects from database1, for example, or any other user database, the object exists physically in the resource database but logically in your user database.  Secondly, and because of this, it makes upgrades a much simpler prospect, since the physical system objects can be overwritten in one known place.
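Although the database is hidden, you can see its version and when it was last updated via two documented server properties.  A quick check:

```sql
-- The resource database's version and last update time
SELECT SERVERPROPERTY('ResourceVersion')            AS resource_version,
       SERVERPROPERTY('ResourceLastUpdateDateTime') AS resource_last_updated;
```

After a patch or upgrade, the version returned here moves with it, which illustrates the second point above.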

For a more detailed overview of this database, here’s a link to Microsoft Docs with more detail:

Moving SQL Server system databases

This one doesn’t come up that often for me.  Since it doesn’t, I feel I need to blog about it as an aid to memory, and for anyone else who stumbles across it.

So, if you need to move your system databases as part of a routine operation, to a location on the same server/instance, it is actually a surprisingly easy process to figure out.  The whole thing, end to end, falls into the following steps.

  1. Modify startup options -d and -l to point to the new location of your Master database data and log files.
  2. Alter the file locations of the remaining SQL Server system databases (model, tempdb, msdb etc.).
  3. Stop the SQL Server service.
  4. Move the data and log files to their new homes.
  5. Start the SQL Server service.

For those a little newer to this sort of operation, I’ll detail the first two steps.  The last three shouldn’t really need covering here.

  1. Modify startup options -d and -l to point to the new location of your Master database data and log files.
  • Open SQL Server Configuration Manager
  • In the left pane, select SQL Server Services
  • In the right pane right click > properties on SQL Server (INSTANCENAME)
  • On the Startup Parameters tab select the line with the -d switch, edit the parameter in the box above with the new Master database data file location and click Update.  Do the same for the log file, using the parameter with the -l switch.
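As an illustration, with a target folder of G:\system_db (a placeholder path), the two edited parameters would end up looking like this:

```
-dG:\system_db\master.mdf
-lG:\system_db\mastlog.ldf
```

Note there is no space between the switch and the path.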

  2. Alter the file locations of the remaining SQL Server system databases (model, tempdb, msdb etc.).

  • For each data and log file associated with each of the system databases you want to relocate, simply run the following command, replacing the values with your own pertinent data.  Note that NAME is the logical file name.
ALTER DATABASE msdb MODIFY FILE ( NAME = MSDBData, FILENAME = 'G:\system_db\MSDBData.mdf' );
GO

ALTER DATABASE msdb MODIFY FILE ( NAME = MSDBLog, FILENAME = 'G:\system_db\MSDBLog.ldf' );
GO

Before moving on to later steps, you can check to make sure the locations have updated correctly by querying sys.master_files and checking the physical_name column.
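A minimal check, assuming you moved msdb as in the example above:

```sql
-- physical_name should show the new locations; they take effect after restart
SELECT name, physical_name, state_desc
FROM sys.master_files
WHERE database_id = DB_ID('msdb');
```

If the old paths still show after the service restart, the files weren’t moved to the locations you specified.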

BOL article can be found here.

Modifying Logical Filenames in SQL Server (T-SQL)

So this is syntax I generally struggle to remember, but since I perform quite a few restore operations to multiple development/staging servers and I’m quite particular about my neatness, I like to modify the logical file names for the database and log files.  It’s a simple script, and it looks like this:

ALTER DATABASE YourDatabase MODIFY FILE (NAME = Logical_filename1, NEWNAME = Logical_filename2);

ALTER DATABASE YourDatabase MODIFY FILE (NAME = Logical_filename1_log, NEWNAME = Logical_filename2_log);

According to Books Online (<filespec> being the MODIFY FILE option):

Only one <filespec> property can be changed at a time. NAME must always be specified in the <filespec> to identify the file to be modified.

This is the reason for the two separate statements to modify the logical name of the database file and the log file.
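Putting it together, a sketch for a hypothetical database restored as StagingDB whose logical file names still carry the production name ProdDB:

```sql
-- Rename the data file's logical name, then the log file's, one at a time
ALTER DATABASE StagingDB MODIFY FILE (NAME = ProdDB,     NEWNAME = StagingDB);
GO
ALTER DATABASE StagingDB MODIFY FILE (NAME = ProdDB_log, NEWNAME = StagingDB_log);
GO
```

Only the logical names change here; the physical files on disk keep whatever names the restore gave them.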

Implicit Transactions and Try/Catch blocks

When I first came across SQL Server’s Try/Catch implementation back in 2005, one of the first things I tested was how transactions worked within them and whether or not the try and catch blocks created implicit transactions.

If you have no interest in the detail and you’re just after a quick answer, the answer is no.  SQL Server Try/Catch blocks do not create implicit transactions as a result of their use.

If you didn’t want the details but now you’re thinking ‘Prove it’ then please read on.

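A quick way to prove it yourself is to inspect @@TRANCOUNT from inside both blocks:

```sql
BEGIN TRY
    -- If TRY opened an implicit transaction this would return 1; it returns 0
    SELECT @@TRANCOUNT AS trancount_in_try;
    RAISERROR('Forcing a jump to CATCH', 16, 1);
END TRY
BEGIN CATCH
    -- Still 0: the CATCH block hasn't opened a transaction either
    SELECT @@TRANCOUNT AS trancount_in_catch;
END CATCH;
```

Any transaction you see inside a Try/Catch block is one you (or a surrounding batch) opened explicitly.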

Powershell and SQL Server

So, I’ve been challenging myself to use remote sessions less and use Powershell more for remote SQL Server administration tasks.

I’m hoping to create some posts with some useful scripts I’ve created, starting today with a simple script to grab all SQL Server related services and some basic information, from a remote server.

Get-WmiObject -ComputerName SERVER1 win32_service | Where-Object { $_.Name -like "*sql*" } | Format-Table Name, DisplayName, StartName, State

This could easily be turned into a function and given a list of servers to go through and check; you could just add the host name to the Format-Table cmdlet.  I built this script to identify the credential used to start the services, as I wanted to use different credentials for each service rather than the same one for all SQL Server services across our entire estate.
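A sketch of that function, with the host name added via the WMI object’s PSComputerName property (the function name and server names are placeholders):

```powershell
function Get-SqlServiceInfo {
    param([string[]]$ComputerName)

    # win32_service carries the start account in StartName
    Get-WmiObject -ComputerName $ComputerName -Class win32_service |
        Where-Object { $_.Name -like "*sql*" } |
        Format-Table PSComputerName, Name, DisplayName, StartName, State
}

# Example: check several servers in one call
Get-SqlServiceInfo -ComputerName 'SERVER1', 'SERVER2'
```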

Installing a SQL Server failover cluster and configuring and migrating databases (Part 2)

So, you have a Windows 2012 R2 Cluster, you have your copy of SQL Server 2008 R2 all slipstreamed with SP2 or above, you’re good to go, you launch the installer and at the cluster verification check you get a “Cluster Service verification rule” failure.  This issue was detailed by Microsoft in their document titled “How to use SQL Server in Windows 8, Windows 8.1, Windows Server 2012 and Windows Server 2012 R2” but since this will very likely be a common complaint, I’ll list it in more detail here.

The reason you’re getting this is that MSClus.dll is not present.  This is the COM-based library that forms the Failover Cluster Automation Server feature, which you can easily install with the following PowerShell cmdlet:

 Add-WindowsFeature RSAT-Clustering-AutomationServer

You’ll need to run an elevated PowerShell for this command.  Since this is a Remote Server Administration Tool, I have no idea what would happen if you were running a Windows Server 2012 R2 Core installation.  Maybe this is why Microsoft recommend you don’t install Core for SQL Server.


Installing a SQL Server failover cluster and configuring and migrating databases (Part 1)

Today I installed SQL Server 2008 R2 Standard on a Windows Server 2012 R2 cluster.  Not my preferred choice of SQL Server product but needs must.

With the hardware built, the Windows cluster built and the whole set up ready and waiting for a purpose, I went ahead and installed SQL Server.

My excitement was short lived as I had failed to read a very useful article from Microsoft (possibly the only one I hadn’t read), outlining in perfect detail, the issues with versions of SQL Server that don’t play so well with Windows Server 2012 R2.  For your convenience, that article is here:

It turns out that SQL Server 2008 R2 won’t play nice with Windows Server 2012 R2 unless it is patched up to SP2 or above.  This led me to another article I hadn’t read, on how to slipstream a service pack into the installer.  Something I’d never done before, but had heard hardware support guys bang on about when sliding SP3 for XP in.  Again for your convenience, see that article here:

Despite what the article claims, if you’re very sure which architecture you’re going to use, you will only need to slipstream the relevant one.  I patched up to SP3 and was then allowed to continue onwards with my install.
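For SQL Server 2008 R2 the short version is: extract the service pack somewhere, then point the original media’s setup at it with the /PCUSource parameter.  A sketch, where the paths and the service pack file name are illustrative:

```
:: Extract the service pack package to a folder (placeholder paths)
SQLServer2008R2SP3-x64-ENU.exe /x:C:\SP

:: Launch setup from the RTM media, slipstreaming the extracted SP
C:\SQLMedia\setup.exe /PCUSource=C:\SP
```

The installer then applies the service pack payload as part of the initial install, which is what gets you past the Windows Server 2012 R2 compatibility block.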

