June 27, 2011 | SQL Server

SQL Server v.Next (Denali) : Why you should start testing early

Denali is coming, whether you like it or not. You may not be an early adopter and you may not have plans on your current calendar, but at some point you will need to move your apps and databases to this release – or one very much like it. There are a lot of great new features you will be able to take advantage of, but not everything is a double rainbow. There are some changes that will break your spirit if you let them. What do I mean?

I go over several breaking changes in my presentation that are well documented and have been announced for a long time, and what you can do to work around them. For example: 

  • DATABASEPROPERTY()
    You should use DATABASEPROPERTYEX() or, better yet, get your database-related metadata from catalog views such as sys.databases. Chances are, DATABASEPROPERTYEX() will find its way out of the product at some point, too.
  • 80 (SQL Server 2000) compatibility
You will not be able to use 80 compatibility, perform an in-place upgrade of any databases still in 80 compatibility mode, or restore / attach databases that were backed up or detached in that mode. The workaround is to back up / detach the database, restore or attach it to an instance of SQL Server 2005, 2008 or 2008 R2 (yes, you can use a temporary Evaluation Edition for this), switch the compatibility level to the highest feasible value, back it up from the new location, and then restore it on Denali. This does not account for any testing that will be required.
  • osql
    Use sqlcmd or PowerShell. No reason to continue using this antiquated command-line utility.
  • SQLMail
    You should be using Database Mail by now (unless you are just getting around to moving off of SQL Server 2000).
  • sqlmaint.exe
    I'm not sure that there's a simple workaround for this, as I've never used it, but you'll no longer be able to manage maintenance plans using this command-line utility.
  • SQL-DMO
    The writing's been on the wall for some time now: use SMO.
  • SET FMTONLY
    Currently your code (and perhaps the drivers you are using to connect to SQL Server) may be using SET FMTONLY ON to inspect the result set of a command. This functionality is being replaced, wholesale, by the new metadata discovery dynamic management objects. You should test your code against the next CTP if you aren't already testing, especially if you are using legacy providers to connect to SQL Server.
  • SET ROWCOUNT for DML
    If you currently rely on using SET ROWCOUNT to limit the number of rows affected by an UPDATE or DELETE command, you'll want to start thinking about using TOP instead. SET ROWCOUNT for SELECT will continue to work (though personally I am making a concerted effort to stop using the command at all, because I am not sure exactly what it means – for example, when it stops working for DML, how will that affect MERGE, or DML based on subqueries, derived tables or CTEs?).
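To make a couple of these concrete, here is a minimal sketch of the replacements for DATABASEPROPERTY() and SET ROWCOUNT for DML (the table dbo.SomeLog and the specific columns selected are just illustrative placeholders):

```sql
-- Instead of DATABASEPROPERTY(), read database metadata from sys.databases:
SELECT name, compatibility_level, recovery_model_desc
  FROM sys.databases
  WHERE name = N'msdb';

-- Instead of SET ROWCOUNT before an UPDATE/DELETE, put TOP on the DML itself
-- (dbo.SomeLog is a hypothetical table used only for illustration):
DELETE TOP (1000)
  FROM dbo.SomeLog
  WHERE LogDate < DATEADD(DAY, -30, GETDATE());
```

Note that TOP on DML does not support ORDER BY directly; if you need ordered deletes, you will want a CTE or derived table on top of an ordered SELECT.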

Those are the easy ones. There are some other changes that will break your code, but they may never show up on some discontinued features document or appear in a trace for deprecated events. Let me illustrate with an example that I came across recently.

DBCC LOGINFO

If DBCC LOGINFO changes, you shouldn't expect it to show up on any document or in a trace, because it is, after all, undocumented. But like many other undocumented commands, you will find its use quite prevalent in systems out there — so even a slight change could have a substantial impact. It turns out that the output of DBCC LOGINFO has changed in Denali, and I'll show how this can break your existing code.

Let's say you are creating a #temp table to store DBCC LOGINFO results (maybe because you are looping through and collecting data for multiple databases). Perhaps you are using code like Leonardo Pasta's to automate VLF management. Your code would look something like this:

CREATE TABLE #LI
(
   [FileId]      INT,
   [FileSize]    BIGINT,
   [StartOffset] BIGINT,
   [FSeqNo]      INT,
   [Status]      INT,
   [Parity]      INT,
   [CreateLSN]   DECIMAL(25, 0)
);
 
INSERT #LI EXEC('DBCC LOGINFO(''msdb'');');

This code works fine in SQL Server 2008 R2 and earlier, but if you run it on Denali, you will receive this error:

Msg 213, Level 16, State 7, Line 1
Column name or number of supplied values does not match table definition.

This is because there is a new leading column in the DBCC output in Denali, RecoveryUnitId (INT). So you'll need to change your #temp table declaration as follows:

CREATE TABLE #LI
(
   [RecoveryUnitId] INT,
   [FileId]         INT,
   [FileSize]       BIGINT,
   [StartOffset]    BIGINT,
   [FSeqNo]         INT,
   [Status]         INT,
   [Parity]         INT,
   [CreateLSN]      DECIMAL(25, 0)
);

Once that leading column is in place, the insert will work fine. What this means is that if you have code that needs to run against multiple versions of SQL Server, or at least needs to maintain compatibility with older versions, you'll need to build the #temp table conditionally. I'd suggest using something other than an undocumented command for this, but it seems that DBCC LOGINFO remains your only choice, as it is clear that Microsoft isn't yet interested in adding any features that will ease log management (see Connect item #322149 and vote if you agree).
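One way to build the table conditionally (a sketch, not the only approach): two CREATE TABLE #LI statements in the same batch will not compile, even inside mutually exclusive IF blocks, so you can assemble the version-appropriate definition in dynamic SQL instead. Keep in mind that the #temp table only survives for the duration of the dynamic batch, so the insert and any consuming queries need to live inside that same batch:

```sql
DECLARE @cols NVARCHAR(MAX) = N'', @sql NVARCHAR(MAX);

-- Denali reports major version 11 in ProductVersion (e.g. '11.0.xxxx.x'):
IF CONVERT(INT, PARSENAME(CONVERT(NVARCHAR(128),
     SERVERPROPERTY('ProductVersion')), 4)) >= 11
  SET @cols = N'[RecoveryUnitId] INT, ';

SET @sql = N'CREATE TABLE #LI (' + @cols
  + N'[FileId] INT, [FileSize] BIGINT, [StartOffset] BIGINT,
      [FSeqNo] INT, [Status] INT, [Parity] INT, [CreateLSN] DECIMAL(25,0));
     INSERT #LI EXEC(''DBCC LOGINFO(''''msdb'''');'');
     SELECT COUNT(*) AS VLFCount FROM #LI;';

EXEC sys.sp_executesql @sql;
```

If you need the results back in the outer scope, you could instead create a permanent staging table, or create the wider (Denali) #temp table up front and pad the insert on older versions.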

Do your homework

If you think DBCC LOGINFO is the only breaking change that you might encounter, think again. You don't need to look much further than the changes to the memory manager in Denali to know that there are some other issues coming your way (for example, many of the memory-related DMVs have changed column names and different data behind those columns, and DBCC MEMORYSTATUS output has been revised as well).
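For instance, if I am reading the CTP changes correctly, several buffer-pool columns in sys.dm_os_sys_info have been renamed (and re-expressed in KB rather than 8 KB pages), so monitoring code like this sketch will break on Denali:

```sql
-- Works through SQL Server 2008 R2; these columns report 8 KB pages:
SELECT bpool_committed, bpool_commit_target
  FROM sys.dm_os_sys_info;

-- On Denali, I believe the equivalents are renamed and reported in KB:
-- SELECT committed_kb, committed_target_kb FROM sys.dm_os_sys_info;
```

As always with DMV changes, verify the exact column names against the current CTP before relying on them.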

As open as the CTP and beta program have been, and as much as folks like me have been trying to get this information out as early as possible, if you install Denali after it is released and are surprised by breaking changes, you have nobody to blame but yourself.

Here is a relatively complete list of deprecated items, albeit currently out of date (it mixes information from the CTP1 documentation with the 2008 R2 document it originated from):

http://bit.ly/AB_Denali_Deprecated

You'll want to watch that page when it is updated for the next CTP. And if you're not already playing with Denali, please at least consider grabbing that CTP when it becomes available.

I will continue to post examples that I come across where code may break simply by upgrading to Denali. You can never be too prepared…