Setting SQL Server Maximum Memory

One of the dafter defaults in SQL Server is the Maximum Server Memory (MB) – set to a value of 2147483647. Who has a server with 2PB RAM? Give me a call if you do – I want to have a look!

SQL Server’s standard operating procedure is to gradually take over all the memory it can, up to the limit specified, and then some – SQL Server also puts some objects outside of that pool. However, when it does that, there’s a risk that the operating system (and anything else running on the server) will be starved of resources, and SQL Server will be forced to yield memory back to the OS, which can cause a performance hit while that happens. [I blogged about a Memory Pressure query last year.] Generally, it’s a Good Thing that SQL Server does this – after all, the bulk of that memory is cached data, previously read and retained in case someone wants it again later, which improves performance by reducing the number of times the server has to go to disk to get data.
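
If you just want a quick sanity check on whether the OS is feeling the squeeze right now – rather than the full Memory Pressure query from that post – something like this minimal sketch against sys.dm_os_sys_info’s sibling DMV, sys.dm_os_sys_memory (SQL 2008 and later), will do:

-- How much physical memory does the OS have, how much is free,
-- and what does SQL Server think the memory state is?
SELECT
    total_physical_memory_kb / 1024     AS TotalPhysicalMB ,
    available_physical_memory_kb / 1024 AS AvailablePhysicalMB ,
    system_memory_state_desc            AS MemoryState -- e.g. 'Available physical memory is high'
FROM
    sys.dm_os_sys_memory ;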

The thing to do, ideally, is to configure the maximum server memory when you build the server; however, sometimes you walk into a place with many servers where this hasn’t been done, or you’re otherwise looking for a quick way to determine what the setting should be. Jonathan Kehayias of SQLskills blogged about a sensible SQL Server maximum memory calculation (in response to a post elsewhere about a really dodgy memory config advisor, but I’m not going to link to that…).

What I’ve done is codify that knowledge into a nice friendly T-SQL query that you can run, below. It makes use of the sys.dm_os_sys_info DMV to get the memory physically in the server; that DMV, though, changed form between SQL 2008 R2 and SQL 2012 – the newer version reports physical_memory_kb, whereas the older version had physical_memory_in_bytes. Hence a bit of dynamic SQL nastiness at the start of the query.

DECLARE @CurMemoryMB BIGINT ;
DECLARE @PhysMemMB BIGINT ;
DECLARE @CalcdMemMB BIGINT ;
 
SELECT
    @CurMemoryMB = CONVERT(BIGINT, value)
FROM
    sys.configurations
WHERE
    name LIKE 'max server memory%' ;
 
DECLARE @physmemtable TABLE ( PhysMemMB BIGINT ) ;
 
DECLARE @ver INT ;
DECLARE @ProductVersion VARCHAR(20) ;
DECLARE @memoryquery VARCHAR(4000) ;
 
SELECT
    @ProductVersion = CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(20)) ;
 
SELECT
    @ver = CAST(LEFT(@ProductVersion, CHARINDEX('.' ,
                                                   @ProductVersion
                                               ) - 1) AS INT) ;
 
SELECT
    @memoryquery = 'SELECT ' + CASE
                                   WHEN @ver >= 11
                                       THEN 'physical_memory_kb / 1024 '
                                   ELSE
                                       'physical_memory_in_bytes / (1024*1024) '
                               END + ' AS PhysMemMB FROM sys.dm_os_sys_info' ;
 
INSERT INTO @physmemtable EXEC(@memoryquery) ;
 
SELECT
    @PhysMemMB = MAX(p.PhysMemMB)
FROM
    @physmemtable AS p ;

Yeah, it’s not the prettiest bit of code I’ve ever written, but it does the job.

This next chunk is where we take the @PhysMemMB and calculate what Jonathan recommends it should be (approximately – I’ve tweaked it a little for low memory servers, of which we have enough to depress me). His recommendations can be summarised as:

  • reserve 1GB RAM for the OS
  • reserve a further 1GB RAM for every 4GB between 4GB and 16GB
  • reserve a further 1GB RAM for every 8GB above 16GB

My calculation routine:

SELECT
    @CalcdMemMB = CASE
                      WHEN @PhysMemMB <= 2048
                          THEN 1024
                      WHEN @PhysMemMB <= 4096
                          THEN @PhysMemMB - 1024
                      WHEN @PhysMemMB <= 16384
                          THEN @PhysMemMB * 3 / 4
                      ELSE
                          12288 + (( @PhysMemMB - 16384 ) * 7 / 8 )
                  END ;

Close enough. 🙂

The last chunk of the query is based around calling sp_configure to alter the “max server memory (MB)” setting. You may need to enable advanced options first; I’ve been polite, and written code to check whether the advanced options are already visible, make them visible if not, and re-invisiblificate them afterwards if needed…

The full query:

DECLARE @CurMemoryMB BIGINT ;
DECLARE @PhysMemMB BIGINT ;
DECLARE @CalcdMemMB BIGINT ;
 
SELECT
    @CurMemoryMB = CONVERT(BIGINT, value)
FROM
    sys.configurations
WHERE
    name LIKE 'max server memory%' ;
 
DECLARE @physmemtable TABLE ( PhysMemMB BIGINT ) ;
 
DECLARE @ver INT ;
DECLARE @ProductVersion VARCHAR(20) ;
DECLARE @memoryquery VARCHAR(4000) ;
 
SELECT
    @ProductVersion = CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(20)) ;
 
SELECT
    @ver = CAST(LEFT(@ProductVersion, CHARINDEX('.' ,
                                                   @ProductVersion
                                               ) - 1) AS INT) ;
 
SELECT
    @memoryquery = 'SELECT ' + CASE
                                   WHEN @ver >= 11
                                       THEN 'physical_memory_kb / 1024 '
                                   ELSE
                                       'physical_memory_in_bytes / (1024*1024) '
                               END + ' AS PhysMemMB FROM sys.dm_os_sys_info' ;
 
INSERT INTO @physmemtable EXEC(@memoryquery) ;
 
SELECT
    @PhysMemMB = MAX(p.PhysMemMB)
FROM
    @physmemtable AS p ;
 
SELECT
    @CalcdMemMB = CASE
                      WHEN @PhysMemMB <= 2048
                          THEN 1024
                      WHEN @PhysMemMB <= 4096
                          THEN @PhysMemMB - 1024
                      WHEN @PhysMemMB <= 16384
                          THEN @PhysMemMB * 3 / 4
                      ELSE
                          12288 + (( @PhysMemMB - 16384 ) * 7 / 8 )
                  END ;
 
SELECT
    @CurMemoryMB AS CurrentMaxMemory ,
    @PhysMemMB   AS PhysicalMB ,
    @CalcdMemMB  AS CalculatedMB ;
 
DECLARE @Advanced INT ;
 
SELECT
    @Advanced = CONVERT(INT, value_in_use)
FROM
    sys.configurations
WHERE
    name = 'show advanced options' ;
 
IF @Advanced <> 1
BEGIN
    EXEC sp_configure 'show advanced options', 1 ;
 
    RECONFIGURE WITH OVERRIDE ;
END ;
 
EXEC sp_configure 'max server memory (MB)', @CalcdMemMB ;
 
RECONFIGURE WITH OVERRIDE ;
 
IF @Advanced <> 1
BEGIN
    EXEC sp_configure 'show advanced options', 0 ;
 
    RECONFIGURE WITH OVERRIDE ;
END ;

That’s it. A nice, easy query to set maximum memory to a sensible value. Of course, it may need tuning for certain servers if they’re running SSRS, SSAS, or any other software alongside the database engine, but for a general SQL Server config, this is a good starting point. I’ve tested it on multiple versions of SQL Server from 2005 RTM upwards, with memory from sub-4GB to 128GB+ machines.

A couple of options for running this quickly on multiple servers – either use the SQL Server Management Studio Registered Servers to run a query against multiple servers, or, for a little more control, open the query in SQLCMD Mode in SSMS, put a :CONNECT <> line at the top, and just change the servername.
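
If you go the SQLCMD Mode route, the top of the script just needs a connect directive – something like this sketch (the server name is made up; swap in your own each run):

-- Enable SQLCMD Mode via Query > SQLCMD Mode in SSMS first
:CONNECT MYSERVER\INSTANCE01
GO
-- ...then the max memory script from above goes here...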

Disclaimer: Don’t run code you don’t understand. No, really. Do not open it. Apparently this one is pretty nasty. It will not only erase everything on your hard drive, but it will also delete anything on disks within 20 feet of your computer.

It demagnetizes the stripes on ALL of your credit cards. It reprograms your ATM access code, screws up the tracking on your VCR and uses subspace field harmonics to scratch any CD’s you attempt to play. It will re-calibrate your refrigerator’s coolness settings so all your ice cream melts and your milk curdles. It will program your phone autodial to call only your ex-spouses’ number. This virus will mix antifreeze into your fish tank. It will drink all your beer. It will leave dirty socks on the coffee table when you are expecting company. Its radioactive emissions will cause your bellybutton fuzz (be honest, you have some) to migrate behind your ears. It will replace your shampoo with Nair and your Nair with Rogaine, all while dating your current boy/girlfriend behind your back and billing their hotel rendezvous to your Visa card. It will cause you to run with scissors and throw things in a way that is only fun until someone loses an eye. It will give you Dutch Elm Disease and Psitticosis. It will rewrite your backup files, changing all your active verbs to passive tense and incorporating undetectable misspellings which grossly change the interpretations of key sentences. It will leave the toilet seat up and leave your hair dryer plugged in dangerously close to a full bathtub. It will not only remove the forbidden tags from your mattresses and pillows, but it will also refill your skim milk with whole milk. It will replace all your luncheon meat with Spam. It will molecularly rearrange your cologne or perfume, causing it to smell like dill pickles. It is insidious and subtle. It is dangerous and terrifying to behold. It is also a rather interesting shade of mauve. These are just a few signs of infection. Bad times, indeed


T-SQL Tuesday #85 – Backup Compression

T-SQL Tuesday time again. First one I’ve taken part in for over a year – oops. This time, Kenneth Fisher (blog|@sqlstudent144) is hosting a blog party about backups.

So here’s a quick post about Backup Compression.

Backup Compression

Told you.
Available since SQL Server 2008 (Enterprise edition only), and from SQL Server 2008 R2 in Standard edition and above, this neat little bit of functionality allows you to compress your backups as they are made – but it needs to be enabled at the server level. And, by default, it’s not enabled. Thanks, Microsoft.

I’m not going to talk about the details of enabling backup compression here – just one of the side-effects of enabling it. It’s OK, it’s a good one.

What does it do?

SQL Server backup compression – does what it says on the tin. Instead of SQL Server taking a backup by reading pages / extents of data from the database file and writing them out to the backup file, it compresses the data before it writes. It’s not the best compression you’ll get, as it won’t read the entire file before compressing it; however, it’s good enough that on OLTP databases that contain normal varchar / numeric data you could see savings of over 75% – indeed, I have some databases that save 90% of disk space. And that’s made my storage guys happy. Well, less unhappy.
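
For reference – and only as a sketch, since I said I wasn’t going to cover the setup – the server-wide default and the per-backup override look something like this (the database name and path are made up):

-- Make compression the server-wide default for backups
EXEC sp_configure 'backup compression default', 1 ;
RECONFIGURE ;

-- Or request it for an individual backup, regardless of the server default
BACKUP DATABASE MyDatabase
TO DISK = 'X:\Backups\MyDatabase.bak'
WITH COMPRESSION ;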

You mentioned a side-effect…

Yes…

As well as the obvious not-taking-up-so-much-disk-space, there’s another little benefit – the backup jobs, now that they don’t have to write out so much data, are rather quicker.

How much quicker?

Well, it depends. (Sorry.)

Remember that thing I did a while ago about pulling SQL Server job history into an Outlook calendar? Well, here’s what our backups looked like a couple of weeks ago – at least, some of ’em…

This is what the backups looked like a few weeks ago, before we enabled compression globally. Yes, I know, all the jobs (well, a lot of ’em) start at the same time, but that’s kinda the point – hammering the storage…

These are the timings for Monday-Wednesday, 21st-23rd November:

backupcompressionbefore

And here, two weeks later, after compression was enabled everywhere, Monday-Wednesday, 5th-7th December:

backupcompressionafter

Spot the difference.


PowerShell – Slicing

I was skimming through the first part of the Simple-Talk series on PowerShell Desired State Configuration, and I got to the bit that says “To install them, all you need to do is copy and paste them into your PowerShell modules folder”, and shows you the output of the author’s PSModulePath.

Get-ChildItem -Path Env:\PSModulePath | select value | fl *

poshslice1

Yeah, now compare that with the mess on my PC:

poshslice2

What’s that 4336724? Short attention span you guys have… See previous post.

Not very readable. At least, I can’t read it easily, so I need to split that into something a bit friendlier, like an actual list, one line per item. Fortunately, PowerShell strings have a split() method that should do the job, as per this Scripting Guy post: Split Method in PowerShell.

Let’s give it a go:

poshslice3

As you can see from the above attempt, the result from that Get-ChildItem line is not a string, therefore the split() method is not available to us.

What it is, though, is a DictionaryEntry. And we can convert that to a string:

$([Convert]::ToString((get-childitem -Path env:\PSModulePath | select-object Value)))

poshslice4

which we can then split

$([Convert]::ToString((get-childitem -Path env:\PSModulePath | select-object Value))).split(";")

poshslice5

Which seems to work, except the results show the artefacts left behind by the string having been converted from a dictionary entry, ie the “@{Value=” at the beginning, and the “}” at the end.

So what we need to do instead is to treat it right. Treat it like the right sort of object in the first place, and iterate through the members of the dictionary collection, and slice the string properly:

(get-childitem -path Env:\PSModulePath | select-object Value) | foreach-object { $_.("Value").split(";") }

poshslice6

That’s what I was looking for. A nice, clear list that I can read easily. And in reading, I can spot some inconsistencies, such as some lines ending with “\”, and some without. And one set of modules having a “\\” in the path. Tsk.


PowerShell – Prompts, Paths, Profiles

PowerShell – PowerShell_ISE, Profiles and Prompts

By default, your PowerShell prompt is “PS <<path>> >”. So normal; so dull. However, that can be easily changed. In my case, I want to add:

  • the current time (well, the time at which the prompt was generated)
  • a reminder of which account I’m using (I have two – a “normal user” one, and a “DBA/SU” one)
  • a countdown of seconds remaining until my next holiday

Oh, and I still need to see the current path.

The obvious place to do this is in the profile file that PowerShell_ISE loads when you launch it; this can contain various bits of functionality – for example, code to import modules you always need, or to connect up to Azure so you can do stuff up there as well as on-prem.

In this case, we’re using it to change the PS prompt.

First things first.

Create a Profile

I’m assuming you’ve not already got one…

Check what your profile path is by running this:

$profile

PoShPPP1
Who’s a naughty boy? Running in his su account? Tsk.

Now, if you try to open that file as a new PoSh ISE user, chances are, it won’t exist. Which is to be expected. Slightly less expected is that the directory path also doesn’t exist.

Fortunately, there’s a quick one-liner to remedy that problem:

New-Item ($profile) -ItemType File -Force

The -Force parameter creates the required directory path; passing in $Profile like this saves the possibility of a typo somewhere along the way. And it’s quicker / easier to type.

Now, use this little PoSh_ISE-specific command to open the file you’ve just created in the editor:

psEdit ($profile)

Lo and behold, an empty file. Ready for you to fill with your lovely code.

Changing the Prompt

In PoSh-land, the prompt is generated by a function called, unbelievably, “prompt”. So all we need to do is create our own version of that function.

In your shiny new ISE profile file, type the following:

function prompt {
    "PS LoneDBA>"
}

Now save your profile file.

That won’t automatically change your prompt. You have to run the script to load the new function definition. For shame. Press F5.

PoShPPP3

Right. Not what we want as our end result, but at least we’ve changed the prompt. Progress.

So, time to start adding things into the prompt. First, we’ll add a space to the end, so that there’s a clearer separator between the > and the beginning of your PoSh input – like you’d normally see. (No, you don’t need to see that step separately.)

Next up, we’ll add the path, traditionally shown at the end of the prompt:

function prompt {
    "PS LoneDBA $(Convert-Path .)> "
}

Save, and re-run.

PoShPPP4

Things start getting a bit long-winded, so we’re going to split the prompt onto multiple lines, so it’s a bit easier to follow along. One line per feature.

function prompt {
    "PS LoneDBA " +
        "$(Convert-Path .)" +
        "> "
}

Let’s add in the user name:

function prompt {
    "PS LoneDBA " +
        "$($env:username) " +
        "$(Convert-Path .)" +
        "> "
}

and the current time (I can usually remember the date…)

function prompt {
    "PS LoneDBA " +
        "$((get-date).tostring().substring(11)) " +
        "$($env:username) " +
        "$(Convert-Path .)" +
        "> "
}

This one was a bit more complicated, but what you need to know is that get-date returns a Datetime-type object. So we need to convert that to a string, and strip off the first 11 characters, just leaving the time.

PoShPPP5

Getting there.

Here’s the fun part – seconds until the next holiday:

function prompt {
    "PS LoneDBA " +
        "$((get-date).tostring().substring(11)) " +
        "$($env:username) " +
        "$([Convert]::ToInt32(([datetime]'2016-10-28 16:30' - (get-date)).duration().TotalSeconds)) " +
        "$(Convert-Path .)" +
        "> "
}

It’s a bit of a hack, but conceptually simple – subtract the current date from the holiday start date, then convert the difference into a number of seconds; we convert that into an integer, as we don’t want to see fractional seconds…

PoShPPP6

Success!

4.9 million seconds might seem like a lot, but it goes down quickly – 50k+ overnight, 200k+ over a weekend…

Multiple Profiles

You have multiple profiles, depending on the PoSh environment you’re running. I’m not going to think about that here – but there’s more reading in this MSDN article – How to Use Profiles in Windows PowerShell ISE


Which databases were backed up in which backup task?

It’s a confusing question / title / headline. Let’s see if I can clarify what I’m after.

For each scheduled backup job, which databases were backed up? And (more to the point) which databases weren’t backed up? And how can I find this without trawling through the logs myself?

Assumptions

  1. Backup jobs that do full backups don’t overlap
  2. There’s nothing else doing full backups
  3. erm…
  4. that’s it

The reason I was thinking about this is that we have occasional-but-annoyingly-frequent backup job failures, wherein most of the databases back up just fine, but the odd one fails. (SharePoint box, I’m looking at you…) Rather than trawling through the error logs to find out which particular database didn’t back up successfully, I wanted a query to do the heavy lifting. Yes, I’m a lazy lone DBA…

The key components to this are:

  1. MSDB’s sysjobs and sysjobhistory tables
  2. MSDB’s backupset table
  3. JOINing these to get some useful data in the right form for the inevitable
  4. PIVOT
  5. and join with sysdatabases to see if we’re missing anything.

Right then. Here we go.

MSDB’s sysjobs and sysjobhistory tables

The MSDB database is full of useful bits and pieces, including the database tables used by the SQL Agent service. Two of these are required today:

  1. sysjobs – basic header-type information about the various scheduled tasks – we’re only really interested in the task name
  2. sysjobhistory – information about individual runs of those tasks – we want to get at the recent occurrences of those tasks, in particular the start & end dates

Due to various inconsistencies in the environment where this query was developed, I can’t guarantee that the backup job name follows a particular pattern, but I can say that it includes the words “backup” and “full”. I’m only interested in the last ten days of data.

SELECT  jh.instance_id ,
        msdb.dbo.agent_datetime(jh.run_date, jh.run_time) AS RunStartDateTime ,
        DATEADD(SECOND, CONVERT(INT, RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 2)),
                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 3, 2)),
                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 2)),
                                DATEADD(DAY, jh.run_duration / 1000000, msdb.dbo.agent_datetime(jh.run_date, jh.run_time))))) AS RunFinishDateTime
FROM    msdb.dbo.sysjobhistory jh
        LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
WHERE   LOWER(j.name) LIKE '%backup%'
        AND LOWER(j.name) LIKE '%full%'
        AND jh.step_id = 0
        AND msdb.dbo.agent_datetime(jh.run_date, jh.run_time) >= DATEADD(DAY, -10, GETDATE());

Some results to show that we’re on the right lines:
SysJobs and SysJobHistory Query Results

MSDB’s backupset table

MSDB really is a useful little database – also containing various tables relating to SQL Server database backups and restores. Here, we’re just interested in the backupset table, as it contains summary information about backups. This will do for our purposes.

We only want a very small set of information here – the database name, and the start and finish dates of the backup itself.

Again, we’re only interested in full backups that were taken in the last ten days.

SELECT  backup_start_date ,
        backup_finish_date ,
        database_name AS DBName
FROM    msdb.dbo.backupset
WHERE   type = 'D'
        AND backup_start_date > DATEADD(DAY, -10, GETDATE());

And some results:

BackupSet Results

JOIN these together

This is where it starts to get a bit more interesting. What we need to do is join the above two queries together, matching backups that were taken between the start and end times of a particular backup job run. The results required are the name of the database, and the start time of the database backup scheduled task.

I’m joining these using CTEs rather than creating a single SELECT statement, mainly to aid readability. This might come back to bite me later, but, y’know, this is an ongoing thing.

WITH    JobRuns
          AS ( SELECT   jh.instance_id ,
                        msdb.dbo.agent_datetime(jh.run_date, jh.run_time) AS RunStartDateTime ,
                        DATEADD(SECOND, CONVERT(INT, RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 2)),
                                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 3, 2)),
                                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 2)),
                                                DATEADD(DAY, jh.run_duration / 1000000, msdb.dbo.agent_datetime(jh.run_date, jh.run_time))))) AS RunFinishDateTime
               FROM     msdb.dbo.sysjobhistory jh
                        LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
               WHERE    LOWER(j.name) LIKE '%backup%'
                        AND LOWER(j.name) LIKE '%full%'
                        AND jh.step_id = 0
                        AND msdb.dbo.agent_datetime(jh.run_date, jh.run_time) >= DATEADD(DAY, -10, GETDATE())
             ),
        BackupsTaken
          AS ( SELECT   backup_start_date ,
                        backup_finish_date ,
                        database_name AS DBName
               FROM     msdb.dbo.backupset
               WHERE    type = 'D'
                        AND backup_start_date > DATEADD(DAY, -10, GETDATE())
             )
    SELECT  CONVERT(VARCHAR(20), RunStartDateTime, 120) AS RunStarted ,
            BackupsTaken.DBName
    FROM    JobRuns
            LEFT JOIN BackupsTaken ON JobRuns.RunStartDateTime <= BackupsTaken.backup_start_date
                                      AND BackupsTaken.backup_finish_date <= JobRuns.RunFinishDateTime;

And, as is obligatory, some results:

BackupSet Joined with SysJobs and SysJobHistory

This is looking hopeful. Ugly query, though, and it’s about to get uglier…

PIVOT

The next step is to pivot the data so we can see the results in the more friendly (well…) format.

The output I’m looking for is a table with one row per database, and one column per backup job / scheduled task. In effect, we’ll end up with a series of ticks and crosses (or ones and zeroes) to indicate success/failure. And this is where things get really gnarly.

Because we’re going to be running this on multiple servers, and they don’t all share the same backup schedule, I need to dynamically generate the column headers. I’m using the XML concatenation trick to generate a list of the full backup job start times. This is being held in a string that will be used to generate the field names in the final result set. I’m not going to show the PIVOT right now…

DECLARE @RunStartedList VARCHAR(MAX);
SELECT  @RunStartedList = STUFF(( SELECT    ', ' + QUOTENAME(CONVERT(VARCHAR(20), msdb.dbo.agent_datetime(jh.run_date, jh.run_time), 120))
                                  FROM      msdb.dbo.sysjobhistory jh
                                            LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
                                  WHERE     LOWER(j.name) LIKE '%backup%'
                                            AND LOWER(j.name) LIKE '%full%'
                                            AND jh.step_id = 0
                                            AND msdb.dbo.agent_datetime(jh.run_date, jh.run_time) >= DATEADD(DAY, -10, GETDATE())
                                  ORDER BY  msdb.dbo.agent_datetime(jh.run_date, jh.run_time) DESC
                                FOR
                                  XML PATH('')
                                ), 1, 2, '');
SELECT  @RunStartedList;

The output from this is just a big old string:

[2016-07-17 21:00:00], [2016-07-16 21:00:00], [2016-07-15 21:00:00], [2016-07-14 21:00:00], [2016-07-13 21:00:00], [2016-07-12 21:00:00], [2016-07-11 21:00:00], [2016-07-10 21:00:00], [2016-07-09 21:00:00], [2016-07-08 21:00:00]

So, putting this all together, we now have a query that looks like this:

WITH  JobRuns
          AS ( SELECT   jh.instance_id ,
                        msdb.dbo.agent_datetime(jh.run_date, jh.run_time) AS RunStartDateTime ,
                        DATEADD(SECOND, CONVERT(INT, RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 2)),
                                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 3, 2)),
                                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 2)),
                                                DATEADD(DAY, jh.run_duration / 1000000, msdb.dbo.agent_datetime(jh.run_date, jh.run_time))))) AS RunFinishDateTime
               FROM     msdb.dbo.sysjobhistory jh
                        LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
               WHERE    LOWER(j.name) LIKE '%backup%'
                        AND LOWER(j.name) LIKE '%full%'
                        AND jh.step_id = 0
                        AND msdb.dbo.agent_datetime(jh.run_date, jh.run_time) >= DATEADD(DAY, -10, GETDATE())
             ),
        BackupsTaken
          AS ( SELECT   backup_start_date ,
                        backup_finish_date ,
                        database_name AS DBName
               FROM     msdb.dbo.backupset
               WHERE    type = 'D'
             ),
        BackupPivot
          AS ( SELECT   DBName ,
                        [2016-07-17 21:00:00] ,
                        [2016-07-16 21:00:00] ,
                        [2016-07-15 21:00:00] ,
                        [2016-07-14 21:00:00] ,
                        [2016-07-13 21:00:00] ,
                        [2016-07-12 21:00:00] ,
                        [2016-07-11 21:00:00] ,
                        [2016-07-10 21:00:00] ,
                        [2016-07-09 21:00:00] ,
                        [2016-07-08 21:00:00]
               FROM     ( SELECT    CONVERT(VARCHAR(20), RunStartDateTime, 120) AS RunStarted ,
                                    BackupsTaken.DBName
                          FROM      JobRuns
                                    LEFT JOIN BackupsTaken ON JobRuns.RunStartDateTime <= BackupsTaken.backup_start_date
                                                              AND BackupsTaken.backup_finish_date <= JobRuns.RunFinishDateTime
                          WHERE     JobRuns.RunStartDateTime >= DATEADD(DAY, -10, GETDATE())
                        ) AS src PIVOT      ( COUNT(RunStarted) FOR RunStarted IN ( [2016-07-17 21:00:00], [2016-07-16 21:00:00], [2016-07-15 21:00:00],
                                                                                    [2016-07-14 21:00:00], [2016-07-13 21:00:00], [2016-07-12 21:00:00],
                                                                                    [2016-07-11 21:00:00], [2016-07-10 21:00:00], [2016-07-09 21:00:00],
                                                                                    [2016-07-08 21:00:00] ) ) AS pvt
             )
    SELECT  sd.name AS [SysDatabases.Name] ,
            bp.*
    FROM    BackupPivot bp
            FULL OUTER JOIN master.sys.databases sd ON bp.DBName = sd.name
    ORDER BY sd.name ,
            bp.DBName; 

You think that’s ugly? Wait for the full dynamic monstrosity…

Dynamic query, with added sysdatabases

Finally, I’m going to make this thing dynamic. I may want to change the number of days of backup history we’re going to look at, and I will definitely want to run it on servers that have different backup schedules, so I can’t hardcode the backup job times. And, for bonus information, I’m going to join the results with master.sys.databases so we can get information about databases that aren’t being backed up.

Here’s the query:

DECLARE @SQL VARCHAR(MAX);
DECLARE @RunStartedList VARCHAR(MAX);
DECLARE @NumDays INT;
SELECT  @NumDays = 10; -- not doing this as a single line declare-define, as we might be running on older versions.
 
IF @NumDays > 0
    BEGIN
        SELECT  @NumDays = @NumDays * ( -1 );
    END;
 
SELECT  @RunStartedList = STUFF(( SELECT    ', ' + QUOTENAME(CONVERT(VARCHAR(20), msdb.dbo.agent_datetime(jh.run_date, jh.run_time), 120))
                                  FROM      msdb.dbo.sysjobhistory jh
                                            LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
                                  WHERE     LOWER(j.name) LIKE '%backup%'
                                            AND LOWER(j.name) LIKE '%full%'
                                            AND jh.step_id = 0
                                            AND msdb.dbo.agent_datetime(jh.run_date, jh.run_time) >= DATEADD(DAY, @NumDays, GETDATE())
                                  ORDER BY  msdb.dbo.agent_datetime(jh.run_date, jh.run_time) DESC
                                FOR
                                  XML PATH('')
                                ), 1, 2, '');
 
SELECT  @SQL = '
WITH    JobRuns
          AS ( SELECT   jh.instance_id ,
                        msdb.dbo.agent_datetime(jh.run_date, jh.run_time) AS RunStartDateTime ,
                        DATEADD(SECOND, CONVERT(INT, RIGHT(''000000'' + CONVERT(VARCHAR(20), jh.run_duration), 2)),
                                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT(''000000'' + CONVERT(VARCHAR(20), jh.run_duration), 6), 3, 2)),
                                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT(''000000'' + CONVERT(VARCHAR(20), jh.run_duration), 6), 2)),
                                                DATEADD(DAY, jh.run_duration / 1000000, msdb.dbo.agent_datetime(jh.run_date, jh.run_time))))) AS RunFinishDateTime
               FROM     msdb.dbo.sysjobhistory jh
                        LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
               WHERE    LOWER(j.name) LIKE ''%backup%''
                        AND LOWER(j.name) LIKE ''%full%''
                        AND jh.step_id = 0
                                         AND msdb.dbo.agent_datetime(jh.run_date, jh.run_time) >= DATEADD(DAY, ' + CONVERT(NVARCHAR(8), @NumDays) + ', GETDATE())
             ),
        BackupsTaken
          AS ( SELECT   backup_start_date ,
                        backup_finish_date ,
                        database_name AS DBName
               FROM     msdb.dbo.backupset
               WHERE    type = ''D''
             ),
              BackupPivot
                AS ( SELECT DBName, ' + @RunStartedList + '
                        FROM (
                                  SELECT  CONVERT(VARCHAR(20), RunStartDateTime, 120) AS RunStarted,
                                                BackupsTaken.DBName
                                  FROM    JobRuns
                                                LEFT JOIN BackupsTaken ON JobRuns.RunStartDateTime <= BackupsTaken.backup_start_date
                                                                                           AND BackupsTaken.backup_finish_date <= JobRuns.RunFinishDateTime
                                  WHERE   JobRuns.RunStartDateTime >= DATEADD(DAY, ' + CONVERT(NVARCHAR(8), @NumDays) + ', GETDATE())
                                  ) AS src
                           PIVOT
                           ( COUNT(RunStarted) FOR RunStarted IN (' + @RunStartedList + ') ) AS pvt)
SELECT  sd.name as [Sys.Databases.Name],
        bp.*
FROM    BackupPivot bp
        FULL OUTER JOIN master.sys.databases sd ON bp.DBName = sd.name
ORDER BY sd.name ,
        bp.DBName
       ;
       ';
--SELECT  @SQL;
EXEC (@SQL);

And some results:

Backup Query Final Results

Look! We have a database that failed to back up in the job that started on 2016-07-13 at 21:00! And we have a couple of databases that aren’t being backed up at all! (Mind you, one of those is tempdb… I really should filter that out…)
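
If you want to keep tempdb out of that final result set, a small tweak to the last SELECT does the trick – a sketch of the idea (in the dynamic version, remember this lives inside the quoted string):

SELECT  sd.name AS [Sys.Databases.Name] ,
        bp.*
FROM    BackupPivot bp
        FULL OUTER JOIN master.sys.databases sd ON bp.DBName = sd.name
WHERE   ISNULL(sd.name, bp.DBName) <> 'tempdb' -- tempdb never gets a full backup, so don't report it
ORDER BY sd.name ,
        bp.DBName ;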

Now, if only I could get this query to identify the reason for the failure from the logs, and create support tickets to annoy the infrastructure team. So many logs, so little time…

Disclaimer

This code worked for me on every server I tested it on, starting with SQL Server 2005 SP1 (I couldn’t find anything more primitive) in the 2005 range; it won’t run on SQL 2000, because that can’t cope with VARCHAR(MAX). And, hey, nobody should have SQL 2000 any more. Or 2005.

This code is worth what you paid for it, and may eat your bacon sandwich. Don’t run it without testing it. But how are you going to test it without running it? Hah.


Redgate SQLPrompt Execution Warnings

What?

Have you never run a DELETE or an UPDATE without a WHERE clause by accident? Was it on Production data, on the busiest day of the year? Actually, y’know what, that doesn’t matter. Even if it happened in the middle of the quietest part of the working week, there’s still that heart-thumping moment of realisation, that wondering whether your CV is up to date, and the feeling that maybe you ought to change your grots or find the defibrillator.

There are ways to try to prevent this. You could wrap all your code in BEGIN TRAN / ROLLBACK, and convert the ROLLBACK to a COMMIT when you’re sure it’s right – but that still doesn’t stop you accidentally forgetting to write the wrapper in the first place. Which is where Mladen Prajdic’s SSMS Tools Pack comes in handy, as it automatically creates new query windows with BEGIN TRAN / ROLLBACK statements already written in them for you. Even then, nothing prevents you from selecting part of the query and missing the BEGIN TRAN statement, deleting the transaction wrappers, or just hitting Shift+F5 on the offending statement (if you’re still stuck on an older version of SQLPrompt – why aren’t you updating it? The Prompt team puts out new versions at an alarming rate!). Shift+F5, for those who are SQLPrompt-less, will run the T-SQL statement that’s got the cursor in it. Great bit of functionality, but in this circumstance, maybe a bit dangerous.
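
For anyone who hasn’t seen the wrap-it-in-a-transaction habit, it’s just this sort of thing (table and column names are made up for the sketch):

BEGIN TRAN ;

UPDATE  dbo.SomeTable        -- made-up table
SET     SomeColumn = 'oops'
-- WHERE id = 42             -- the clause you meant to write...
;

-- Check the rowcount / results, then run ONE of the following:
ROLLBACK TRAN ;  -- the panic button
-- COMMIT TRAN ; -- only when you're sure it's right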

So what has Redgate done? They’ve gone and given us some protection.

Here’s some test code:

USE tempdb;
 
CREATE TABLE dbo.SacrificialTable
    (
      id INT IDENTITY ,
      sometext VARCHAR(50)
    );
 
DELETE  FROM dbo.SacrificialTable;

Here’s what happens when you run that:

EWDelete

Great. Not sure about the “Don’t show this warning again” bit, but nice warning – even for a table in tempdb…

Let’s just have a little play… Change to a temporary table (one of each sort):

USE tempdb;
 
CREATE TABLE #SacrificialTable
    (
      id INT IDENTITY ,
      sometext VARCHAR(50)
    );
 
DELETE  FROM #SacrificialTable;
 
CREATE TABLE ##SacrificialTable
    (
      id INT IDENTITY ,
      sometext VARCHAR(50)
    );
 
DELETE  FROM ##SacrificialTable;

No warning message. Why not? Well, it’s a temporary table, so perhaps Redgate decided you didn’t need to know.

And, for fun, a table variable:

DECLARE @SacrificialTable TABLE
    (
      id INT IDENTITY ,
      sometext VARCHAR(50)
    );
 
DELETE  FROM @SacrificialTable;

Again, no warning. Less of a surprise here.

OK, so what about a CTE?

CREATE TABLE dbo.SacrificialTable
    (
      id INT IDENTITY ,
      sometext VARCHAR(50)
    );
 
WITH    someCTE
          AS ( SELECT   id ,
                        sometext
               FROM     dbo.SacrificialTable
             )
    DELETE  FROM someCTE;

Redgate’s got you covered.

EWCTE

What about creating or altering a Stored Procedure?

USE tempdb;
 
CREATE TABLE dbo.SacrificialTable
    (
      id INT IDENTITY ,
      sometext VARCHAR(50)
    );
GO
 
CREATE PROCEDURE SacrificeTable
AS
    BEGIN
        DELETE  FROM dbo.SacrificialTable;
    END;

EWSP

Which seems a bit odd, given that you’re not actually executing the code at this point, merely creating the SP that will execute the non-filtered delete statement. But, y’know, whatever.

And actually calling the SP that doesn’t have a WHERE clause?

EXEC dbo.SacrificeTable;
(327410 row(s) affected)

Ah. Bad news there – that’ll execute just fine.

And it won’t protect you against Dynamic SQL either:

DECLARE @sql VARCHAR(MAX) = 'DELETE FROM tempdb.dbo.SacrificialTable';
EXEC (@sql)
(0 row(s) affected)

Still, it’s a whole heap more protection than you used to have, and could easily justify the cost of SQLPrompt on its own. With one query prevented. Yeah, it’s that big a deal.

And for some more bad news – currently, there’s no way of configuring this thing to, say, ignore anything in tempdb or inside a stored procedure definition, or to increase the level of alerts so you get the warnings on temporary tables. But, hey, they’re temporary tables. But if you think this stuff should be a bit more configurable, I’ve put a request on Redgate’s Uservoice page to improve the functionality of the execution warnings – feel free to go in and vote it up or add your own thoughts.


SQL Jobs – On a Calendar?

SQL Server backup jobs are IO-intensive, he says, stating the obvious.

When you have a lot of servers which have been set up over the years, and you’re new to the environment and want to see when the storage is being hammered, you might end up thinking that a calendar view would be of help – I know I did.

So all I had to do was figure out a query to look at SQL Server scheduled task history complete with start & end times (which I’d already got), and then get those into a series of appointments in Outlook so I could see when the busy times were.

Step 0: Registered Servers Groups

Step 0?  Well, this isn’t directly related to what I’m doing, but helps a lot.

If you don’t know about Registered Servers, then go and find out about them now.  I’ll wait.  They’re a great way of logically grouping your server estate, and they allow you to run queries against multiple servers at the same time, pulling the results back into a single result set.

I’m using them from a Central Management Server; this provides a single shared location that gives the whole team a view of the SQL Server Estate, while removing the overhead of everyone keeping their own lists.
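
As an aside: if you ever want that list of CMS-registered servers in a query rather than the GUI, it lives in msdb on the Central Management Server itself – something along these lines (I’m writing the object and column names from memory, so treat this as a sketch and check them on your CMS):

-- Run on the Central Management Server; object/column names assumed from memory
SELECT  g.name AS GroupName ,
        s.server_name
FROM    msdb.dbo.sysmanagement_shared_registered_servers AS s
        JOIN msdb.dbo.sysmanagement_shared_server_groups AS g ON g.server_group_id = s.server_group_id
ORDER BY g.name ,
        s.server_name ;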

Start working with Registered Servers groups by launching SQL Server Management Studio, and hitting CTRL+ALT+G or selecting “Registered Servers” on the “View” menu.

LaunchRegisteredServers

ViewRegisteredServers

(I’ve blocked out the actual server name).

As you can see, we have six groups of servers defined.  What you can’t see is that there are folders within those folders that contain multiple servers, so you can keep more organised lists.  More of how this stuff works another time.  Maybe.

I’m interested in the top tier “gold standard” servers, so I’ll select “1 Gold”, and click to create a new query.  This will take a while – a single query window will open and attempt to connect to all the SQL Servers registered in that group.

Step 1: Query for SQL Job Runtime Information

Of course, this assumes that we’re holding on to SQL Server Agent Job History for a little while – the default job history retention settings might be too aggressive to allow you to get history going back any reasonable length of time.
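
A quick sanity check before going any further – how many history rows are you actually keeping, and how far back do they go?  If the answer is “a few days”, the Agent history retention settings need attention first.  A minimal sketch:

SELECT  COUNT(*)         AS HistoryRows ,
        MIN(jh.run_date) AS OldestRunDate -- stored as a yyyymmdd integer
FROM    msdb.dbo.sysjobhistory AS jh ;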

We need to look at two tables in the MSDB database – sysjobs and sysjobhistory.  We only need the job name from the first, and the run information from the second.  I’ve already got a CTE defined in my snippets collection in Red Gate’s SQLPrompt that provides me with all this information and a bit more, but I’m feeling lazy (it’s the weekend), so won’t hack out the excess baggage.  The JobRuns CTE (given below) gives information about all recorded runs of all jobs; the complexity is in translating the run_date, run_time and run_duration fields from their native format in the msdb.dbo.sysjobhistory table (they’re stored as integer representations of dates and times) to a pair of real datetime fields – RunStartDateTime and RunFinishDateTime.  These will come in handy later.

The internal date formats are quite simple – if you remember:

  • Dates (run_date) are integers in yyyymmdd form – year, month, day
  • Times (run_time) are integers in hhmmss form – hours, minutes, seconds; durations (run_duration) use the same format, with any digits above the rightmost six treated here as whole days

WITH    JobRuns
          AS ( SELECT   j.name ,
                        jh.step_id ,
                        jh.step_name ,
                        jh.run_date ,
                        jh.run_time ,
                        DATEADD(SECOND, CONVERT(INT, RIGHT('0000000' + CONVERT(VARCHAR(6), jh.run_time), 2)),
                                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(6), jh.run_time), 6), 3, 2)),
                                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(6), jh.run_time), 6), 2)),
                                                CONVERT(DATETIME, CONVERT(CHAR(8), jh.run_date))))) AS RunStartDateTime ,
                        jh.run_duration ,
                        DATEADD(SECOND, CONVERT(INT, RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 2)),
                                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 3, 2)),
                                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 2)),
                                                DATEADD(DAY, jh.run_duration / 1000000,
                                                                                                -- the rest is the start date/time, above
                                                        DATEADD(SECOND, CONVERT(INT, RIGHT('0' + CONVERT(VARCHAR(6), jh.run_time), 2)),
                                                                DATEADD(MINUTE,
                                                                        CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(6), jh.run_time), 6), 3, 2)),
                                                                        DATEADD(HOUR,
                                                                                CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(6), jh.run_time), 6), 2)),
                                                                                CONVERT(DATETIME, CONVERT(CHAR(8), jh.run_date))))))))) AS RunFinishDateTime ,
                        jh.sql_message_id ,
                        jh.sql_severity ,
                        jh.message ,
                        jh.run_status
               FROM     msdb.dbo.sysjobhistory jh
                        LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
             )
    SELECT  @@servername ,
            JobRuns.name ,
            JobRuns.RunStartDateTime ,
            JobRuns.RunFinishDateTime
    FROM    JobRuns
    WHERE   LOWER(JobRuns.name) LIKE '%backup%'
            AND JobRuns.RunStartDateTime >= '20160101'
            AND JobRuns.step_id = 0;

So, to recap, we have a query that returns Servername, (job) name, RunStartDateTime & RunFinishDateTime, for all jobs which have “backup” in the name, and which started some time this year.

That’s potentially going to return a lot of data – in my case, about 140k rows, which (spoiler alert!) presents us with another difficulty later on.  First, though, we’ll save this data into something we can use.

Step 2: Saving / exporting data

We have some options here:

  1. Select all results in SSMS Grid View, and paste into an open spreadsheet
  2. Use SQL Server Management Studio to save the results into a CSV file, and manipulate that into Excel
  3. Use a third party tool to export the data into an Excel spreadsheet – I’m using (surprise) Red Gate’s SQL Prompt because I have it installed here, but Mladen Prajdic’s SSMS Tools Pack offers the same functionality.  Other tools to do the same may well be available; I haven’t bothered to look…

Step 3: Massaging the data for import into Outlook

There’s a handy TechNet post about importing stuff into Outlook calendar & tasks from Excel.

Unfortunately, Outlook isn’t quite intelligent enough to understand what I want to give it based on the data I’ve exported.  Oops.  So we have to massage the data a little to provide columns named as follows:

  • Subject – I use a combination of server name & job name
  • Start Date – from the RunStartDateTime column – formula is =Date(Year(D2), Month(D2), Day(D2)) (assuming you’re on Row 2 and the RunStartDateTime is in column D…)
  • Start Time – also from RunStartDateTime – formula is =Time(Hour(D2), Minute(D2), Second(D2))
  • End Date – from RunFinishDateTime – same formula as for Start Date, but referencing the other field
  • End Time – really?  I need to complete this thought?

Yes, you could set that up in your SSMS query; no, the caffeine hadn’t hit at the time I was doing this.

Once you’ve filled that lot in down your spreadsheet, you can filter down to the data you actually want to import – importing 140k items at a time could be a bit much!  I’ve gone for three filters – I’m only interested in servers whose names indicate they’re in GB, in jobs that ran in January 2016, and in FULL backup jobs.  Yes, that could have been filtered out earlier; no, I didn’t do that, because I was also importing other dates and jobs.

Apply the filters, click somewhere in the filtered resultset, then click and highlight those five columns mentioned above.  Copy & paste into a fresh spreadsheet.  Swear when you see the error saying that the copy area and paste areas are not the same size and shape.

Excel CnP error

Easy enough to get round – Copy and paste into a fresh worksheet on your original spreadsheet, then do CTRL+A, and copy that into a fresh spreadsheet.  Strewth, why does this have to be so complicated?  Is it any easier in Office 2013?  Please say “yes”.

In that fresh spreadsheet, select the data you want to import, and give it a name.  The easiest way of doing that is by just typing in the box I’ve highlighted:

Excel Named Range

If you don’t set up a Named Range in the excel spreadsheet, then Outlook will complain it can’t find any data to import…

Save your spreadsheet as an Excel 97-2003 .xls file – this version of Outlook can’t cope with .xlsx files.  Again, please tell me that later versions can handle more up-to-date file formats?  Because there’s a problem with Excel 97-2003 .xls file format, and that is that it can’t cope with more than 64k-1 (65535) rows of data (that’s what the spoiler alert was for).  But, hey, at least Microsoft used an unsigned integer for the rowcount, unlike what they did with the score in their first version of Tetris, go on, ask me how I know…

Step 4 – Importing into Outlook

At last!

I’ve created a new Calendar folder in my Inbox for this (after all, I’ll want to share this information with the boss and maybe with the storage guys…)

What’s that?  You want to see those steps too?  OK.  In brief, he says, knowing that the Outlook stuff will take up fully half the post…

Right-click on your inbox, and select “New Folder”

I’m creating this as a subfolder of my Inbox, and naming it “Backup Job Run History”.  Type in a name, and Click “OK”.

Then, go to the File menu, click on “Open” and then “Import”.

OutlookImport

That brings up the Import/Export Wizard, which we’ll quickly rattle through:

ImportExportWizard

That last window picks up on the named range you defined earlier.  If you want to add other information, look at the Map Custom Fields bit – we might have been able to do some of this up front, but we’re here now.

ImportCustomFields

When you’re done, click “Finish” and, depending on the amount of data you’re loading, and the speed of your Exchange server, either twiddle your thumbs for a few seconds, or go and get a cup of coffee…

ImportExportProcess

When that’s done, go and have a look at what you’ve created:

CalendarWithBackupJobs

Wow.  That’s a lot of backup jobs all running at the same time…  but at least I can now see where / when they all are.

Now, you’ll want to share this with the boss, so you’ll need to share the calendar with him.

ShareCalendar1

ShareCalendar2

And then, having lit the blue touch-paper, retire to a safe distance.

When the boss decides he likes this, make a recurring appointment in your calendar to keep this up-to-date.  I’m still waiting.

Updated Query

I’ve updated the query to (a) make use of the (undocumented) msdb.dbo.agent_datetime function, and (b) output columns in the right format for easier importing into Excel… The more awake of you will notice that I’ve left the original fields in there as well…

WITH    JobRuns
          AS ( SELECT   j.name ,
                        jh.step_id ,
                        jh.step_name ,
                        jh.run_date ,
                        jh.run_time ,
                        msdb.dbo.agent_datetime(jh.run_date, jh.run_time) AS RunStartDateTime ,
                        jh.run_duration ,
                        DATEADD(SECOND, CONVERT(INT, RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 2)),
                                DATEADD(MINUTE, CONVERT(INT, SUBSTRING(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 3, 2)),
                                        DATEADD(HOUR, CONVERT(INT, LEFT(RIGHT('000000' + CONVERT(VARCHAR(20), jh.run_duration), 6), 2)),
                                                DATEADD(DAY, jh.run_duration / 1000000,
                                                                                                       -- the rest is the start date/time, above
                                                        msdb.dbo.agent_datetime(jh.run_date, jh.run_time))))) AS RunFinishDateTime ,
                        jh.sql_message_id ,
                        jh.sql_severity ,
                        jh.message ,
                        jh.run_status
               FROM     msdb.dbo.sysjobhistory jh
                        LEFT JOIN msdb.dbo.sysjobs j ON j.job_id = jh.job_id
             )
    SELECT  @@servername ,
            JobRuns.name ,
            JobRuns.RunStartDateTime ,
            JobRuns.RunFinishDateTime ,
            UPPER(@@servername) + ':' + JobRuns.name AS [Subject] ,
            CONVERT(VARCHAR(10), JobRuns.RunStartDateTime, 120) AS [Start Date] ,
            RIGHT(CONVERT(VARCHAR(20), JobRuns.RunStartDateTime, 120), 8) AS [Start Time] ,
            CONVERT(VARCHAR(10), JobRuns.RunFinishDateTime, 120) AS [End Date] ,
            RIGHT(CONVERT(VARCHAR(20), JobRuns.RunFinishDateTime, 120), 8) AS [End Time]
    FROM    JobRuns
    WHERE   LOWER(JobRuns.name) LIKE '%backup%'
            AND JobRuns.RunStartDateTime >= '20160731'
            AND JobRuns.RunStartDateTime < '20160807'
            AND JobRuns.step_id = 0;