T-SQL Tuesday #170 – Abandoned Projects

New year, new blogging regime? No. I just saw Deborah Melkin’s T-SQL Tuesday post, and had to participate…

T-SQL Tuesday, second Tuesday of every month, click the image above for a link to the archive.

This month – Reitse Eskens is inviting us to blog about lessons learned from abandoned SQL projects. I’ve been in the business a while. I’m bound to have seen some stuff. The example that springs immediately to mind when discussing such things, however, is below.

wibbly-wobbly-timey-wimey

20-mumble years ago, I was working on a multi-year multi-million dollar multi-national project. OK, I wasn’t there as a DBA, but SQL Server was involved, so I’m claiming it is suitable for this.

The project was initiated in the usual way. A $2M budget was set. A team was assembled within the company, and some consultants were brought in from outside to do some of the particularly niche techie bits. (I was one of those external bods.)

I arrived to set up that particularly techy back-end bit, and to write the code around it. Which I did – in a way that was easy to document and easy to extend. After a few months, I had done what I needed to do, and handed over my bit to a new guy at the client’s firm, and went on my way to other projects. The rest of the project was for Other People to deal with, and the aim was that it would be done in six months or so. Yeah, right. You can see where this is going.

Time passes…

Two years later, I was back on that client site. The guy I had handed over to had changed virtually nothing – “about six lines of code” – because he hadn’t needed to.

The project still hadn’t been delivered.

Thorin sits down and starts singing about gold…

There had been a bit of a hoo-hah, apparently, and the sort of project meeting that may have involved having to mop blood off the floor – I couldn’t comment, as I wasn’t there. However, during those times, the project had gone through $7.5M of its original $2M budget. And a gap analysis on the project had identified a need for a further $3.5M to get the project over the line.

The project was cancelled. I was back on site to wrap up and document everything that we had done.

The Lesson…

It’s (almost) never too late to cancel a project.

The client was able to escape The Sunk Cost Fallacy. 

I assume some heads rolled. I would be surprised if they didn’t.

Apologies

My apologies for any lack of coherence – I’ve just spent the last five hours on the road, after a full day of work, and only saw the invitation quite late in the evening… excuses, excuses…

Posted in T-SQL Tuesday, Uncategorized | Tagged , | 1 Comment

Why I’m Not Attending SQLBits 2024

OK, so I’m jumping on the Steve Jones / Brent Ozar bandwagon of explaining why I’m not attending SQLBits this year…

Why am I not going? Covid.

No, not because I’ve got Covid, or I’m going to have Covid, or am suffering from Long Covid, or caught Covid at the last SQLBits I attended, or am afraid of catching Covid. And yet Covid is still the reason.

Me, 2020, Music, and I

As some of you may know, I’m also a musician of sorts. I play double bass well enough that local orchestras ask me to dep on short notice, and I’ve played in quite a few operas over the years. I even blog about it sometimes as “TheLoneDoubleBass”. I enjoy doing this, I regularly get booked up a year or more in advance for some of these things, and this was the case back in 2020.

The West Yorkshire Symphony Orchestra had, in the years running up to Covid, been putting on occasional semi-staged operas. These have included Traviata, Cavalleria Rusticana, and Cosi Fan Tutte. In 2020, on the last weekend in March (or was it the one before?) we were due to perform Puccini’s “Le Villi”, and the usual final few weeks’ preparation work was well under way.

And then Covid lockdown happened – it kicked in a few days before our performances were scheduled.

Opera’s not a trivial thing to put on – musicians, singers (I love that distinction), venues, everything – it can be complicated to rearrange. People move on – venues move on – and getting things back together takes time.

That time has now passed, and WYSO is putting on another opera that weekend. It was going to be Le Villi again – why not reuse the effort already put into rehearsing it? – but there are issues that currently make that not a viable option, so we’re teaming up with Northern Opera Group to do two performances of Parry’s short opera “Pied Piper of Hamelin”.

So I’m doing that instead.

It’s not the first time I’ve done an opera rather than attending SQLBits, but there’s only so many weeks in the year, and it’s quite often a case of who books me first. And, as I said, I can get booked up for gigs waaaay ahead of time.

So, I hope all of you going to SQLBits have a lovely time, and I hope that I can make it next year. Or maybe I’ll be doing a different gig…


Death of a Community, or RIP AskSSC

Ask SQLServerCentral is closing down on 11th December, for Reasons.

When I started out as a full-time DBA back in 2010, it was the first Q&A site I found that was active, friendly, and easy to use.

It differs from the main SQLServerCentral Q&A boards by not being broken down into questions relating to a particular version of SQL Server, and a particular aspect of that version. This makes it easier for someone to just ask a question without worrying about whether their question is going to the right place. A great benefit to those new to the scene.

It’s been through various hosting platforms – I dimly recall that it was hosted for a while on one of the Stack environments, but, again, for Reasons, it was moved elsewhere.

I met some great people through AskSSC. The first time I went to SQLBits, for example, the first two people I met there were two of the stalwarts – Jonathan Allen, and Kev Riley. I later worked with Jonathan on the committee for SQL Southwest’s SQLSaturday events.

Somehow I found myself on the moderatorial team of AskSSC, and slowly crawled my way up the charts of users there – but still only in 4th place behind Jonathan, Kev, and, erm, Grant Fritchey. Hail the mighty ScaryDBA. (He’s not been on the site for over 4 years, tsk.)

I stepped away from answering questions there a couple of years ago, mainly because that’s now part of the day job – working at Coeo as a “database platform engineer” in the managed services side, supporting 60ish clients with some really tricky stuff. I’ve just not had the mental capacity to stick around on AskSSC much other than to occasionally sweep away some of the mountains of spam that accumulate.

The last few years have not been kind to the quieter Q&A boards, though – mountains of spam, the rampant growth of the StackExchanges of this world, and the improvement in accessibility to the suppliers themselves via their support forums have all contributed. And then there’s the suppliers of the Q&A board platforms themselves…

It’s a shame it’s going – it really was a friendly place that helped build the wider SQL community. But the world moves on. I’ll miss it – mainly for the people I met along the way.

The above ramble was brought to you by a lack of sleep and a lack of caffeine.


Well This Is Different, or “You did *what*?”

Relax, nobody dropped the production database…

July 2021

There I was, coming to the end of my contract with the data science team at NHS Digital, working on shutting down their mahoosive SQL Server, and beginning to think about looking for the next contract. So I was trawling the job sites, as usual, and keeping half an eye on LinkedIn, and I got a message from James Boother, who I’ve known for years thanks to various SQLBits events and SQL Saturday Exeter (I remember what you did, James…) He suggested a chat, which kinda escalated into a couple more formal chats, and suddenly – as in less than one office hour after the fourth such chat – I had an email with an offer and a ridiculously close start date.

August 2021

After the fastest recruitment process I’ve ever seen (once it got going), I have abandoned the world of contracting and gone back to the Permie Life, for a little company called Coeo, as a “Data Platform Principal Engineer”, working in the team looking after their support clients. Two(*) Three months later, my head is still spinning – but that might be less to do with the pace of the work and life generally over the last few weeks. But that’s another story(+).

(*) Yeah, I’ve been taking my time over this…

(+) Don’t ask, unless you’re bringing me beer. Or gin. Gin’s good. Anyway, given the time lapse between starting this post and publishing it, things have calmed down on the life front…

Why Go Permie?

Well… They made me an offer I couldn’t refuse, because it was the offer I requested, and it would have been a bit churlish to turn it down. It’s financial stability for me, which, given the current state of the UK, and the offspring reaching A Levels / University ages mightn’t be a bad thing.

Once upon a time, when I was new to the whole contracting thing, SQL Bits put on a conference at York University. I had nothing else to do that Saturday, it was free, it was only an hour’s drive from home, and there was the chance of meeting a couple of guys I’d been chatting with over on Ask.SQLServerCentral. So I went. The first person I met was one of the guys I’d been hoping to meet – Jonathan “fatherjack” Allen, and the next person was Kev Riley – both of whom were AskSSC regulars. And I also met a few other guys, including the one and only Buck Woody, and topped him up with coffee, but that’s another story, and Kevin Kline, and Rob Farley who greeted me with “Oh, *you’re* the twitter guy”. I guess I had a reputation already. Oh dear. I mean, I’d only been a full-time DBA for a few months.

Aaanyway, where was I? Oh yes, pondering the future while meeting people.

During that day, I think the idea either came to me, or coalesced, that maybe after two or three years of doing short term (three to six month) contracts, I would start hunting around for a permanent role again (at this point, I really had only been a contractor for a few months, and was still in the “permie” mindset). I had enjoyed my time as a consultant many years ago – it had that mix of talking with people across and through an organisation rather than just a small team, a wide ranging role, lots of new challenges – much like contracting, but with a regular guaranteed pay packet. And the short term contracts over that time frame would be, what, 8-10 different environments, different sets of problems, different ways of thinking – so a reasonable exposure to the product set and the way people used it. (I should point out that, at that point, I already had 15 years of experience of SQL Server, having been working with it since 1995, but it had only ever been part of my job – the rest of it being, well, the whole IT career thing – programmer, consultant, architect, manager, general factotum.)

This is rambling a bit – let’s move it along.

So Who To Work For?

What to look for? Over the next few years, my wishlist of things started coming together. A mix of work types, a solid team, a friendly bunch of people, a challenge, no, regular challenges. Oh, and some training would be nice! I mean, I probably wasn’t going to get all that at a large organisation, and my experiences of being contracted through a certain three-letter organisation were definitely sub-optimal, so I was thinking about smaller (but not too small) firms.

But what sort of firm? Straight consultancy? Or a pre-sales technical role? Over the years, I got down to a small list of companies, most of which will be familiar to those who attend these conferences – Microsoft (obviously not meeting all the criteria, but on the list by dint of being Microsoft), a couple of suppliers (Redgate, obviously, and maybe SolarWinds and SQLSentry/SentryOne), and Coeo.

Coeo logo

Why Coeo?

Well, they offered…

…Yeah, there’s more to it than that.

When I started out in this SQL Server contracting line, they were a small(ish) firm with a reputation for having some of the most qualified SQL Server guys in the country, if not the world. Microsoft used to have an MCM (Microsoft Certified Master) programme, very limited, very expensive, very hard to get. And then they went one better with the MCA – Microsoft Certified Architect. And, in the UK, the company that had the most of these MCMs/MCAs outside of Microsoft was Coeo. And this was not a large company. So I knew they were serious about training and certification.

I definitely remember having a conversation with a couple of their guys at SQLBits in Brighton in 2011. One of those guys was Gavin Payne, who set up the system that I was looking after at that time at MessageLabs. And, given the age of bits of it, he was surprised that some of the servers were the same as the ones he had set up… It turns out that ML/SYMC wasn’t the only company Coeo & I shared as previous clients – yes, even though I’d never worked for them, I was still recommending them, based on their reputation.

I’d kept in touch with them at Bits, and other events, where they were regular sponsors, and the question of working for them did crop up from time to time, but they weren’t looking for remote-based staff, and I would have to be in the office most of the time, and that wasn’t an option – not as a permie, anyway.

And then came Covid, and lockdowns, and home-based working, and they found that they could cope with people being remote, and now they have staff up and down the land – which is one of the reasons that James started trying to get in touch.

Various other #sqlfamily friends have been through Coeo’s ranks, which I’m not quite sure what to think about – but none of them have left really recently, and some of them were there for quite a long time, so maybe it’s all OK really. I’ve not gone asking, because that always seems a bit odd to me. Certainly it’s more a question I would discuss over a beer than over email, and these things are quite often personal.

Anyway. I’ve been there for a few months now, and I finish work most days with my brain fried – I’m not used to having to think this much! I think after another few months of this I’ll have hit pretty much every weird and wonderful feature of SQL Server – and every version from 2005 onwards – and I’m keen to keep going.

And, who knows, maybe they’ll even let me blog on their site as well as my own! Although, given the rambling above, I suspect I’ll be edited. Heavily.


Capturing Extended Properties on Drop Table

TL;DR – Nope.

Extended Properties

Custom extended properties can be a useful tool when creating a self-documenting database. However, in an environment where tables are routinely dropped and recreated by software/systems not within your control, it may be desirable to have these stored so they can be repopulated when the table is recreated.

The obvious way to capture this data for this purpose would be to run a query in a DROP_TABLE trigger and store the output in a separate table for (re-)use later (e.g. in a CREATE_TABLE trigger, to repopulate the values as they were when the table is recreated).

USE Testing;

DROP TRIGGER IF EXISTS DropTableGetVars ON DATABASE;
IF OBJECT_ID ('TestDrop', 'U') IS NOT NULL EXEC ('DROP TABLE TestDrop');
GO

CREATE TABLE TestDrop (i INT);
GO

EXEC sp_addextendedproperty
    @level0type = 'SCHEMA',
    @level0name = 'dbo',
    @level1type = 'TABLE',
    @level1name = 'TestDrop',
    @name = 'Purpose',
    @value = 'Table for testing drop functionality';
GO

These first few lines drop anything that we might be about to use, then create a test table and add a custom extended property.

Now, a query to check the extended property is there by querying the sys.extended_properties view:

SELECT s.name AS SchemaName, t.name AS TableName, ep.name AS PropertyName, ep.value AS PropertyValue
FROM
                sys.extended_properties AS ep
    INNER JOIN  sys.tables              AS t
                ON ep.major_id = t.object_id
    INNER JOIN  sys.schemas             AS s
                ON s.schema_id = t.schema_id;

And the results – the one row we expect:

dbo    TestDrop    Purpose    Table for testing drop functionality

Drop Table Trigger

Now, a quick trigger to capture (well, show, in this case) the EPs when dropping the table:

CREATE TRIGGER DropTableGetVars
ON DATABASE
FOR DROP_TABLE
AS
    BEGIN
        PRINT 'Drop table trigger';
        SELECT s.name AS SchemaName, t.name AS TableName, ep.name AS PropertyName, ep.value AS PropertyValue
        FROM
                        sys.extended_properties AS ep
            INNER JOIN  sys.tables              AS t
                        ON ep.major_id = t.object_id
            INNER JOIN  sys.schemas             AS s
                        ON s.schema_id = t.schema_id;
    END;
GO

A very simple trigger – print a message, and run that select statement. The idea being (later) to enhance that functionality by inserting the data into another table.

Anyway, let’s test it:

DROP TABLE TestDrop;
GO

The output:

Drop table trigger

(0 rows affected)

Completion time: 2021-03-06T12:18:39.2286808+00:00

And the results of the SELECT:

Nothing there. Bother. It’s already gone before we get here.

Conclusion

This is the wrong approach for what I need to do – the extended properties are apparently deleted before the drop table trigger is fired. Which is a bore, and means I have to do one (or more) of:

  • write triggers against the extended properties object itself (no)
  • rewrite the extended properties procedures (no)
  • wrap the extended properties maintenance SPs (depends on people using your maintenance SPs rather than the default / standard ones)
  • write a routine to periodically poll sys.extended_properties and store the results elsewhere (probably the most reliable, assuming these don’t change value regularly, but a faff.)
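The last option is the one I’d lean towards. A minimal sketch – the dbo.EPSnapshot table name is my invention, and you’d wrap this in an Agent job on a schedule – might look like this:

```sql
-- Sketch of the polling option (assumed table name: dbo.EPSnapshot).
-- Rows are upserted and deliberately NOT deleted when a table disappears,
-- so the stored values survive a drop/recreate cycle.
IF OBJECT_ID('dbo.EPSnapshot', 'U') IS NULL
    CREATE TABLE dbo.EPSnapshot
    (
        SchemaName    sysname,
        TableName     sysname,
        PropertyName  sysname,
        PropertyValue sql_variant,
        CONSTRAINT PK_EPSnapshot PRIMARY KEY (SchemaName, TableName, PropertyName)
    );

MERGE dbo.EPSnapshot AS tgt
USING
(
    -- Same query as in the trigger attempt above
    SELECT  s.name AS SchemaName, t.name AS TableName,
            ep.name AS PropertyName, ep.value AS PropertyValue
    FROM            sys.extended_properties AS ep
        INNER JOIN  sys.tables              AS t ON ep.major_id = t.object_id
        INNER JOIN  sys.schemas             AS s ON s.schema_id = t.schema_id
) AS src
ON  tgt.SchemaName   = src.SchemaName
AND tgt.TableName    = src.TableName
AND tgt.PropertyName = src.PropertyName
WHEN MATCHED THEN
    UPDATE SET tgt.PropertyValue = src.PropertyValue
WHEN NOT MATCHED THEN
    INSERT (SchemaName, TableName, PropertyName, PropertyValue)
    VALUES (src.SchemaName, src.TableName, src.PropertyName, src.PropertyValue);
```

A CREATE_TABLE trigger could then read the recreated table’s rows back out of the snapshot and call sp_addextendedproperty for each.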


Shooting Yourself In The Foot While Testing Triggers, or The Importance of GO

No, not “Go” the game…

I was working on some functionality for a create table trigger. I had the whole creation / testing thing wrapped up in a single script for easy testing and possible deployment across multiple databases:

USE Testing

IF NOT EXISTS (SELECT * FROM sys.triggers WHERE name = 'NewTableTrigger') 
EXEC ('CREATE TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE AS BEGIN SELECT 1 END')
GO

ALTER TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE
AS
BEGIN
	PRINT 'Create Table Trigger Fired'
END


IF EXISTS (SELECT * FROM sys.tables WHERE name = 'TestTable')
EXEC ('DROP TABLE TestTable');
GO

CREATE TABLE TestTable (i INT);

All well & good. (Obviously, the functionality in the trigger was a little more complicated than a print statement…)

Let’s test that script:

Create Table Trigger Fired

Completion time: 2021-02-27T14:17:09.4783931+00:00

Great. Refresh the tables list in Object Explorer to go poking around, and… the newly-created table is nowhere to be seen.

Hmm. We have a problem.

Add a line to the script to see if anything’s going on there:

USE Testing

IF NOT EXISTS (SELECT * FROM sys.triggers WHERE name = 'NewTableTrigger') 
EXEC ('CREATE TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE AS BEGIN SELECT 1 END')
GO

ALTER TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE
AS
BEGIN
	PRINT 'Create Table Trigger Fired'
END


IF EXISTS (SELECT * FROM sys.tables WHERE name = 'TestTable')
EXEC ('DROP TABLE TestTable');
GO

CREATE TABLE TestTable (i INT);

SELECT * FROM dbo.TestTable

And run…:

Create Table Trigger Fired
Msg 208, Level 16, State 1, Line 21
Invalid object name 'dbo.TestTable'.

Completion time: 2021-02-27T14:24:44.0301750+00:00

Not just me failing to see it, then.

After some step-by-step testing, I finally found the problem:

USE Testing

IF NOT EXISTS (SELECT * FROM sys.triggers WHERE name = 'NewTableTrigger') 
EXEC ('CREATE TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE AS BEGIN SELECT 1 END')
GO

ALTER TRIGGER NewTableTrigger ON DATABASE FOR CREATE_TABLE
AS
BEGIN
	PRINT 'Create Table Trigger Fired'
END
GO

IF EXISTS (SELECT * FROM sys.tables WHERE name = 'TestTable')
EXEC ('DROP TABLE TestTable');
GO

CREATE TABLE TestTable (i INT);

SELECT * FROM dbo.TestTable

Yup. That one “GO” missing after the END of the ALTER TRIGGER statement meant the next part of the script – the IF EXISTS … DROP TABLE block – was compiled as part of the same batch, and therefore as part of the trigger body. So the CREATE TABLE fired the trigger, and the trigger promptly dropped the table I was attempting to create.
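A quick way to spot this sort of thing (a diagnostic sketch, not something I ran at the time) is to ask the server what it actually compiled into the trigger – with the missing GO, the IF EXISTS / DROP TABLE lines show up inside the definition:

```sql
-- Database-scoped DDL triggers live in sys.triggers, and their full source
-- is in sys.sql_modules; inspect what the server actually stored.
SELECT sm.definition
FROM sys.triggers AS tr
    INNER JOIN sys.sql_modules AS sm
        ON sm.object_id = tr.object_id
WHERE tr.name = 'NewTableTrigger';
```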


Advent of Code and BULK INSERT problems

Have I said how much I hate the new WordPress “add post” page? No? Well, I do. Particularly when editing an earlier post…

Anyway. Where were we? Oh yes. Advent Of Code. For those of you who don’t know, it’s a problem-setting advent calendar where you have to solve the problems in order to progress. One problem with two parts per day. And the problem is usually accompanied by several hundred lines of input data, so you can’t do the problems by hand unless you’ve got way too much spare time.

I’ve tried this a few times over the years, and never managed to stick at it for very long, probably because, as a musician, this tends to be a busy time of year. 2020, though, is a bit different. No, a *lot* different. So I have more time to think about these things.

Loading data with BULK INSERT

Step one in solving these problems in SQL Server is loading the data into a database so you can do something with it.

I chose to use BULK INSERT, because, according to the documentation, it:

Imports a data file into a database table or view in a user-specified format in SQL Server

Ideal, right?

Day 1’s input is a list of numbers. So, in theory…:

IF OBJECT_ID('Day01Input', 'U') IS NOT NULL
BEGIN
    EXEC ('DROP TABLE Day01Input');
END;

CREATE TABLE Day01Input
(
    LineItemInput BIGINT
);

BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt';
GO

Right?

Msg 4864, Level 16, State 1, Line 13
 Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (LineItemInput).

Wrong.

OK, so let’s try with a varchar(max), in case there’s something weird going on in the file.

DROP TABLE dbo.Day01Input;
GO
CREATE TABLE Day01Input
(
    LineItemInput VARCHAR(MAX)
);

BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt';
GO

Right?

(1 row affected)

Wrong. There are 200 lines in this input. So what’s going on?

SELECT * FROM dbo.Day01Input;

LineItemInput
-----------------------------
1619
1919
...
1623
14

(1 row affected)

mm’kay – what seems to have happened, then, is that the whole text file has been read into a single varchar(max) field. Which is fine, but not quite what I was hoping for. Fortunately, there are a few options you can apply with the BULK INSERT command, so this would seem to be an appropriate thing to try:

DROP TABLE dbo.Day01Input;
GO
CREATE TABLE Day01Input
(
    LineItemInput BIGINT
);

BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
    FORMAT = 'CSV',
    DATAFILETYPE = 'char'
);

And…

Msg 4864, Level 16, State 1, Line 34
 Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (LineItemInput).

Oh.

So how about that ROWTERMINATOR parameter?

BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
    FORMAT = 'CSV',
    DATAFILETYPE = 'char',
    ROWTERMINATOR = '
'
);

And…

Msg 4864, Level 16, State 1, Line 51
 Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (LineItemInput).

(ignore the line numbers – that’s my script of error generators…)

OK, so it doesn’t like the carriage return / linefeed combo. How’s about using CHAR(10)?

BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
    FORMAT = 'CSV',
    DATAFILETYPE = 'char',
    ROWTERMINATOR = CHAR(10)
);

And…

Msg 102, Level 15, State 1, Line 48
 Incorrect syntax near 'CHAR'.

Oh look. A different error. Still, I suppose this is progress of a sort…

How about we wrap that up in a variable, and pass that in?

DECLARE @RowTerm VARCHAR(1) = CHAR(10);

BULK INSERT dbo.Day01Input
FROM 'C:\.....\AdventOfCode2020\Dec01-1 - input.txt'
WITH
(
    FORMAT = 'CSV',
    DATAFILETYPE = 'char',
    ROWTERMINATOR = @RowTerm
);

And…

Msg 102, Level 15, State 1, Line 70
 Incorrect syntax near '@RowTerm'.

What?

It turns out that every parameter in a BULK INSERT statement has to be a literal, not a variable, so the way to do this is way more complicated than first thought. The good news, though, is that we can ditch the rest of the WITH stuff, and just pass in the ROWTERMINATOR.

DECLARE @Bulk_cmd VARCHAR(MAX);
SET @Bulk_cmd
    = 'BULK INSERT AdventOfCode2020.dbo.Day01Input
FROM ''C:\.....\AdventOfCode2020\Dec01-1 - input.txt''
WITH (ROWTERMINATOR=''' + CHAR(10) + ''')';
EXEC (@Bulk_cmd);

And…

(200 rows affected)

Hurrah!

Maintaining Data Order

For some of the AOC challenges, though, maintaining file / data order is important – so we’ll need to force that somehow. Obviously, SQL Server doesn’t do that in an easy way; if you put the data into a staging table like this one, and then insert it into a table with an IDENTITY (or SEQUENCE) field, you can’t necessarily guarantee that the data will make it in the right order.

The only way to guarantee order is to apply it as the data is loaded. I came across a tip somewhere to create a table with an identity field and a data field, and then a view on that table that just selects the data field, and then BULK INSERT into the view… So, for day three of AOC, I ended up with this:

IF OBJECT_ID('TobogganTreesInput', 'V') IS NOT NULL
    BEGIN
        EXEC ('DROP VIEW TobogganTreesInput');
    END;

IF OBJECT_ID('TobogganTrees', 'U') IS NOT NULL
    BEGIN
        EXEC ('DROP TABLE TobogganTrees');
    END;

CREATE TABLE TobogganTrees (RowID INT IDENTITY(1, 1), InputString VARCHAR(MAX));
GO


CREATE VIEW TobogganTreesInput
AS
SELECT  InputString FROM TobogganTrees;
GO


DECLARE @Bulk_cmd VARCHAR(MAX);

SET @Bulk_cmd
    = 'BULK INSERT AdventOfCode2020.dbo.TobogganTreesInput 
FROM ''C:\.....\Dec03-1 - input.txt''
WITH (ROWTERMINATOR=''' + CHAR(10) + ''')';

EXEC (@Bulk_cmd);
GO

You get the idea.
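A quick sanity check – again just a sketch – to confirm the load kept file order:

```sql
-- RowID comes from the IDENTITY column on the base table, assigned as the
-- rows arrived through the view; if the first few rows here match the first
-- few lines of the input file, the order survived the load.
SELECT TOP (5) RowID, InputString
FROM dbo.TobogganTrees
ORDER BY RowID;
```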


CmdExec vs Powershell – error? What error?

Just a quickie: batch file scripts, called by SQL Agent, are not propagating their errors back up to the SQL Server job system because… they’re being called from a PowerShell script?

To replicate the problem, create a .bat file containing the following, being a quick way of generating an error response:

exit /b 1

Then create a SQLAgent job which calls this script as an “Operating System (CmdExec)” job step, and a second step of doing something dull, such as

select 1

Set the first job step to go to the second job step in the event of a failure. I’ve set up two variants of that first step: one is just a direct call to the batch file above, and the other calls that script via cmd /C instead:

cmd /C "exittest.bat"

Launch the job. The job succeeds, but shows in history with a warning – both variants (direct call and via cmd /C) give the same error / warning combo.

So, how do we fix this? My suspicion is that we can’t, as PowerShell would be looking at the error generated / returned by cmd.exe, as that is (I’m guessing) what’s invoked when you launch a .bat file inside a PowerShell script. If you know otherwise, please let me know!
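If the batch file really is being run from inside a PowerShell job step, one approach that might work – a sketch I haven’t tested against this exact setup – is to check $LASTEXITCODE after the call and re-raise the failure yourself:

```powershell
# Run the batch file via cmd.exe, then re-surface its exit code;
# a thrown error makes the PowerShell job step itself report failure
# back to SQL Agent instead of quietly succeeding.
& cmd.exe /C "exittest.bat"
if ($LASTEXITCODE -ne 0) {
    throw "exittest.bat failed with exit code $LASTEXITCODE"
}
```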


Testing SQL Agent Job Step Output During The Job

(This follows on from my previous script about emailing SQL Job output.)

There are times when a SQL Server scheduled task needs to look at the output from the previous step, and make decisions based on that – particularly when parsing output from noisy/chatty scripts.

I have, therefore, put together a quick and dirty script to check the output from a SQL Server job step, and pull that into a string, so that it can be examined and a script can then make decisions about what to do next.

CREATE FUNCTION [dbo].[JobStepOutput]
(
    @StartDate INT,
    @StartTime INT,
    @job_name sysname,
    @step_id INT
)
RETURNS NVARCHAR(MAX)
AS
BEGIN
    /*
	FN: JobStepOutput
	Parameters:
			@StartDate int - msdb-formatted date at which the job starts
			@StartTime int - msdb-formatted time at which the job starts
			@job_name sysname
			@step_id int - step to look at.  If -1, then all.

	Usage:
			To be used in a SQL Agent job step:

			declare @StartDate int = CONVERT(int, $(ESCAPE_NONE(STRTDT)));
			declare @StartTime int = CONVERT(int, $(ESCAPE_NONE(STRTTM)));
			declare @job_name sysname = '$(ESCAPE_NONE(JOBNAME))';
			declare @StepID int = CONVERT(int, $(ESCAPE_NONE(STEPID))) - 1;

			select JobStepOutput(@StartDate, @StartTime, @job_name, @StepID);

*/

    DECLARE @StartDateTime DATETIME = msdb.dbo.agent_datetime(@StartDate, @StartTime);
    DECLARE @job_id UNIQUEIDENTIFIER =
            (
                SELECT job_id FROM msdb.dbo.sysjobs WHERE name = @job_name
            );
    DECLARE @EndDateTime DATETIME =
            (
                SELECT ISNULL(MIN(msdb.dbo.agent_datetime(run_date, run_time)), GETDATE())
                FROM msdb.dbo.sysjobhistory
                WHERE msdb.dbo.agent_datetime(run_date, run_time) > @StartDateTime
                      AND job_id = @job_id
                      AND step_id = 0
            );

    DECLARE @results NVARCHAR(MAX) = N'';

    SELECT @results = STUFF(
    (
        SELECT message
        FROM msdb.dbo.sysjobs j
            INNER JOIN msdb.dbo.sysjobhistory jh
                ON j.job_id = jh.job_id
        WHERE 1 = 1
              AND msdb.dbo.agent_datetime(run_date, run_time) >= @StartDateTime
              AND msdb.dbo.agent_datetime(run_date, run_time) < @EndDateTime
              AND jh.job_id = @job_id
              AND
              (
                  (
                      jh.step_id = @step_id
                      AND @step_id >= 0
                  )
                  OR (@step_id = -1)
              )
        ORDER BY msdb.dbo.agent_datetime(run_date, run_time),
                 step_id
        FOR XML PATH(''), TYPE
    ).value('.', 'nvarchar(max)'),
    1,
    0,
    ''
    );

    RETURN (@results);

END;
GO

Some notes:

  • I’ve shown STEPID - 1 in the example; this gives the previous job step.
  • If you pass in -1, it’ll return all the job step output for this job
  • I’ve tweaked the core sysjobhistory query from the previous post; this now looks for the end date of the job so you can more easily refer to previous job runs as well. Probably not needed, but came in handy while testing.
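As a usage sketch (following the header block above – the ‘ERROR’ check is just an example condition, not something prescribed), a later job step could branch on the previous step’s output like this:

```sql
-- In a T-SQL job step; SQL Agent expands the $(ESCAPE_NONE(...)) tokens at
-- run time, so this only works inside a job, not pasted into SSMS directly.
DECLARE @StartDate int = CONVERT(int, $(ESCAPE_NONE(STRTDT)));
DECLARE @StartTime int = CONVERT(int, $(ESCAPE_NONE(STRTTM)));
DECLARE @job_name sysname = '$(ESCAPE_NONE(JOBNAME))';
DECLARE @StepID int = CONVERT(int, $(ESCAPE_NONE(STEPID))) - 1;  -- previous step

DECLARE @output nvarchar(max) = dbo.JobStepOutput(@StartDate, @StartTime, @job_name, @StepID);

IF @output LIKE '%ERROR%'
    RAISERROR('Previous step reported an error', 16, 1);  -- fails this step
```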

In case WordPress does something weird, it’s over on my github repo as well.

Postscript

While putting this script together, I noticed (in SQL2016) a function called “fn_sqlagent_jobs”, and hoped it might be useful. It didn’t appear to do anything, so wasn’t. And so I ended up writing the above.


SQL Server Agent Job output to email

I came across a need for SQL Server Agent job output to be emailed out to the developer who created the code that the job was running. For various reasons, they don’t necessarily have access to the SQL Server itself – hurrah – but need to see what’s being generated by their scripts in case something goes wrong.

So, I ended up writing this extremely rough and ready SP:

USE master;
SET QUOTED_IDENTIFIER ON;
IF OBJECT_ID('SendJobOutput', 'P') IS NULL
    EXEC ('CREATE PROCEDURE SendJobOutput AS BEGIN SELECT 1 END');
GO
ALTER PROCEDURE SendJobOutput
    @StartDate INT,
    @StartTime INT,
    @job_name sysname,
    @recipients VARCHAR(MAX)
AS
BEGIN
    /*
    SP: SendJobOutput
    Parameters:
            @StartDate int - msdb-formatted date at which the job starts
            @StartTime int - msdb-formatted time at which the job starts
            @job_name sysname
            @recipients varchar(max) - string full of semi-colon-separated email addresses to get the output message
    Usage:
                To be used in a SQL Agent job step:
            declare @StartDate int = CONVERT(int, $(ESCAPE_NONE(STRTDT)));
            declare @StartTime int = CONVERT(int, $(ESCAPE_NONE(STRTTM)));
            declare @job_name sysname = '$(ESCAPE_NONE(JOBNAME))';
            declare @recipients varchar(max) = 'user@address.bar; someoneelse@foo.com';

            exec SendJobOutput @StartDate, @StartTime, @job_name, @recipients
*/
    DECLARE @StartDateTime DATETIME = msdb.dbo.agent_datetime(@StartDate, @StartTime);
    DECLARE @Subject sysname = N'Job output - ' + @job_name;
    DECLARE @job_id UNIQUEIDENTIFIER =
            (
                SELECT job_id FROM msdb.dbo.sysjobs WHERE name = @job_name
            );
    DECLARE @message NVARCHAR(MAX)
        = N'<html><body><table><tr><th>JobName</th><th>DateTime</th><th>Step Name</th><th>Message</th></tr>';
    DECLARE @results NVARCHAR(MAX) = N'';
    SELECT @results = @results + STUFF(
                                 (
                                     SELECT j.name AS TD,
                                            '',
                                            msdb.dbo.agent_datetime(run_date, run_time) AS TD,
                                            '',
                                            jh.step_name AS TD,
                                            '',
                                            message AS TD
                                     FROM msdb.dbo.sysjobs j
                                         INNER JOIN msdb.dbo.sysjobhistory jh
                                             ON j.job_id = jh.job_id
                                     WHERE 1 = 1
                                           AND msdb.dbo.agent_datetime(run_date, run_time) >= @StartDateTime
                                           AND jh.job_id = @job_id
                                     ORDER BY msdb.dbo.agent_datetime(run_date, run_time),
                                              step_id
                                     FOR XML PATH('TR'), ELEMENTS
                                 ),
                                 1,
                                 0,
                                 ''
                                      );
    SELECT @message = @message + @results + N'</table></body></html>';
    EXEC msdb.dbo.sp_send_dbmail @profile_name = 'DoNotReply - SQLMail', -- name of email profile already defined in msdb.dbo.sysmail_profile
                                 @recipients = @recipients,
                                 @subject = @Subject,
                                 @body_format = 'HTML',
                                 @body = @message;
END;

At some point, I’ll revisit it to smarten up the email message being generated, but this at least is usable.

Instructions for use are in the header block, but, spoiler alert, it involves adding an extra step at the end of a job, and pasting in a few lines of script which makes use of the SQLAgent job step macros/tokens.
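That final job step (lifted from the header block above, with placeholder email addresses, and assuming the SP was created in master as per the script) looks like this:

```sql
-- T-SQL job step; the $(ESCAPE_NONE(...)) tokens are expanded by SQL Agent
-- at run time with the job's own start date/time and name.
DECLARE @StartDate int = CONVERT(int, $(ESCAPE_NONE(STRTDT)));
DECLARE @StartTime int = CONVERT(int, $(ESCAPE_NONE(STRTTM)));
DECLARE @job_name sysname = '$(ESCAPE_NONE(JOBNAME))';
DECLARE @recipients varchar(max) = 'user@address.bar; someoneelse@foo.com';

EXEC master.dbo.SendJobOutput @StartDate, @StartTime, @job_name, @recipients;
```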

Note the hack to get multiple “TD” elements – if those extra ''s aren’t selected, the whole thing gets blended into a single TD element, which is not right.

You’ll need to configure the email profile being used.

And, in case WordPress corrupts the script above (as if…), it’s over on my github page too as “SendJobOutputByEmail.sql”.
