SQL Saturday Cleveland 2018: Feedback Stack

My congratulations to the team of SQL Saturday Cleveland, for an outstanding event in 2018. You all put on a consistently well-run event, and I’m proud to be a part of it. I’m looking forward to submitting again in 2019. :-)

I wanted to make a post regarding the feedback I received from my session. I had somewhere between 30 – 40 people in my room, and the engagement was great. Thanks to everyone who came up to talk to me after my session. 19 feedback forms were received from the attendees. Of those, 14 were positive, for which I thank you very much. It’s good to know that what I’m providing is something people will find useful.

I’m also extremely grateful for the 4 negative feedback forms I received. I want to take a moment to address those specifically. Here were the negative comments I received under the category, “What could the speaker do differently to improve?”

“Slow down a little.” – Yep. I am in complete agreement with you on this one. I have a tendency to talk quickly when I get excited about something, or simply have had too much caffeine. Valid feedback, and duly noted. I will work on this.

“Make slides + scripts available before class begins.” – This is a hard one for me, because I have a tendency to modify the slides and scripts due to feedback I receive during the session. If someone points something out during the session that requires a correction on my part, I want to make that correction before posting the material. Making the materials available beforehand defeats that purpose. I could always issue an update, sure, but I’m not confident most people would bother downloading it. I am open to suggestions on this.

“More focus on diagnosis, remove inclusive vs. exclusive section, typo on DMV slide “individual””.  – Thanks for all that!  You’ll be happy to know that the typo was fixed before the materials were uploaded, so the available slides are correct. I’d be interested to hear more about why you think I should remove the inclusive vs. exclusive filter section, though. Lucky for me, you were kind enough to let me know who you are, so I will reach out to you individually. :-)

Here’s a comment from the “Did you learn what you expected to learn?” question: “No. Way over my head, was expecting use of EX.Events GUI.”  I’m sorry you didn’t get what you expected out of my session. However, I did explain my reasons for avoiding the GUI. You won’t be able to use any of the automation tactics or scripts I showed with the GUI. I’m not against using the GUI, but I choose not to so that I can save my scripts and automate them. I would suggest playing around with the demo scripts a bit to see if they make sense, and if not – contact me. I’ll see if I can help you out.

Astute readers, I am sure, will note that 14 positive and 4 negative feedback forms does not add up to 19 total. I have one last comment to post, from my favorite feedback form I received, and I think you’ll agree this one defies categorization. I love it.

What could the speaker do differently to improve? “Expand the universal constant governing relative time, and do a 4-day seminar in 45 minutes.”

I’m pretty sure I have a PowerShell script for that. :-)

Thanks for reading.


How to handle the PASS Summit.

About this time every year, there are a plethora of articles that offer advice on how to handle the upcoming PASS Summit. And well they should – it’s a huge event. There are thousands of people congregating there and so, so, SO MUCH to do. Special interest events and community mixers abound. And I’m not even going to touch the number of parties, both public and private, and the sightseeing and karaoke and… Oh, yeah – there’s a conference there, too. Tons of educational material, networking sessions, professional development opportunities, the MS CAT Team and…

Yeah – it’s a lot. Especially to try and pack into three+ days.

One thing I’ve heard time and time again is about the opportunities for networking. In fact, there’s a saying I’ve heard used many times regarding this event: “If you’re eating dinner / lunch, etc… by yourself, you’re doing it wrong.” And to be honest, that kind of rubs me the wrong way.

While I try very hard to be friendly and approachable, I’m not exactly the most social person. Now, if you’re reading this and planning to attend the Summit, please don’t take this to mean I don’t want to talk to you. I most definitely do. Hey – if you read my blog on even a semi-regular basis, I consider you a BFF and will likely buy you a drink in appreciation. :-)

What I’m saying, though, is that I fit the current popular definition of an Introvert. Not a big fan of crowds, I don’t generally stay out that late, and really – I need time to myself to recharge. If you don’t see me running around during the event, it’s likely that I’ve slipped off to my hotel room for a cat nap or down to Pike Place for a solo walk by the water. I suspect that there are more than a few of you out there who fit that definition as well. If so, then this post is for you.

If you’re eating a meal by yourself, that’s FINE.

Just here for the sessions? GREAT.

Want to go sightseeing alone after the day’s events? PERFECT.

Yes, there’s a lot to do there, but you shouldn’t feel like you have to do it all, nor should you feel like you have to jump out of your comfort zone in order to do it. Sure, taking a risk can pay off, and getting a little outside your comfort zone with a little professional networking is one way to do that. It has certainly paid off for me. But it’s hard. I get that. So don’t feel like you’re missing something if you don’t. Don’t feel bad. You’re fine.

And I hope to see you there. But if I don’t, that’s OK. :-)

Thanks for reading.


PASS Election Endorsements

I’m not in the habit of getting political, but here are my endorsements for the PASS board elections this round:

  1. Allen White – I’ve known Allen for quite some time, and I feel his dedication to the community is beyond question. He’s moving the needle in the right direction for his chair, and I would like to see him have more time to continue doing so.
  2. Eduardo Castro – I do not know Eduardo personally, but from what I have learned over the past week or two, I feel he would do an excellent job for PASS. He’s involved, passionate, and ready to roll up his sleeves.
  3. Wendy Pastrick – I’ve known Wendy as long as I’ve known Allen, and can echo many of the same sentiments. I also appreciate her ability to look into the future of PASS and shape things in a way that benefits the community as a whole. Wendy’s leadership will help PASS continue to grow.

Now, in no way does this mean that I think any less of the other candidates. In fact, this was a hard call to make due to the caliber of the slate. I look forward to seeing the results of this election, and will support the new board members as best I can, whoever they may be.




Joining UpSearch

After what may be the longest interview process ever, I have joined the professionals at UpSearch. This marks a few interesting turns in my career. A few firsts:

  • Working from home, full-time. I’ve worked from home before for short periods; perhaps a week or two at the most. So this will be an interesting challenge for me. I do like my home office though, and find that I am quite productive in it. Often more so than I am in a traditional office environment. I’m confident that this will work out well.
  • Consulting. I’ve done contract work, and many of my previous engagements have been contract-to-hire. I know what it takes to keep clients happy. (Candy. Clients like candy.) However, my most recent previous positions have been as a full-time employee. I’m looking forward to the challenge of working for multiple clients. I feel like I have a decent handle on my time management and documentation skills, so again, I’m confident that I am up to the challenge.
  • Community Involvement. I like to think I’ve been doing this already: blogging, speaking at SQL Saturdays, and participating in the local user group. Now, however, this is considered a focus of my job. Maybe a minor focus, but a focus nonetheless. I plan to expand my community involvement because of this.

Of course, I would be remiss if I didn’t mention that the two best parts of joining UpSearch are the people I’ll be getting to work with. I have a great deal of respect for the professionalism and skills of both Colleen Morrow ( b | t ) and Kendal Van Dyke ( b | t ). I’m looking forward to working with, and learning from, both of them.

Let’s do good work.

Thanks for reading.


PASS Summit Thoughts & Speaker Idol Wrap-Up

Summit 2015. Where do I start? The Annual PASS Summit, as anyone who has been there can tell you, is almost always a whirlwind. But there were three specific things I was focusing on this year, and I want to talk about each of them in turn.

First, this year was a lot more about networking and meeting people than it has been for me in the past. I’m not the most social of people, though I do like some karaoke and hanging out with friends. So one of my main goals this year was to meet more people, and get more involved in different activities. Sadly this meant missing out on seeing some people I usually like to spend time with, as I wasn’t going to be at the usual or largest events. For that, I’m sorry my friends – but I will be in touch with you again, soon.

Second, I did not make many technical sessions last year, and wanted to make sure I saw a few this year. So I stopped by Paul Randal’s Mythbusters session, Jason Strate’s Plan Cache session, and Colleen Morrow’s SQL Audit session. All three were excellent, of course. No matter how much I think I may know about a topic, attending sessions, even ones I’ve seen before, always yields a few new nuggets of information that are extremely useful to me.

Third, I was a contestant in this year’s Speaker Idol. I want to spend the rest of this post on that, since that consumed the bulk of my attention this year. Of the 12 people who were originally picked for the event, 2 had to eventually back out. I was very disappointed by that and hope that all is well with them. I would really have liked to have seen them present.

I showed up on day one of the event, with a nicely prepared presentation that I had practiced several times and was happy with. Then I saw the first four presenters and said to myself, “Um… I better go practice some more. And work on my slides. And my demos… ”  I did so, returned on day two, and had much the same reaction. I was floored by the quality of the speakers. Everyone was very polished and professional. I got to go last on day three, which I think was an advantage. I took copious notes of the judges’ feedback over the first several sessions. I think that helped me to refine my presentation even further.

When it was my turn, I took the stage with a deep breath (sorry, audio tech) and simply presented things the way I had rehearsed them a couple of hours before. I took mental notes on the judges’ feedback, and thought hard about them. At that point, I was not thinking about advancing in the competition. What I was really thinking about was how to apply the feedback I had received and studied over the last few days to future presentations. I really thought the competition was over for me. However, once the judges returned from deliberation, they announced their winner, and I was a bit surprised to find myself declared the wildcard for the finals.

Then, the penny dropped – the judges had come up with a new rule. The person who won the wildcard had to go first in the final round. So that meant I had roughly 30 minutes to come up with something new, or further refine my existing presentation. Yikes. The feedback I had was that the judges wanted to see a slide or example of page splits / fragmentation as I described it. I could have easily found a picture out on the web that displayed something “fragmented” but I didn’t think that fit the existing style of my presentation. So I decided to animate some little blue pages in a random fashion. I also thought about different ways to say some things, based on what the judges found lacking the first time around.

Despite the changes, I was a little more relaxed the second time I presented, and was able to engage with the audience a bit more. Q&A, or really any kind of interaction, is the best part of presenting for me. I love it and it’s what keeps me wanting to present. The remaining presenters went again in the final round, and like before, I had no thoughts of winning. I was only thinking what a wonderful experience it was.

And then – I won. I honestly, truly don’t know how, but I did. And for that, I have some thanks to give.

THANK YOU: To Denny Cherry, Joey D’Antoni, Hope Foley, Mark Simms, Allan Hirt, Andre Kamman and Karen Lopez for giving us all both the opportunity and means to improve our speaking skills in this way. It was a ton of fun.

THANK YOU: To everyone who showed up, supported us, assisted us, and congratulated us. You are all my #sqlfamily, and it meant a lot to see so many friendly faces, even ones I didn’t know personally, in the audience.

Finally, and most importantly, THANK YOU, THANK YOU, THANK YOU to the other participants in this year’s Speaker Idol. Twitter links provided here, so you can all follow them:

You were all awesome. It was you who pushed me to become a better speaker than I was, and for that, you have my undying gratitude. I sincerely hope you all continue to submit to speak at the PASS Summit for 2016. I would love nothing more than to see each of us, up on stage, together. Sharing what we know, and learning from each other. Connect. Share. Learn. It’s what we do.

Thanks for reading.


Insidious Corruption

WARNING: The following post involves intentionally doing damage to a *system* database. The usual caveats about having good backups, not attempting to do this on an important system, etc… apply here. You may very well need to remove and reinstall SQL Server if you attempt this. In short – “We’re professionals. Do not try this at home.” 

Today’s lesson is about equality. Treat all databases with equal kindness. For example, I see a lot of people not doing basic maintenance like regular backups and CHECKDB on their system databases. Maybe they think that since those databases don’t change much, there’s no reason to protect them the way that they protect user data.

I am here today to dispel that notion.  If you’ve attended one of my sessions on database corruption, you’ll remember that I state that database corruption is just a fact of life. It happens, and it can happen when and where you least expect it. Example: in the [model] database.

The [model] database is essentially SQL Server’s template. When a new database is created, first, a copy of [model] is made. Then, any further configurations or tweaks that the user specified in either the database creation dialog boxes, or the CREATE DATABASE command, are applied to the new database. The advantage of this is that anything you want to apply to any newly created databases, you can apply to the [model] database. Any objects you want in all new databases, any default file settings like size and growth, or something SIMPLE like the database’s recovery model. (See what I did there? :-))
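As a quick illustration of that template behavior, here’s a minimal sketch using the recovery model (the database name is made up, and this assumes your [model] started out in the default FULL recovery model — note that some editions, like Express, default to SIMPLE):

```sql
/* Hypothetical illustration: whatever [model] has, new databases inherit. */
USE [master];
GO
ALTER DATABASE [model] SET RECOVERY SIMPLE;
GO
CREATE DATABASE InheritDemo;  /* made-up name */
GO
/* Both rows should now show SIMPLE. */
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name IN (N'model', N'InheritDemo');
GO
/* Clean up, and put [model] back the way you found it. */
DROP DATABASE InheritDemo;
ALTER DATABASE [model] SET RECOVERY FULL;
```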

Let me repeat that real quick – *anything* that exists in [model] will be copied to new databases.

Including corrupted pages.


Sooo… Let’s do one! :-)

First, take a backup of [model] and set it aside, just in case. Then we create an object we can copy to new databases.

/* Create a table in [model] that can get propagated to new databases. */
USE [model];
GO
CREATE TABLE Trash (datacol CHAR(8000) NOT NULL); /* if you're gonna go, go big. */
/* Drop in a row. */
INSERT INTO Trash (datacol)
VALUES (REPLICATE('x', 8000));

Now that we have an object, we need to corrupt it. I chose to use the method described here.  You’ll need the ID of the page you want to corrupt, which you can get with DBCC IND:

/* Get the page ID of the one lone page in the Trash table. */
DBCC IND('model','Trash',0);
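I won’t reproduce the linked method in full, but for the curious, the general approach relies on the undocumented, unsupported DBCC WRITEPAGE command — one more reason this is strictly a test-system exercise. A rough sketch, assuming the page from DBCC IND came back as (1:312):

```sql
/* DANGER: DBCC WRITEPAGE is undocumented and unsupported.
   Single-user mode is required for the direct (bypass checksum) option. */
USE [master];
GO
ALTER DATABASE [model] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
/* Overwrite 2 bytes of page (1:312) with garbage.
   Parameters: db, file id, page id, byte offset, length, hex data, directORbufferpool. */
DBCC WRITEPAGE(N'model', 1, 312, 4000, 2, 0x0000, 1);
GO
ALTER DATABASE [model] SET MULTI_USER;
```

The offset and data values here are arbitrary; any scribble over an in-use page will do the job.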

Now, let’s assume we’re not checking [model] for corruption, and so it goes undetected.  What happens when I create a new database?

/* Create a new database and check it for corruption. */
CREATE DATABASE TestMe;
GO
DBCC CHECKDB(N'TestMe');

Survey says…!

Msg 8939, Level 16, State 98, Line 103
Table error: Object ID 581577110, index ID 0, partition ID 72057594039107584, alloc unit ID 72057594043695104 (type In-row data), page (1:312). Test (IS_OFF (BUF_IOERR, pBUF->bstat)) failed. Values are 133129 and -4.
Msg 8928, Level 16, State 1, Line 103
Object ID 581577110, index ID 0, partition ID 72057594039107584, alloc unit ID 72057594043695104 (type In-row data): Page (1:312) could not be processed. See other errors for details.
CHECKDB found 0 allocation errors and 2 consistency errors in table 'Trash' (object ID 581577110).
CHECKDB found 0 allocation errors and 2 consistency errors in database 'TestMe'.
repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB (TestMe).

So now I have a corruption that will appear in every database I create. Keep in mind that this need not show up in a user created object. If any part of the [model] database becomes corrupt, and we’re not checking [model] with CHECKDB, then every database created will also be corrupt, and maybe unusable.

While we’re on the subject, here’s something even worse – while playing around with this concept, I noticed that in SQL Server 2014, DATA_PURITY checks are still *off* for [master] and [model] by default. So I created another test object in [model] and caused a data purity error in it. When I ran CHECKDB on [model], without specifically adding DATA_PURITY as an option, it came back clean. When I created a new database, I ran CHECKDB on it, and lo and behold – it threw a data purity error. So a corruption that was in model was not detected, and still propagated to a newly created database.
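The workaround is simple enough: explicitly request the data purity checks on the system databases. (Including [msdb] here is just for completeness — the default-off behavior applies to [master] and [model].)

```sql
/* Explicitly include column-value (data purity) checks.
   Once a database passes cleanly WITH DATA_PURITY, SQL Server flags it
   so future CHECKDB runs include those checks automatically. */
DBCC CHECKDB(N'master') WITH DATA_PURITY, NO_INFOMSGS;
DBCC CHECKDB(N'model')  WITH DATA_PURITY, NO_INFOMSGS;
DBCC CHECKDB(N'msdb')   WITH DATA_PURITY, NO_INFOMSGS;
```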


Have you hugged your [msdb], [model] and [master] databases today? If not, at least make sure you’re properly checking them for corruption using CHECKDB with DATA_PURITY. Your databases will thank you.

Thanks for reading.


Quickie: Thanks and See You Soon

Massive thanks to Mike Brumley (t) and the SQL PASS Fundamentals VC for allowing me to present my session, DBA 911 – Database Corruption. I had a ton of fun, and the Q&A was awesome. I’m very pleased to see the level of thought being put into the questions – you’re all thinking hard about this stuff and it’s gratifying to know I’m in the same profession as you.

I have my list of follow up questions, both from emails and from the session yesterday. I will be posting the answers here, and emailing everyone who asked a question to let you know the post is live.

Once again, thank you. I’m humbled by having so many people attend my session. I look forward to meeting you at a SQL Saturday or other event, soon.

Thanks for reading.


“Train” Wreck

So MS has announced the end of the MCM, MCSM, and MCA programs for SQL Server. I haz a disappoint. Not because I had been studying fast and furious for certification exams, but because I had finally started to see some recognition for those particular certifications as being worthwhile and meaningful. Those weren’t exams you could cram for. Your study program for those was a boatload of research, experimentation, and years of experience. I think it’s not just a poor decision, but a poorly executed one on MS’ part. I won’t go into what many others have said; suffice it to say I do have my own spin on it.

I wish I could say I was surprised, but I’m not. I’m not too sure about the MCA certification, but from what I understand, the MCM and MCSM were solely Database Engine focused. That means there was no BI component to those particular levels. In the MCSM track, there were lower level certification exams that involved BI components, but at the Master level, it was all Database Engine. I suspect *very* strongly that this is signaling a continued change in MS’ focus from the engine to the BI components. I expect we will continue to see the engine de-emphasized over time, and a heavy, heavy effort placed on marketing and building up the BI portfolio.

Anyway – Enough of that. While we still have a data engine, we still have learning to do. And while people want to learn, I will continue to teach.

So… I will be speaking at my home PASS chapter in Columbus OH on September 12, 2013 at 6PM. Come early for social time, and stick around to hear my thoughts on Database Corruption.

If you miss that, or are a little bit East of here, I’ll be giving the presentation again, a couple days later, at SQL Saturday #250 in Pittsburgh PA. There’s an excellent lineup of speakers planned, and I would encourage everyone in range to attend. Did I mention it’s free? Register here.

Thanks for reading.


SQL Saturday #242 This Weekend!


In about 48 hours, give or take a few, I’ll be on my way to Indianapolis IN for SQL Saturday #242. I can’t wait. :-)  My session will be on database corruption and repair, a topic that I’ve had a lot of fun with. Hopefully, I’ll get some good feedback and follow-up questions on it. Anything that requires follow up, I will post here.

There are tons of good sessions going on there, though. I’m just as excited about attending as I am presenting. I’m planning on going to the following:

First, I always attend the WIT panel. You should too.

Kevin Boles – Windowing Functions: THE reason to upgrade to 2012.

Neil Hambly – MDS and DQS – Beyond the TLAs to Data Quality.

Luke Jian – Anatomy of a Join.

And as always, I’ll be at whatever social events are happening afterwards. That’s where the really good conversation and networking happens. You should go.

I hope to see you there. :-)



Good Tool, Bad Tool?

I like tools. I have a whole toolbox I like to have handy of SQL Server goodies and such. Including but not limited to:

SQL Sentry Plan Explorer – Plans on steroids, as I like to call it. If you want more detail and views on your execution plans, this is the tool that you want.
Qure Workload Analyzer – My favorite tool for slicing and dicing SQL Server trace files. I am trying to make the leap to extended events, but let’s face it – there are still a lot of us running older versions of SQL Server out there, and Trace is still a well-used tool for a lot of us.
SQL Search – One of RedGate’s free tools. Much handier than me trying to export all that messy object definition text to a table where I can search it. ;-)

There are a few more, too, but I’ll confine detailed discussions of specific tools to other posts. What I primarily wanted to talk about today was a thought I had while perusing a toolset that someone brought to my attention a couple of weeks ago. I won’t mention the company or toolset because that’s just not my style. (Praise publicly, criticize privately.) However, I had mixed feelings about some of the tools in that toolset. Here’s why:

Often, in the SQL Server community, we have given the advice, “Don’t reinvent the wheel.” What we mean by that is, if you’re looking to accomplish a specific task, it’s likely that someone else has accomplished that task in the past. You should take a look at their work/script/blog post and build on that, rather than trying to come up with your own solution. When I give that advice, I am usually inclined to add, “…unless your goal is to understand the wheel.” Do you want to understand how index maintenance really works? Write your own maintenance script. Do you want to understand the complexities of a simple database backup? Write a script to back up all the databases on a server, *especially* if they have different requirements. Just try writing a script that will effectively back up a group of databases when you have some in SIMPLE recovery, some in FULL, maybe some filegroup backups, allowing for multi-file backups, etc… You will definitely learn a lot about taking a backup that way.
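To give you a taste, here’s a bare-bones sketch of where that learning exercise might start — deliberately simplified, with no filegroup or multi-file handling yet, and a made-up backup path:

```sql
/* Simplified starting point: full backup of every online user database,
   plus a log backup where the recovery model calls for one.
   The X:\Backups path and naming scheme are made up for illustration. */
DECLARE @name sysname, @rm nvarchar(60), @sql nvarchar(max);

DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name, recovery_model_desc
    FROM sys.databases
    WHERE database_id > 4          /* skip system databases for this demo */
      AND state_desc = N'ONLINE';

OPEN dbs;
FETCH NEXT FROM dbs INTO @name, @rm;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@name)
             + N' TO DISK = N''X:\Backups\' + @name + N'_full.bak'';';
    EXEC sys.sp_executesql @sql;

    IF @rm <> N'SIMPLE'            /* log backups only apply in FULL / BULK_LOGGED */
    BEGIN
        SET @sql = N'BACKUP LOG ' + QUOTENAME(@name)
                 + N' TO DISK = N''X:\Backups\' + @name + N'_log.trn'';';
        EXEC sys.sp_executesql @sql;
    END

    FETCH NEXT FROM dbs INTO @name, @rm;
END
CLOSE dbs;
DEALLOCATE dbs;
```

Once something like this works, the real education begins: differential backups, retention, error handling, databases added mid-run, and all the rest.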

However, if the goal is not to learn, but to accomplish a task, then I say you should take the shortest path between point A and point B. Don’t spend the time learning; spend it getting things done. In many cases, your boss isn’t paying you to learn on the job. (Though it’s pretty awesome when they do.) Instead, you’re being paid to get things done. If that’s not motivation enough for you, think of it in terms of time management. XKCD gives a fantastic example of what gains you actually get by trying to improve processes, rather than just getting them done.

Some tools do a great job of abstracting complex tasks into simple point-and-click routines. SQL Search, mentioned above, is a great one. Before I knew things like that existed, I was writing stored procedures and scripts that would go database by database, (Hi, there sp_MSForEachDB!), running queries on system tables – or worse yet – running sp_helptext on procedures, functions, etc… and looking for a search term in the resultant text. Messy, messy method. After I learned about that particular tool, I retired my search scripts.
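For reference, the hand-rolled version those old scripts boiled down to can be done in a single query these days, no sp_helptext required (the search term is a placeholder):

```sql
/* Search object definitions in the current database for a term.
   sys.sql_modules holds the complete definition text,
   so there are no sp_helptext truncation games to play. */
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id)        AS object_name,
       o.type_desc
FROM sys.sql_modules AS m
JOIN sys.objects     AS o
    ON o.object_id = m.object_id
WHERE m.definition LIKE N'%SearchTermHere%';   /* made-up search term */
```

Of course, SQL Search still wins by searching every database, and every object type, at once.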

I’m starting to wonder if some tools do a little too much abstraction, though. I came across a tool today that moves jobs from one server to another. And another one that moves both databases *and* their logins. My initial thought was, “That’s a lot of background tasks this thing has to do. I wonder how it’s doing it?” My guess is:

  • Run a backup of the database to a user specified location.
  • Get the list of users in the database.
  • Install, if necessary, and run sp_help_revlogin to extract the appropriate logins and their passwords.
  • Create, if necessary, those logins on the destination server.
  • Restore the database to the destination server, placing the files in the location the user specifies.

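Sketched in T-SQL, the manual version of those steps is short enough (the database name, paths, and logical file names are made up; sp_help_revlogin is Microsoft’s script from their KB article on transferring logins, not a built-in procedure):

```sql
/* On the source server: back up the database. */
BACKUP DATABASE SalesDb                /* made-up name */
    TO DISK = N'\\share\SalesDb.bak';  /* made-up path */

/* Script out logins with their hashed passwords using sp_help_revlogin
   (install it from Microsoft's KB article first), then run its output
   on the destination server to recreate any missing logins. */
EXEC sp_help_revlogin;

/* On the destination server: restore, relocating the files as needed. */
RESTORE DATABASE SalesDb
    FROM DISK = N'\\share\SalesDb.bak'
    WITH MOVE N'SalesDb'     TO N'D:\Data\SalesDb.mdf',
         MOVE N'SalesDb_log' TO N'L:\Logs\SalesDb_log.ldf';
```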
Now this is a guess on my part, since I haven’t actually opened the tool. But, that’s the simplest way I can think of to move a database and its associated users from point A to point B. However, I also think that the skills involved in such a plan are simple enough that the average DBA should be able to do so without a tool. In fact, I would expect it. This is where I get into a quandary. At what point should a DBA be expected to have the skills to accomplish a task without a particular tool handy? I would fear for the DBA who has accomplished a lot of his tasks with tools, without understanding the method or action behind them. What happens when that DBA gets a new job? It’s all too easy for me to imagine the following conversation:

Boss: “We need you to move this database from serverA to serverB.”
DBA: “OK, I’ll need the SuperDBMoverTool.”
Boss: “Why? Our old DBA did it all the time without that tool.”
DBA: “But that’s the tool that I used at my last job.”
Boss: “How much is it?”
DBA: “It’s x Dollars.”
Boss: “That’s too much. Figure it out.”

Is this realistic? I’m not sure. What I do know is that understanding *how* and *why* things work has served me much better over the years than understanding *what* things work. To what extent this is true for everyone, I don’t know. But, I’m learning, and I now tend to shy away from tools that do too much for me, until I understand the background a bit. The tools I really like are the ones that give me a lot more information, rather than accomplish tasks for me. I’ll handle just about any task, provided I have or can get enough information to accomplish it.

Is this true for you?

Thanks for reading.