SQL Saturday Cleveland 2018: Feedback Stack

My congratulations to the SQL Saturday Cleveland team for an outstanding event in 2018. You all put on a consistently well-run event, and I’m proud to be a part of it. I’m looking forward to submitting again in 2019. :-)

I wanted to make a post regarding the feedback I received from my session. I had somewhere between 30 and 40 people in my room, and the engagement was great. Thanks to everyone who came up to talk to me after my session. I received 19 feedback forms from the attendees. Of those, 14 were positive, for which I thank you very much. It’s good to know that what I’m providing is something people will find useful.

I’m also extremely grateful for the 4 negative feedback forms I received. I want to take a moment to address those specifically. Here were the negative comments I received under the category, “What could the speaker do differently to improve?”

“Slow down a little.” – Yep. I am in complete agreement with you on this one. I have a tendency to talk quickly when I get excited about something, or simply have had too much caffeine. Valid feedback, and duly noted. I will work on this.

“Make slides + scripts available before class begins.” – This is a hard one for me, because I have a tendency to modify the slides and scripts based on feedback I receive during the session. If someone points something out during the session that requires a correction on my part, I want to make that correction before posting the material. Making the materials available beforehand defeats that purpose. I could always issue an update, sure, but I’m not confident most people would bother downloading it. I am open to suggestions on this.

“More focus on diagnosis, remove inclusive vs. exclusive section, typo on DMV slide “individual””.  – Thanks for all that!  You’ll be happy to know that the typo was fixed before the materials were uploaded, so the available slides are correct. I’d be interested to hear more about why you think I should remove the inclusive vs. exclusive filter section, though. Lucky for me, you were kind enough to let me know who you are, so I will reach out to you individually. :-)

Here’s a comment from the “Did you learn what you expected to learn?” question: “No. Way over my head, was expecting use of EX.Events GUI.” I’m sorry you didn’t get what you expected out of my session. However, I did explain my reasons for avoiding the GUI. You won’t be able to use any of the automation tactics or scripts I showed with the GUI. I’m not against using the GUI, but I choose not to use it so that I can save my scripts and automate them. I would suggest playing around with the demo scripts a bit to see if they make sense, and if not, contact me. I’ll see if I can help you out.
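
For the curious, here’s a rough sketch of why scripting matters so much to me: a session defined in T-SQL can be saved, put in source control, and deployed to any server. The session name and file path below are made up for illustration:

CREATE EVENT SESSION TrackErrors ON SERVER
ADD EVENT sqlserver.error_reported
    (WHERE severity >= 16)                        -- only capture real errors
ADD TARGET package0.event_file
    (SET filename = N'C:\XEvents\TrackErrors.xel')
WITH (STARTUP_STATE = ON);                        -- survives a server restart

ALTER EVENT SESSION TrackErrors ON SERVER STATE = START;

Try doing that to fifty servers with the GUI.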

Astute readers, I am sure, will note that 14 positive and 4 negative feedback forms does not add up to 19 total. I have one last comment to post, from my favorite feedback form I received, and I think you’ll agree this one defies categorization. I love it.

What could the speaker do differently to improve? “Expand the universal constant governing relative time, and do a 4-day seminar in 45 minutes.”

I’m pretty sure I have a PowerShell script for that. :-)

Thanks for reading.

-D.

How to handle the PASS Summit.

About this time every year, a plethora of articles offer advice on how to handle the upcoming PASS Summit. And well they should – it’s a huge event. Thousands of people congregate there, and there is so, so, SO MUCH to do. Special interest events and community mixers abound. And I’m not even going to touch the number of parties, both public and private, and the sightseeing and karaoke and… Oh, yeah – there’s a conference there, too. Tons of educational material, networking sessions, professional development opportunities, the MS CAT Team and…

Yeah – it’s a lot. Especially to try and pack into three+ days.

One of the things I’ve heard time and time again about this event is the opportunity for networking. In fact, there’s a saying I’ve heard used many times regarding it: “If you’re eating dinner, lunch, etc. by yourself, you’re doing it wrong.” And to be honest, that kind of rubs me the wrong way.

While I try very hard to be friendly and approachable, I’m not exactly the most social person. Now, if you’re reading this and planning to attend the Summit, please don’t take this to mean I don’t want to talk to you. I most definitely do. Hey – if you read my blog on even a semi-regular basis, I consider you a BFF and will likely buy you a drink in appreciation. :-)

What I’m saying, though, is that I fit the current popular definition of an Introvert. I’m not a big fan of crowds, I don’t generally stay out that late, and really – I need time to myself to recharge. If you don’t see me running around during the event, it’s likely that I’ve slipped off to my hotel room for a cat nap or down to Pike Place for a solo walk by the water. I suspect that there are more than a few of you out there who fit that definition as well. If so, then this post is for you.

If you’re eating a meal by yourself, that’s FINE.

Just here for the sessions? GREAT.

Want to go sightseeing alone after the day’s events? PERFECT.

Yes, there’s a lot to do there, but you shouldn’t feel like you have to do it all, nor should you feel like you have to jump out of your comfort zone in order to do it. Sure, taking a risk can pay off, and stepping a little outside your comfort zone with some professional networking is one way to do that. It has certainly paid off for me. But it’s hard. I get that. So don’t feel like you’re missing something if you don’t. Don’t feel bad. You’re fine.

And I hope to see you there. But if I don’t, that’s OK. :-)

Thanks for reading.

-David.

SQL Saturday Rochester Recap and Speaker Feedback

Rochester, you all are so, so kind… :-)

First things first: big thanks to Matt Slocum, Andy Levy and their entire crew, the sponsors, the speakers, the volunteers – everyone who put together such a fun event. SQL Saturday Rochester was a great time.

I started my day off in Colleen Morrow’s session: “So You Want To Be A DBA?” It’s a question I’ve posed to quite a few people recently, as I’ve been approached by multiple people interested in the career track. I always kind of struggle to describe what I do and what makes a good DBA. Colleen doesn’t struggle with this at all. She did a great job, presenting all the main points, pains and proclivities of a DBA, and even had a fun quiz in her session. The quiz showed the backgrounds of several well-known people in the community, and the attendees were asked to match the person to the bio. I was a bit surprised by a couple of them, which was very cool.

Next up, I went to Adam Belebczuk’s session, “XML Without Xanax.” My knowledge of XML is sorely lacking. (Read: just enough to pass an exam.) So, I was desperate for some help and better insight into how SQL stores and actually works with XML. Adam didn’t disappoint. One of the really good takeaways I got from this session was the idea of using an XSD schema definition as a sort of column constraint where XML is being stored. Something I’m definitely going to have to look into.
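
So I don’t forget the idea, here’s a rough sketch of how I understand it would look. This is my own hypothetical example, not Adam’s demo code – the schema and table names are made up:

CREATE XML SCHEMA COLLECTION dbo.OrderSchema AS
N'<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="Order">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="OrderId" type="xsd:int" />
        <xsd:element name="Quantity" type="xsd:int" />
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>';
GO

-- Typed XML: documents that do not validate against the schema are rejected.
CREATE TABLE dbo.Orders
(
    OrderId     int IDENTITY PRIMARY KEY,
    OrderDetail xml(dbo.OrderSchema) NOT NULL
);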

After a break to make sure my laptop was charged, I stopped by David Klee’s session, which was called “My Brain Has Blue Screened.” Rather than deep technical content, this was a bunch of DBAs and the like gathered around swapping war/horror stories, many of which had me cringing. Excellent idea for a session, and something we may have to do at SQL Saturday Columbus this year, if he’s willing. (Hint, hint…)

After a very tasty lunch, I stopped in to see Kendal Van Dyke set up a SAN-less cluster with SIOS’ DataKeeper product. This wasn’t something I had seen before in the wild, and it was certainly interesting. We’ve been kind of trying to move away from failover cluster instances at work in favor of Availability Groups. I’ll have to check this out to see if it may be a better fit for us.

I skipped the next session to re-run the demos for my own two sessions in the afternoon. Good thing I did so. I made a few final adjustments to one of my demos and was ready to rock. My first session, “The Usual SUSPECTs”, which deals with database states, went off without a hitch. “DBA 911 – Database Corruption” went pretty much as it usually does. I was able to get through both sessions without modifying too much, though I did gloss over a couple of things due to the time limit. I was trying very hard to make sure the attendees were able to get to the raffle on time at the end of the day, since you do have to be present to win.

I don’t have much in the way of speaker feedback this time, since the feedback I received was universally positive. The attendees must have been very kind people. For the Usual SUSPECTs session, I did get a pair of really great comments under the heading “How will you use the information you learned here?”

“Not sure if I will, but good to know.” – Trust me, one day, you will. :-)

“Test at office on production. No, just kidding!” – If you do, let me know. I could use the consulting work…

And from the DBA 911 session:

How will you use the information you learned here? “Test Backups”.

My work here is done. See you in Philadelphia? :-)

Thanks for reading.

-David.

Quick Speaking Event Roundup

Wow… so much on the radar right now.

This Saturday, I’ll be speaking at SQL Saturday Rochester. I’ll be giving two sessions. First, The Usual SUSPECTs which covers some of the states that a database can be in other than “online”. Second, DBA 911 – Database Corruption, which covers on-disk data corruption and repair techniques. I’m really looking forward to it, and hope to see many of you there.

Next month, on June 6th, I’ll be speaking at SQL Saturday Philadelphia, giving my DBA 911 – Database Corruption session. That session has certainly been popular; I think I’ve presented it about a dozen times. :-) Both Rochester and Philadelphia are new cities for me, and I’m really looking forward to enjoying my visits.

I’m also excited to say that I’ve been selected to speak at this summer’s 24 Hours of PASS. This is a great event, and the lineup of speakers is awesome. I’m grateful to be included in such a group. I’ll be presenting my latest session, Turbo-Charged Transaction Logs, on transaction log internals and performance. I encourage you to register, and I look forward to ‘seeing’ you there.

Last, but most definitely not least, SQL Saturday Columbus is coming up on July 11. We just extended the speaker submission deadline through the weekend for those of you who still have to submit sessions. Please do so; we’d love to hear what you have to say.

Thanks for reading, and I hope to catch up with you at one of these events, or another.

-David.

Why you should speak or volunteer at SQL Saturday Columbus.

“Hi. My name is David, and I’m a SQL Saturday addict.”

“Hi, David…”

If you’ve been following my posts, you’ll no doubt have noticed I’ve spent a good bit of time talking about SQL Saturdays on here recently. SQL Saturday is a big deal for me. No – scratch that – it’s an all caps BIG DEAL. Why? If for no other reason than it’s the number one thing that has contributed to my career over the last five years.

Let’s break that down, shall we?

1. People. This is the number one reason to attend a SQL Saturday, period. The people you will find at these events are amazing. Not only have I made many valuable professional contacts, I have made some life-long friends as well. At how many professional events can you say that? Also, consider this: How many times in your life can you remember being in a room with a couple dozen people, and being able to explain what you do in five words or less… and everyone *gets it*.  Rare, right? Not at SQL Saturday. Here, you are one of us.

2. Knowledge. Tons of it. Everywhere. Not just in the session rooms, either. Conversations in the hall, during breakfast or lunch, even after the event… There’s a ton of knowledge being shared on just about any aspect of SQL Server that you can think of. The learning opportunities are limitless. I’ve learned more than I thought I would just by talking to people outside of the sessions. That issue you’ve been dealing with at work and can’t seem to find a good solution to? There’s a good chance the person sitting next to you has seen that as well. Maybe you should ask them about it?

3. Community. This could easily have fallen under ‘people’ but I wanted to give it its own separate space. A community is much, much more than just a collection of people. Remember that the people who put on this event are volunteers. This includes the speakers. They don’t get paid for putting on these events, they simply do it because they love it. And when I say they love ‘it’ I don’t mean SQL Server, but the community of professionals that make it better, faster, more powerful, and more valuable to the businesses and organizations that use it. Giving back to that community is kind of like a feedback loop. The more we put into it, the more we get out of it, and the better it gets.

That brings me to the point of this post. I want you – yes YOU – to consider submitting a session for SQL Saturday Columbus. Speaker selection is open until May 15. Have you given talks at work, or maybe for a local user group? Perfect – you’d be an excellent candidate. Need help with the submission process or coming up with your abstract? No problem – contact me via my contact page and I’ll help you out!

And if you’re not interested in speaking, then maybe volunteering is for you. We’ve got all kinds of things that need to be set up, taken down, monitored, moved, and generally managed the day of the event. We need all kinds of people to do it, as well. Volunteers are what make these events run, and make the community great. If you would like to volunteer, you can sign up on the SQL Saturday site, or simply contact me.

I hope to see you there.

Thanks for reading.

-David.


SQL Saturday Roundup – April 2015 Edition

This year is picking up nicely for me, events-wise. After thoroughly enjoying SQL Saturday Nashville in January and SQL Saturday Cleveland in February, I’m really looking forward to my next few SQL-related trips. I’ve updated my schedule page accordingly, but wanted to call out the next couple of events I’m going to be at. If anyone in those areas, or travelling to them for the events, wants to get together and chat, please let me know.

April 11 – SQL Saturday – Madison WI

I’ll be heading to SQL Saturday Madison for the second year in a row, and I’m very excited for a couple of reasons. First, I’ll be presenting my brand new session, “Turbo-Charged Transaction Logs”. In that session, we’ll dig a little bit into transaction log internals, show how SQL Server logs what it does, and cover some ways to make the logging process faster. Second, and more importantly, I’m looking forward to spending time learning and relaxing with the fantastic people of the Madison SQL Server community. You’ve got a good crew up there, folks. Looking forward to seeing you all.

May 4 – 8: SQL Skills IE:PTO2 – Chicago IL

OK. This isn’t a SQL Saturday event, but I will be travelling to Chicago for a week to get schooled on SQL Server performance by one of the best companies in the business, SQL Skills. At some point, my brain is going to need a break, so if you’re in the Chicago area and want to catch up, or just chat about SQL Server, drop me a line. I should have an evening or two free. :-)

May 16 – SQL Saturday – Rochester NY

This will be my first time speaking at SQL Saturday Rochester, and I’m stoked. The schedule hasn’t been finalized, but an approximate one is up, and if it sticks, I’ll be doing two sessions on disaster recovery, back to back. A whole afternoon block of breaking and fixing. Sounds like fun!

One session I’ll be presenting is “The Usual SUSPECTs: When Good Databases Go Bad”. This session is all about database states. The good, the bad, and the ugly, as it were. In addition to talking about things like moving OFFLINE database files and rolling a database forward through its transactions using a STANDBY restore, we’ll look at what happens when a SAN failure puts a database into RECOVERY_PENDING, or when a disk crash lands you in SUSPECT mode. I’ll show you how to recover from those states in a safe way.
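
As a quick preview, checking where your databases stand is a one-liner, and some states are as simple to fix as they are scary looking (database name hypothetical):

-- What state is each database in right now?
SELECT name, state_desc
FROM sys.databases;

-- Some states are a one-line fix, like bringing an OFFLINE database back:
ALTER DATABASE SalesDB SET ONLINE;

Others, like SUSPECT, take a lot more care – which is what the session is for.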

The other session I’ll be doing is “DBA 911 – Database Corruption”. In that session, we’ll look at the basic definitions and terms of database corruption and repair. We’ll talk about some of the different kinds of corruption, how to look for and troubleshoot corruption issues, and when it’s best to repair vs. restore. As in all things Disaster Recovery – preparation is key, so we’ll also cover steps you can take to ready yourself for a corruption problem. (Hint: have you hugged your backups today?)

Thanks for reading, and I hope to see you at one of these events.

-David.

Passing Exam 70-461 – Querying SQL Server 2012

tl;dr – Study. A lot. Hard.

I recently took and passed MS exam 70-461. Here are my reflections on it. I am, of course, forbidden to tell you exactly what’s on the test, and I don’t know what’s in the entire bank of questions, so this is more of a guideline than anything else. Your mileage may vary, as usual.

Study Materials

I had previously been through the official class with a local training company, as well as having studied the official MS Press book, which I got from the library. (If you’re buying the books just to pass the tests, you’re wasting money.) However, I didn’t take the test immediately afterward. I’m kind of glad I didn’t, since we didn’t actually cover some of the most important things on the exam in the course material. I got my best results by using the MeasureUp practice exam and reading the suggested BOL articles on MSDN. Between those two things, as well as mucking around with the code on my own, I feel like I got enough of a review of the material, and enough exposure to new stuff, to be able to pass.

The MeasureUp practice test isn’t perfect – not by a long shot. I sent in at least three corrections on their test. The only time that actually caused me a problem was when I answered a question incorrectly, and was shown an answer in the “explanation” that wasn’t an available answer for the question. Once or twice, the “correct” code was just flat-out wrong. However, even that was close enough, and I already had enough experience with the concept to get what they were driving at. Also, it’s not exactly cheap, but not much more expensive than the book. I would get the book from the library and pay for a 30-day pass to the practice exam. Even with the minor issues I described, I would still recommend it as a study tool.

One more thing about the MeasureUp exam: it’s nothing like the actual test, no matter how much they say it is. It’s a great study tool, but a poor excuse for a simulation of the actual exam. Example: most of the questions on the MeasureUp exam are multiple choice, with some drag and drop. On the official exam, you will be asked to actually write code. So don’t get used to just picking the correct code snippet out of a list. Know how the code works, and be able to actually write it. This is where practicing with the examples in a copy of Management Studio will come in handy.

Subject Matter

There are four sections to the test. I’ll take each one in turn and talk a little about the types of things I think you should practice. Now, the way I describe these doesn’t exactly line up with the exam objectives, but I think this is a decent way to study them, since they make more sense to me in this grouping.

Create Database Objects – I did pretty well on this section. Know how to create tables to work with FILESTREAM, and brush up on the ins and outs of views. Pay particular attention to options required for indexed views like SCHEMABINDING, as well as what VIEW_METADATA and CHECK OPTION do. Finally, know how SEQUENCE objects work, and when they’re a good choice.
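
For instance, here’s a minimal sketch of an indexed view, with hypothetical table and column names. Note the requirements: SCHEMABINDING, two-part object names, and COUNT_BIG(*) alongside the GROUP BY, before you can put a unique clustered index on it.

CREATE VIEW dbo.vOrderTotals
WITH SCHEMABINDING                              -- required before the view can be indexed
AS
SELECT CustomerId,
       SUM(ISNULL(Amount, 0)) AS TotalAmount,   -- SUM must be over a non-nullable expression
       COUNT_BIG(*) AS OrderCount               -- required when there's a GROUP BY
FROM dbo.Orders                                 -- two-part names only
GROUP BY CustomerId;
GO

CREATE UNIQUE CLUSTERED INDEX IX_vOrderTotals
ON dbo.vOrderTotals (CustomerId);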

Troubleshoot and Optimize – I did moderately well on this. Here, I’d recommend knowing isolation levels really well. Also know what the various *physical* join types are, and when each is to be expected. Know how to get execution plan information from T-SQL, and how to optimize queries by approaching them in different ways. Many of the questions involved optimizing queries not by altering an existing function or CTE, but by taking a different approach altogether.
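
If you need a refresher on pulling plans from T-SQL, the two basic approaches look like this (the query itself is just a hypothetical placeholder):

-- Estimated plan, without running the query (must be alone in its batch):
SET SHOWPLAN_XML ON;
GO
SELECT OrderId FROM dbo.Orders WHERE CustomerId = 42;
GO
SET SHOWPLAN_XML OFF;
GO

-- Actual plan, captured while the query executes:
SET STATISTICS XML ON;
SELECT OrderId FROM dbo.Orders WHERE CustomerId = 42;
SET STATISTICS XML OFF;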

Modify Data – This was my best section. Know the intricacies of MERGE, as well as INSERT/UPDATE/DELETE. Know what the set operators do, like UNION (ALL), INTERSECT and EXCEPT. Often, you’ll see those being used to filter a result set in a derived table, so keep in mind what their output should look like. I’d also recommend careful review of the details of creating stored procedures and functions – more so table-valued functions than scalar ones, at least in my experience. COALESCE and IIF are important functions. Know those well.
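
As a quick refresher on the set operators, with hypothetical tables:

SELECT CustomerId FROM dbo.Orders
INTERSECT                         -- only values found in BOTH sets
SELECT CustomerId FROM dbo.Refunds;

SELECT CustomerId FROM dbo.Orders
EXCEPT                            -- values in the first set that are NOT in the second
SELECT CustomerId FROM dbo.Refunds;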

Work With Data – This was my weakest section. Caveat: I am not a developer, and rarely spend my time writing queries to work with business data. If I were to study for this section again, I would spend a lot more time on the FOR XML clause, things like GROUPING SETS, and ranking functions like RANK vs. DENSE_RANK. I would recommend reviewing data types very carefully. A few of the questions relied on some of their more intricate details. (For example, there’s only one date/time type that stores time zone information, and you should know the precision levels of the various non-integer numeric types.) Also, I recommend reviewing some of the more obscure types, like cursor.
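
For example, here’s the RANK vs. DENSE_RANK difference in miniature, along with the one time-zone-aware date/time type:

SELECT Score,
       RANK()       OVER (ORDER BY Score DESC) AS Rnk,       -- ties leave gaps: 1, 1, 3
       DENSE_RANK() OVER (ORDER BY Score DESC) AS DenseRnk   -- no gaps: 1, 1, 2
FROM (VALUES (100), (100), (90)) AS t (Score);

-- The only date/time type that keeps the time zone offset:
DECLARE @d datetimeoffset(7) = '2012-06-01 10:00:00 -05:00';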

Exam Generalizations

Most of passing a Microsoft exam is a matter of reading the questions extremely carefully and understanding what they’re asking for. No lie: I spent more time on the very first question than on any other, trying to decide what a particular word in the question meant. I definitely recommend marking questions and returning to them later if they take you more than a few minutes to answer. There were 42 questions in 120 minutes on my exam. I completed the test in a little over an hour, but going back to that first question and deciding what they really wanted as the result took the bulk of my time. Don’t worry about skipping questions – you’ll get the chance to come back and review anything you marked or skipped at the end of the exam. When in doubt, mark it and come back to it later. You’ll make up the time on the simpler, “gimme” questions, and there are quite a few of those as well.

Often, you can tell what the correct answer is just from context clues in the question. For example: you’re joining data from two tables, and you want a result set for the first table regardless of whether there are matching rows in the second table… it’s a good bet the correct answer is going to have a LEFT JOIN in it, which quickly eliminates any answers with an INNER JOIN. Process of elimination is a perfectly valid way to take this exam. Use it.
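
In code, that scenario looks like this (hypothetical tables):

-- Every customer, with order info where it exists, and NULLs where it doesn't:
SELECT c.CustomerId, o.OrderId
FROM dbo.Customers AS c
LEFT JOIN dbo.Orders AS o
    ON o.CustomerId = c.CustomerId;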

Hope that helps.

-David.

Speaking at SQL Saturday Cleveland (#SQLSat241)

I am delighted to be speaking at SQL Saturday Cleveland on Saturday, February 8th, 2014. My ever-popular (read: the one that keeps getting picked) presentation on database corruption, DBA 911, will be the topic, and I’m really looking forward to the lively discussion it usually sparks. Turns out people really do care a lot about being able to handle corruption issues. Who knew?

I hope to see you there.  If you haven’t registered yet, please do so at: http://www.sqlsaturday.com/241/eventhome.aspx.

Thanks,

-David.

SQL Saturday Nashville Recap and Session Feedback

Event Recap

First things first. Thank you so much to all the organizers, volunteers, speakers and sponsors that made SQL Saturday Nashville such an amazing and fun event. I will definitely be attending, if not volunteering and speaking again, next year. Special thanks to Tamera Clark (b|t) for all her hard work and dedication to the #sqlfamily.  Love ya, lady. :-)

Friday night’s dinner at Saffire was really cool. I went ahead and ordered from the menu, and the fish tacos were excellent. The appetizers were also good. The building is an old gas stove factory. When that company went out of business, it was refitted to be a mattress factory, then finally fell into disrepair, and it has recently been revived as a shopping center. There was a rug store in there with some amazingly beautiful area rugs. I wish it had been open so I could have gotten a closer look at some of the very intricate designs, but even if it had been, I was kind of late for dinner anyway. :-)

Saturday morning, I showed up bright and early to check things out and see if I could help out anywhere. Lipscomb University was a little different than what I was used to, but a little glancing around (and some helpful students) got me pointed in the right direction. I got to do a little work setting up one of the sponsor tables, then headed upstairs to the speaker room to make sure my laptop was fully charged. Unfortunately, I misheard the room number and went to the wrong room. I must admit that the university certainly puts out a tasty spread for their weekend classes. Oops. After finding the right room, I double-checked a couple of things, and headed out.

The first session I attended was Wendy Pastrick’s (b|t) on SQL Server isolation levels. While I admit I was working on my presentation most of the time, I did get a little inspiration for a couple of blog posts on locking that should be coming up in the next few weeks. The only negative in that presentation was that the projector screen in that room was quite literally on the floor. Sadly, there was no one in the facility who could hang it back up for us, so Wendy simply soldiered on. Considering she was presenting on a dim blackboard, I think she handled it well. If you get a chance, I suggest that you attend this presentation, especially for the demos. Audience participation is a fun thing.

Next, it was my turn to present. I’m happy to report that the room was full to the point of people standing and sitting on the floor in the back of the room. I love it when the room is full. I also love it when I get some really good questions and feedback and I got plenty of both this time. Thanks to everyone who attended. I’ll address some of the feedback at the end of this post.

I did my usual decompression back in the speaker room after my session, and then it was time for the Women In Technology panel. I was dealing with food coma, so I didn’t take notes as well as I would have liked to. There was a lot of good discussion, as usual, and it always pleases me how helpful people are in those sessions. There was one lady who was actively seeking a new job, and one who had positions to fill. I’m sure there was additional discussion between them, after the session had ended.  The main focus of the session was on inspiring the next generation of IT workers, and there was plenty of inspiration to go around. Thanks to Hope Foley (b|t), Arlene Grey (b|t), Wendy Pastrick (b|t), Rie Irish (t) and Julie Smith (b|t) for their guidance and participation on the panel.

After that, it was time to learn a little bit more about Master Data Management. There’s a big data governance push going on at work right now, so this is something I’ve been doing a lot of research on and playing with. David Mattingly’s (t) session was a good overview from the business side of things, and gave me more perspective on where things should probably be going here. We’ve been approaching it as a technical problem when it really is more of a business problem, and it should be directed as such, with IT’s support. I’ll probably get more into that in future posts, as well.

My last full session of the day was Janis Griffin’s (t) on the SQL 2012 system_health extended events monitor. I have relatively little experience with extended events, but I understand the basics, and it was good to get a view into what the system_health session is capturing, and how the information can be used.  Monitoring is a little… different in my current environment, and until we can get things a bit more standardized, all the information I can get is helpful, to say the least. :-)
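
If you want to poke at system_health yourself, this is one common way to pull its ring buffer data – my own sketch, not anything from Janis’ session:

-- Read the system_health session's ring buffer target (SQL 2012+):
SELECT CAST(xet.target_data AS xml) AS SessionData
FROM sys.dm_xe_session_targets AS xet
JOIN sys.dm_xe_sessions AS xes
    ON xes.address = xet.event_session_address
WHERE xes.name = N'system_health'
  AND xet.target_name = N'ring_buffer';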

The last session of the day was the panel discussion on leveling up your career with Allen White (b|t), Denny Cherry (b|t), Joey D’Antoni (b|t), and Kevin Kline (b|t). I say last session, but not last full session, since I had to leave early due to a slightly upset stomach. I’ve been sick for a couple of weeks now, and made very good friends with some very good OTC medication. Between that and the caffeine, I needed to stand up and walk around for a bit. Too bad, since the discussion was going really well. I will have to follow up with the participants at another time. Sorry, gentlemen. :-)

I really don’t know why more people don’t go to the after events. It’s such a wonderful opportunity for meeting other SQL professionals, and having some fun conversation, as well as some excellent food and drink. Not to mention the SQLKaraoke, without which, I probably wouldn’t have met many of the people I now call SQLFamily. This event was no exception.  If you have never gone to the after event, I encourage you to do so. You don’t have to sing and you don’t have to dance, but you should come and hang out with us. It’s fun.

Quick Session Feedback

I had just a couple of pieces of feedback I felt I should go ahead and respond to in this post. There was one other question raised that I intend to devote an entire post to, later.

“Slow down.” – Yes, this is totally valid criticism. When I made a joke about pulling the pin on a grenade as I downed a Starbucks Frappuccino at the beginning of the session, I wasn’t really kidding. That was a mistake, and I apologize. I have also been struggling a bit with how much of what type of content to include in the presentation. Right now, there’s too much, which leads to me rushing things. I’m going to trim it down for my next session in Cleveland, and hopefully that will make for a better presentation. Most people seem to enjoy the demos more than the slides anyway, so I’ll probably cut at least one of the slides and spend more time going into detail on the demos.

“You should enlarge the font to begin with so you don’t have to zoom in and out.” – The font is already enlarged. Believe me, you don’t want to try to read that at default size, at that resolution. However, there’s a very specific reason I zoom in and out, and this is most critical to this session when we’re talking about error messages. What I do with ZoomIt is not just show you a specific piece of information, but by zooming in, I’m also showing you *where* that specific piece of information is within a larger context. It’s like looking at a whole map, then zooming in to a particular spot. Now you not only get the detail on that particular spot, but you have a sense of where that spot is on the map, so it’s easier for you to find it again. I hope that makes sense.

Thanks for reading.

-David.

DBA 911 – Follow up Q & A

Thanks again to everyone who attended my web session, DBA 911 – Database Corruption, on January 7th. I had a blast spending time with you all. We must do that again, soon. :-)

In this post, I’m going to tackle some of the follow-up questions that we weren’t able to get to in the session. Some questions will require more explanation than others, so I’m going to handle them in separate posts.

First, let’s handle two of the most frequently asked questions:

Where can I get a script to automate running CHECKDB on my databases?

The best scripts for that, bar none, are Ola Hallengren’s. Check out http://ola.hallengren.com

Where can I get the slides and demo code for your presentation?

The slides and demo code are on my resource page, above. (https://dmmaxwell.wordpress.com/resources/)

Now, let’s get into the feedback questions – I’m going to just copy them out of the feedback form and paste them here. I’m going to make an assumption or two here and there, but if I don’t completely answer your question, please feel free to leave a comment below. Some of the questions answered in the presentation will be left alone, but some answers bear repeating, so I will do that here. Also, there are some questions that I want to go into a LOT more detail on, so I’m going to save a few for future posts, where I can include some demo code and explain more.

Here we go! :-)

Is the space taken in tempdb by the dbcc checks cleared after the command is run or does the dba have to manually clear the data?

Space used within TempDB for CHECKDB is cleared automatically by SQL Server. The DBA shouldn’t usually have to manually clear objects out of TempDB. Now, if the problem is that running CHECKDB causes you to run out of space for TempDB, then you need more disk space for TempDB, period. I also recommend pre-sizing TempDB, if at all possible, to the max size you need.
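
And if you’re worried about how much TempDB space a check will need before you run it, you can ask CHECKDB to estimate it for you (database name hypothetical):

-- Estimate the tempdb space required, without actually running the checks:
DBCC CHECKDB (N'SalesDB') WITH ESTIMATEONLY;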

Can we add log file with suspect database if it is suspected due to log file full.

A full log file – meaning a log file that cannot grow – does not put a database into SUSPECT mode. SUSPECT means that SQL Server is unable to read one or more of the database files at all, so you wouldn’t be able to add a log file anyway. A suspect database must either be put into EMERGENCY mode in order to correct anything, or restored from backup.

A full log file is a completely different problem. If your log file fills up on an online database, then yes you can add another log file for some temporary space. I would recommend finding out what caused the log file to grow, and either fixing that process or sizing the log file appropriately, then deleting the additional log file.  There’s no good reason to maintain a database with more than one log file.
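
If you do go the temporary-second-log-file route, the shape of it is something like this. The names, path and size below are made up, and note that the extra file must be empty before you can remove it:

-- Emergency breathing room: add a second log file temporarily.
ALTER DATABASE SalesDB
ADD LOG FILE
(
    NAME = SalesDB_log_temp,
    FILENAME = N'E:\SQLLogs\SalesDB_log_temp.ldf',
    SIZE = 4GB
);

-- Later, once the root cause is fixed and the temp file is empty:
ALTER DATABASE SalesDB REMOVE FILE SalesDB_log_temp;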

monitoring the severity alerts > 20, we are monitoring these alerts: 823,824,825 and 832?

For 823, 824 and 832, yes, since those are high-severity alerts. However, 825 is logged quietly as a Severity 10 “Informational” message, so you won’t catch it if you are only monitoring Severity 21 and up. One other reason to monitor specific error numbers outside of just monitoring the severity levels is that you can use custom notification text for each error. For the 823 alert on my servers, I include something like, “This indicates that SQL Server was unable to read an 8k page off of the disk, and that the database is damaged. Please contact the DBA on call to complete a consistency check of this database immediately.”  That way, if I’m not around, someone will see that and know it’s critical – and get a hold of the on call DBA.
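
For illustration, an alert keyed to the message number rather than a severity level looks something like this – the alert name and notification text are just examples, not what I run in production:

EXEC msdb.dbo.sp_add_alert
    @name = N'Error 825 - Read-Retry Required',
    @message_id = 825,
    @severity = 0,    -- 0 means: fire on the message number, not a severity level
    @include_event_description_in = 1,
    @notification_message = N'SQL Server needed multiple attempts to read a page. The I/O subsystem is failing. Contact the on-call DBA.';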

how much extra overhead is there with having checksum on for a database

Little to none. There is a slight CPU overhead to calculating the checksum. Performance-wise, however, most servers are IO-constrained rather than CPU-constrained. For probably 99.9999% of systems out there, enabling page checksums just isn’t going to be noticed.
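
Checking and changing the setting is trivial (database name hypothetical). Just remember that existing pages only get a checksum the next time they’re written back to disk:

-- Which verification option is each database using?
SELECT name, page_verify_option_desc
FROM sys.databases;

-- Turn page checksums on:
ALTER DATABASE SalesDB SET PAGE_VERIFY CHECKSUM;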

I may have missed this but when should you run the datapurity command?

The DATA_PURITY option should be specified when the data purity flag is off for that database.  An example would be a database that is created in SQL Server 2000, then upgraded to 2005 or a later version. In SQL Server 2000, there were no data purity checks. In SQL 2005 and up, the data purity flag was introduced.  To find the data purity flag state, you can use the DBCC DBINFO command, which is included in my demo scripts. Once you have specified DATA_PURITY for a database, that flag will be turned on, so you will get data purity checks automatically going forward, without having to specify that option.

Now, if you’re trying to save time or TempDB space by turning DATA_PURITY off, then you want to specify PHYSICAL_ONLY. Be aware, when you do that, you’re deliberately choosing not to check for logical corruption, but physical readability only. It’s the old speed vs. accuracy trade-off.  I’m not saying one or the other is better, just that you need to understand your options and have reasons for choosing them.
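
For reference, those two commands look like this, with a hypothetical database name. Keep in mind that DBCC DBINFO is undocumented, so treat its output accordingly:

-- Look for the dbi_dbccFlags field: 0 = no purity checks yet, 2 = purity checks on.
DBCC DBINFO (N'SalesDB') WITH TABLERESULTS;

-- Force a full purity check, e.g. on a database upgraded from SQL 2000:
DBCC CHECKDB (N'SalesDB') WITH DATA_PURITY;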

What time will drinks be served?

May I see some ID, please?

What is the licensing implication when you run DBCC on non production environment?

None – you run it wherever you want to. You can copy a production database to a Developer edition server and run CHECKDB on it with no licensing impact.

in What situations is used DBCC UPDATEUSAGE ?

UPDATEUSAGE is a command that verifies that allocated space in the database is reported correctly. This has to do with the database catalog views, rather than with corruption or with CHECKDB specifically. If you’re seeing odd values from sp_spaceused, or if database objects are frequently changed with CREATE, ALTER or DROP statements, then you may want to run this. It does not, however, indicate a corruption issue. I recommend reading the BOL entry for UPDATEUSAGE at http://technet.microsoft.com/en-us/library/ms188414.aspx
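
For reference, with hypothetical names:

DBCC UPDATEUSAGE (N'SalesDB');                  -- correct usage reporting for a whole database
DBCC UPDATEUSAGE (N'SalesDB', N'dbo.Orders');   -- or for a single table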

Do we need to run CHECKDB after a restore or repair?

If you have the time to run it after a restore, then yes, I would definitely do so. And you should definitely run CHECKDB after doing a repair, especially if you have run with REPAIR_ALLOW_DATA_LOSS. You never want to say to your boss, “Well, I think the database is OK now.” You want to be sure. If you have downloaded my demo scripts, you’ll see that I take additional backups both before and after every repair, and run CHECKDB again after every repair. I show this because I think it’s a good habit to have. Your situation may vary, but I would consider this a best practice. Not only that, but sometimes you need to run multiple repairs, and making it a habit to run CHECKDB after doing one will allow you to catch anything else that has gone wrong. For an example, see my post on Emergency Mode Repair.
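
To make the shape of that habit concrete, here’s a rough sketch – all names and paths are hypothetical, and this is not a substitute for the full demo scripts:

-- Backup BEFORE you touch anything:
BACKUP DATABASE SalesDB TO DISK = N'E:\Backup\SalesDB_PreRepair.bak';

-- Repair requires single-user mode:
ALTER DATABASE SalesDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB (N'SalesDB', REPAIR_ALLOW_DATA_LOSS);
ALTER DATABASE SalesDB SET MULTI_USER;

-- Verify the database is actually clean now:
DBCC CHECKDB (N'SalesDB') WITH NO_INFOMSGS;

-- And a backup of the repaired state:
BACKUP DATABASE SalesDB TO DISK = N'E:\Backup\SalesDB_PostRepair.bak';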

That’s all for now. :-) If I have not answered your question, remember that I’m going to be handling a few other questions in separate posts. Once I’m done, if I still haven’t answered your question, or you have additional questions, feel free to leave a comment, email me, or send me a question on Twitter. I’m always happy to help.

Thanks for reading.

-David.