SQLAgentMan Contest #1
I originally planned this post to just vent about a vendor doing something evil to their own database hosted on a shared instance of SQL Server I administer. Then I thought I might be able to make a contest out of it… So, what caused me to come up with this lame stunt to drive traffic to my blog? Well… you tell me!
I just completed an upgrade in our test environment of a commercially-developed product that is used to <REDACTED TO PROTECT THE INNOCENT>. The vendor's standard process is to run a series of *.sql scripts against their database via a batch file, driven by parameter values hard-coded into the batch file. I strongly suggest that in most circumstances you do not allow this process in your environment, even with a trusted software vendor. As a matter of fact, the only time I advocate allowing it is when:
- This is a dedicated database environment for their product only.
- You've taken a backup (full or differential, depending upon size and activity) of every user database on the instance (see the sketch after this list).
- You've also taken a full backup of your master and msdb databases.
- Your resume is current as of this morning.
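For those backup bullets, here's a minimal sketch of what I mean. The database name and backup path are hypothetical; substitute your own, and swap in WITH DIFFERENTIAL where size and activity warrant it:

```sql
-- Full backup of the vendor's database before anyone touches it
-- (database name and path are hypothetical).
BACKUP DATABASE VendorProductDB
TO DISK = N'X:\SQLBackups\VendorProductDB_PreUpgrade.bak'
WITH INIT, CHECKSUM, STATS = 10;

-- And the system databases, since vendor scripts have been known to
-- wander outside their own sandbox.
BACKUP DATABASE master
TO DISK = N'X:\SQLBackups\master_PreUpgrade.bak'
WITH INIT, CHECKSUM, STATS = 10;

BACKUP DATABASE msdb
TO DISK = N'X:\SQLBackups\msdb_PreUpgrade.bak'
WITH INIT, CHECKSUM, STATS = 10;
```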
So, what do I recommend? Simply tell the vendors they can't update their products, ever. It makes your life far easier and really helps with time management. Seriously, though: ask the vendor to supply you with the scripts so you can review them ahead of time, verify they're not doing anything that is an affront to proper database management (or to their upbringing as good citizens of the world), and then offer to run the scripts manually in lieu of executing the batch file. If the vendor insists on the batch file approach, perform your review, confirm that their rights are sufficient (and assert that "sufficient" most likely does not mean System Administrator), and then allow the vendor to execute the batch file.
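If you want a quick way to check whether the vendor's login is carrying rights it doesn't need before that batch file runs, something like this works. This is a minimal sketch; the login, user, and database names are all hypothetical:

```sql
-- Is the vendor's login a sysadmin? (It almost never needs to be.)
SELECT IS_SRVROLEMEMBER('sysadmin', 'VendorAppLogin') AS is_sysadmin;

-- If that returns 1, consider scoping them down to db_owner on their
-- own database only:
EXEC sp_dropsrvrolemember @loginame = 'VendorAppLogin', @rolename = 'sysadmin';

USE VendorProductDB;
EXEC sp_addrolemember @rolename = 'db_owner', @membername = 'VendorAppUser';
```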
Why do I advocate this level of micro-management of the database instance?
- I am an ___hole.
- I enjoy taking time out of my schedule to QA code that is already under an SLA with another entity, one we're paying obscene amounts of cash (as well as 30 pints of fetal panda blood a month).
- I have a bias against most software vendors I've dealt with.
- Hard-coding parameterized security credentials in a batch file is wrong on a scale greater than spandex shorts on a shirtless Abe Vigoda. (Bing "Abe Vigoda," kids, if you don't know who I'm talking about.) See the sketch after this list for what I mean.
- It saves headaches in the long run.
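To make that fourth bullet concrete, here's the kind of thing I'm talking about (hypothetical values, not the vendor's actual file): a plain-text sa password sitting in a batch file, versus the same call over a trusted connection with no credentials stored anywhere:

```bat
REM The bad pattern: credentials hard-coded in plain text (hypothetical values).
sqlcmd -S MYSERVER -U sa -P SuperSecret123 -i upgrade_step1.sql

REM Better: a trusted (Windows-authenticated) connection; no password in the file.
sqlcmd -S MYSERVER -E -i upgrade_step1.sql
```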
While the first bullet is correct a fair amount of the time, it is moments like today when you reap the benefits of the extra work put into reviewing code that otherwise would have been run against your RDBMS without any oversight from the Database Administrator. While I am not going to tell you right now what the offending line of code I ran across was, I will tell you a little something about what the calls coming in from users would have been like over the next few days. Comment on what you think the offending line of code was and you'll win a mystery prize of untold worth. I'll choose a single winner from all correct answers submitted on the blog by midnight, July 10, 2009. The winner will be notified by email.
Here are the “calls” I could have received:
- “I’m getting a timeout when I try to connect to my database.”
- “My application is running really slow.”
I know, not much to go on, but many of you out there probably already know the SINGLE SQL COMMAND CAUSING THIS. I will not judge you on syntax, never fear.
Remember, this is for a mystery prize of untold worth, delivered to your front door. So, what was I able to successfully avoid running today that would have spelled nasty performance and periodic timeouts against the specific SQL Server database that was upgraded an hour ago?