SQL Server Monitoring

I knew it was only a matter of time before I heard, “Erik, please find a third party tool that can monitor our database for performance issues and blocking.” The time is now. The way I see it, I have about three options.

  1. Create a custom script to monitor for blocking
  2. Run a SQL Profiler trace during high usage hours
  3. Research SQL Monitor (a Redgate tool) to monitor my databases. This would be getting a sharper, more expensive axe.

I have made numerous suggestions on how to speed up the databases. However, keep in mind that with vendor-supported databases, those databases are their babies. Asking them to change something, even though they are getting paid by your company, is like asking a mother to cut her daughter's long, beautiful blond hair to rival Sinead O'Connor's. There have been weekly calls with the vendor to discuss how we can improve performance. Right now they are focusing their time and energy on proving it's a network/end-user PC issue. Since I believe that is a waste of time, I know they will eventually come back to the DBA asking how we can improve performance. Therefore, my new task could not come at a better time. My new goal is to find a way to kill two birds with one stone. Let the research begin.

First, a script that will monitor for blocking:

NOTE: Before you do this you will need to Configure Database Mail on the server.
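If you would rather script that setup than click through the Database Mail wizard, a minimal sketch looks something like this. The account name, email address, and SMTP server below are placeholders; swap in your own values, and keep the profile name consistent with what you use in the script further down.

-- Database Mail is off by default; turn the feature on first
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Database Mail XPs', 1
RECONFIGURE

-- Create an account, a profile, and tie the two together (names and addresses are placeholders)
EXEC msdb.dbo.sysmail_add_account_sp
@account_name = 'DBA Alerts',
@email_address = 'dba@yourcompany.com',
@mailserver_name = 'smtp.yourcompany.com'

EXEC msdb.dbo.sysmail_add_profile_sp
@profile_name = 'ProfileNameFromDBMailConfiguration'

EXEC msdb.dbo.sysmail_add_profileaccount_sp
@profile_name = 'ProfileNameFromDBMailConfiguration',
@account_name = 'DBA Alerts',
@sequence_number = 1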

Add this script to a SQL Agent job and have the job run however often you feel is necessary.

DECLARE @Email varchar(max)

-- Build a line of detail for each blocked/blocking pair found in sysprocesses
select @Email = 'SPID ' + convert(varchar(4), B.spid) + ' is blocking SPID ' + convert(varchar(4), A.spid) + char(13) + char(10)
+ 'Transaction Information: ' + case B.open_tran when 1 then 'Open Transaction' else 'No Transaction' end + char(13) + char(10)
+ 'Waittime: ' + convert(varchar(20), A.waittime) + char(13) + char(10)
+ 'Blocked User: ' + convert(varchar(20), A.loginame) + char(13) + char(10)
+ 'Blocking User: ' + convert(varchar(20), B.loginame) + char(13) + char(10)
+ 'Command: ' + convert(varchar(20), A.cmd) + char(13) + char(10)
from master..sysprocesses A join master..sysprocesses B on A.blocked = B.spid
where B.spid <> A.spid

-- Only send mail if the query above found blocking
IF (@@ROWCOUNT > 0)
BEGIN
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'ProfileNameFromDBMailConfiguration',
@recipients = 'you@YourEmail.com; someone@anotheremail.com',
@body = @Email,
@subject = 'ServerName - Blocking Occurring!!!',
@body_format = 'TEXT' -- TEXT so the char(13)+char(10) line breaks show up in the email
END

You do need to keep in mind that blocking is normal. If you have this job run every 5 minutes you will see periodic blocking. However, by the time you get into the server the block may have already cleared. Essentially this script is a good starting point for monitoring. I like to run the script every five minutes. If I get an email I will read it, but I will not take action until I see another one.
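For reference, here is a rough sketch of scripting that job out instead of creating it through the SQL Server Agent GUI. The job and schedule names are made up, and the @command below is just a placeholder for the full blocking script above.

USE msdb

EXEC dbo.sp_add_job @job_name = 'DBA - Blocking Alert'

-- The command should contain the full blocking/Database Mail script from above
EXEC dbo.sp_add_jobstep
@job_name = 'DBA - Blocking Alert',
@step_name = 'Check for blocking',
@subsystem = 'TSQL',
@command = '/* paste the blocking script here */'

-- freq_subday_type 4 = minutes, so this runs every 5 minutes, all day, every day
EXEC dbo.sp_add_schedule
@schedule_name = 'Every 5 Minutes',
@freq_type = 4,
@freq_interval = 1,
@freq_subday_type = 4,
@freq_subday_interval = 5

EXEC dbo.sp_attach_schedule
@job_name = 'DBA - Blocking Alert',
@schedule_name = 'Every 5 Minutes'

EXEC dbo.sp_add_jobserver @job_name = 'DBA - Blocking Alert', @server_name = '(local)'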

Next blog: Run a SQL Profiler trace during high usage hours. We will kick off one of the lowest-impact Profiler traces you can run on a server.


Vendor supported databases

Here in my profile I title myself as a blogger and look, I have not blogged since April. Sorry to the five people who may actually read my blog.

OK... so there was a DBA, the Customer, and the Vendor. Sorry, have you heard this one?

As a SQL Lumberjack you need to appreciate how very interesting, challenging, rewarding, and overall exciting it can be when you're handed a project that will ultimately be supported by a vendor. If you can't find that emotional connection, I'm afraid your project, which will ultimately be a supported application, will continue to haunt you every time an issue arises. I was recently handed a project to get SQL Server set up so the vendor could install a database. Here were my requirements for the project, keeping in mind I had to work with the vendor through each step.

  1. Install SQL Server 2008 R2 on a virtual box
  2. Create an extract from another database to feed profile information for the new database
  3. Create an SSIS package
    1. Create an OLE DB Source which houses the SQL command from the previous requirement.
    2. Push the extract to a Flat File Destination
    3. Extract as a text file in tab delimited format
    4. Schedule a job to run the SSIS package twice a day
  4. Create maintenance plans for the database

Regardless of whether you're the DBA of a home-grown database or a vendor-created database, your ultimate job does not change. You are the Default Blame Acceptor if something goes wrong. Remember, your company is paying this vendor money. Don't be afraid to light up the vendor if you disagree with something. They can create the database, the schema, the indexes, and the views. That's fine. YOU create the maintenance plans, you create the optimization plans, you make the calls on how best to increase performance on the database. Here is where the art comes in: don't be afraid to light them up, but you still need to sell them on why it's best to set up the database the way you want.

When my vendor created their database and it popped up on my server, I found things that were wrong; it was not my standard setup for a database. I took action immediately. The recovery model was not full, auto shrink was turned on, and the compatibility level was SQL Server 2000, just to name a few. The first thing I did was send a detailed email explaining why I wanted to change these options. Also, always try to avoid the middle man. You get level one on the phone, who knows nothing about SQL Server, and they will end up screwing up your details. Just tell them you need to talk to the database developer.
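For reference, the changes I asked for boil down to a few ALTER DATABASE statements. The database name here is a placeholder, and compatibility level 100 assumes the SQL Server 2008 R2 instance mentioned above.

-- Full recovery so transaction log backups (and point-in-time recovery) are possible
ALTER DATABASE [VendorDatabase] SET RECOVERY FULL

-- Auto shrink just fragments the database and wastes I/O
ALTER DATABASE [VendorDatabase] SET AUTO_SHRINK OFF

-- Bring the compatibility level up from SQL Server 2000 (80)
ALTER DATABASE [VendorDatabase] SET COMPATIBILITY_LEVEL = 100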

In the end, if you keep the control in your court you will have that emotional connection to the database.


Creativity in Database Backups

That's right, it's time to go all Martha Stewart on this DR plan. And by that I do not mean cheat the system and call your co-workers worthless humans.

I am about to discuss backup striping and whether it is best to send large backups across the network just to keep them off your local system.

“Marley was dead, to begin with. There is no doubt whatever about that.” Now I am not Marley, but I am a SQL Lumberjack, and I was faced with an interesting scenario that needed immediate attention. I needed to create a DR plan and I needed to find a place to keep my backups. The full backup I was working with was about 60 GB on average. I called up my NAS dudes and was told “we don't have the space for a full backup that size.” Ahh, but you do have some space. This is where your DBA creativity skills need to come into play.

Backup striping is when a backup is split into roughly equal chunks across multiple devices, but it needs to stay on either tape or disk. It is my understanding that it cannot be on both; striping is supported for a single media type only. So we are taking our full backup and slicing it into equal parts. This will also increase your backup throughput. You as the DBA will not have control over how the data is divided. SQL Server will attempt to keep the parts equal, but if they come out different sizes, do not be alarmed. This is determined by the speed of the disks; slower disks may cause bottlenecks and create unequal chunks of data. It is recommended that you keep your chunks equal, so try to keep your backups on disks that are similar in speed and performance. Additionally, if you are going to be creative and are limited on space, you can send your chunks of data to different servers. Yes, this can make recovery more difficult, but at least there will be a way to recover.

What is a SQL blog without some scripting, right?

backup database [sqllumberjack] to
disk = 'd:\temp\sqllumberjack_1.bak', -- D: and E: backups are staying on the local system
disk = 'e:\temp\sqllumberjack_2.bak',
disk = N'\\2ndServername\D$\sqllumberjack from dbhostserver\sqllumberjack_3.bak', -- these are being sent to another location
disk = N'\\2ndServername\D$\sqllumberjack from dbhostserver\sqllumberjack_4.bak'
with noinit

Files sqllumberjack_1.bak and sqllumberjack_2.bak will be similar in size, and files sqllumberjack_3.bak and sqllumberjack_4.bak will be similar in size. However, the first pair may be significantly larger than the second pair.

Either way, drive D may run out of space, and you will get an error before the backup completes if drive D has limited space. If you need the files evenly distributed across the disks, you need to use disks that have comparable I/O throughput.

The script above brings me into my next piece.

Is it best to send large backups across the network just to keep them off your local system? So what is a large backup? That is for you, the DBA, to decide. As a SQL Lumberjack you may have a slow network, with an old, slower server that is already limited on space. I would recommend keeping any backup larger than 2 GB on your local server IF and ONLY IF your server dudes are backing up the server nightly. This can reduce the I/O load across the network. We need to remember that as SQL Lumberjacks we do not have another server readily available. There is no HA environment; we rely strictly on DR plans that do not include HA. Mirroring? PLEASE! “Can I get a witneeess!” No, you cannot. We are straight up gangsters. These are non-critical applications and non-critical databases, after all. Any company will spend the money on HA if it's a critical database. When a company bids out services, if they are bidding on a critical application they will adjust to include HA, or at least they should.

No backup server readily available. Worst case scenario, the server catches on fire. Even if you send your backups across the network to another server, you do not have a server to recover to. Your company has to buy a new one. What is the difference if your backups are local or in a data center somewhere? Your server dudes have been backing up the server to tape. Those tapes will be used to recover the server. Even if just your database takes a dive, you can still rely on those tapes to recover.

I know, I know, point-in-time recovery. Your database has many transactions. You should send your t-logs across the network to your data center. T-logs are small; they will not slow down the network. T-logs are your point-in-time recovery if you have to recover from the tape.
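A minimal sketch of what shipping those log backups might look like, assuming a made-up network share and a database in the full recovery model:

-- Ship the transaction log to the data center; in a real job you would stamp the file name with a date/time
BACKUP LOG [sqllumberjack]
TO DISK = N'\\DataCenterServer\SQLBackups\sqllumberjack_log.trn'
WITH NOINIT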

Chances are you will not even touch the full backups in the data center.


Moving a SQL Database from 2000 to 2005. Best Option?

Recently I was asked to move a database from SQL Server 2000 to a SQL Server 2005 instance. This of course creates two questions: what process will I use to move the data, and why not SQL Server 2008? I will spend a little bit of time explaining the two best options for moving a database. As for the second question, I'm a SQL Lumberjack. I use the axe I'm given.

The two options I think are the best are either to restore your database from the most recent full backup or to run a detach/attach process. Both of these options have their pros and cons. In fact, as a SQL Lumberjack you may not have a choice. The customer may set forth requirements that force you to go one way or the other. However, as a DBA you are the “Default Blame Acceptor”; if something goes wrong, the finger will be pointed at you. Don't be afraid to say something and provide “constructive criticism” to your manager. Any DBA has to be able to manage their manager. A manager needs to be a DBA's friend, not an enemy. It is important for a DBA to create and maintain a good working relationship with their manager, and with the other managers in the organization. Your manager hired you for your expertise and your professionalism. If you believe one process is better than the one set forth by the customer, then explain that to your manager and let them know you would be happy to give a presentation to the customer to prove your point. In the end it will be a “value add” to the customer and your department for showing concern for the client.

Option 1: Restore from a full backup. This is also an excellent opportunity to find out whether your backups truly are successful backups.

This is a good option if you are moving the database during working business hours, but ONLY if you are moving to a new server that is not yet in production. Once the database is restored on the new SQL 2005 server, you will want to update the compatibility level and update statistics. We need to update the statistics because statistics are not automatically updated during the conversion process. If you don't manually update the statistics, you may find that your queries run slowly because of incorrect statistics. This will also bring your indexes up to a SQL Server 2005 level.
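Here is a bare-bones sketch of Option 1. The file paths and logical file names are placeholders (check yours with RESTORE FILELISTONLY first), and compatibility level 90 is SQL Server 2005.

-- Restore the most recent full backup onto the new SQL Server 2005 instance
RESTORE DATABASE [DatabaseName]
FROM DISK = N'D:\Backups\DatabaseName_Full.bak'
WITH MOVE 'DatabaseName_Data' TO N'D:\Data\DatabaseName.mdf',
MOVE 'DatabaseName_Log' TO N'E:\Logs\DatabaseName_log.ldf',
RECOVERY

-- Bump the compatibility level to SQL Server 2005
EXEC sp_dbcmptlevel 'DatabaseName', 90

-- Refresh statistics so the 2005 optimizer is not working from stale numbers
USE [DatabaseName]
EXEC sp_updatestats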

You will want to change the compatibility level and update statistics regardless of which move option you choose.

Option 2: Running the Detach and Attach process.

This is, I think, the best option if you will be doing the move during off hours. The reason you will do it during off hours is that you will be dropping the database from the SQL 2000 instance and therefore taking it offline. With the restore-from-backup option, by contrast, you can simply take your most recent full backup and restore it to the new server, which is NOT the production box yet. Before you run the detach process, be sure, as always before you do anything big to a database, that you have a good backup. Now perform the following pre-detach steps.

  1. Execute the following query to ensure no processes are running against the database in question

select * from sysprocesses a
inner join sysdatabases b on a.dbid = b.dbid
where b.name = 'DatabaseName'

  • If this displays any running DatabaseName processes, they will need to be killed prior to detaching the database.
    • Document the SPID number
    • Execute KILL 53 (53 is an example of a SPID)
  2. Execute the following queries to ensure no active user is running a DatabaseName process

select * from sysdatabases

  • Record the dbid for DatabaseName

USE master

EXEC sp_who '6' -- the '6' specifies the process ID

  • If a login name is displayed with a user name or Domain\USERNAME, that person will need to be notified that DatabaseName will be dropped and they need to save all their work.
  3. Expand Databases and select DatabaseName.
  4. If the database is in use, restrict access to a single user:
  • Right-click the database name and point to Properties.
  • In the Select a page pane, select Options.
  • In the Other options pane, scroll down to the State options.
  • Select the Restrict Access option, and in its drop-down list, select Single.
  • Click OK.

A message box appears to inform you that this action will close all connections to the database. To proceed, click OK.
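If you would rather skip the GUI, the same restriction can be applied with one statement (WITH ROLLBACK IMMEDIATE rolls back and disconnects anyone still in the database):

ALTER DATABASE [DatabaseName] SET SINGLE_USER WITH ROLLBACK IMMEDIATE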

You are now ready to run the detach and attach process. Be sure to update your ODBC connections as well.
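A bare-bones sketch of the detach and attach itself, with placeholder file paths (copy the .mdf and .ldf files to the new server between the two steps):

-- On the SQL Server 2000 instance
EXEC sp_detach_db 'DatabaseName'

-- Copy the data and log files over, then run this on the SQL Server 2005 instance
EXEC sp_attach_db 'DatabaseName',
N'D:\Data\DatabaseName.mdf',
N'E:\Logs\DatabaseName_log.ldf'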

Feel free to email me or comment on this post if you would like more detailed steps or scripts for either of these two options.


Moving a SQL Database from 2000 to 2005. Best Option?

This will be the topic of my next blog. I'm testing these options now:

  1. Restore from last backup
  2. Detach & Attach Process

I know, I know, you're excited.

Also, WHY not SQL Server 2008? Remember, I'm the SQL Lumberjack. I will usually be discussing topics where, like at most companies, you may not have the latest and greatest toys.


The SQL Lumberjack … Really ?!?!?!

Well, here it is, my first post on The SQL Lumberjack blog site. Why I call myself the SQL Lumberjack I will explain. First, why did I create this blog?

I started reading Brad McGehee's book How to Become an Exceptional DBA. One of the first things you read is that an exceptional DBA will share his or her knowledge with co-workers and the SQL Server community. Well, here you go.

So….SQL Lumberjack REALLY!

The name of this blog boils down to one thing. As a DBA, whether at a small company or a large one, you will encounter access restrictions or resistance that makes doing your job hard. It's not called “work” because it's easy. So my question to all SQL DBAs: have you ever felt like a lumberjack without an axe? I think most DBAs would say yes.

Wikipedia Definitions:

A lumberjack is a worker in the logging industry that performs the initial harvesting and transport of trees for ultimate processing into forest products. The term usually refers to a bygone era (before 1945) when hand tools were used in harvesting trees principally from virgin forest. Because of its historical ties, the term lumberjack has become ingrained in popular culture through folklore, media, and spectator sports. The work was difficult, dangerous, intermittent, low-paying, and primitive in living conditions, but the men built a traditional culture that celebrated strength, masculinity, confrontation with danger, and resistance to modernization.

A database administrator (DBA) is a person responsible for the design, implementation, maintenance and repair of an organization’s database. They are also known by the titles Database Coordinator or Database Programmer, and is closely related to the Database Analyst, Database Modeler, Programmer Analyst, and Systems Manager. The role includes the development and design of database strategies, monitoring and improving database performance and capacity, and planning for future expansion requirements. They may also plan, co-ordinate and implement security measures to safeguard the database.

My Definition:

A SQL Lumberjack is an individual who is responsible for the design, implementation, maintenance, repair, and harvesting of virtual data and virtual logs (pun intended) in a SQL database. The term usually refers to a DBA whose hand tools are virtual and sometimes as dull as a spoon. It can be difficult to harvest logs without the right tools or access; therefore, a SQL Lumberjack, like a lumberjack, could be lost without an axe. The work can be difficult but rewarding, high-paying in most cases, not at all dangerous, and the living is most likely good. These people have traditionally built a culture that celebrates intelligence and geeks (which, FYI, can be a term of endearment), confronts danger with awkward techy jokes, and loves changes in technology but, in SQL Lumberjack attitude, silently curses their manager when turned down for a sharper axe.
