No, I haven't lost my job. This is just one of those topics that's good to get out there. There are honestly few things that can cause you to question whether you'll still have a job the next day, but those things do exist. There are ways to mitigate RGEs (résumé-generating events), but do you know what they are in your area?
Service Level Agreements (SLAs)
Gathering information on exactly how important each database is needs to happen before a real disaster. Disaster recovery planning relies on knowing your SLAs. You need to know how long a database can be down, which one is the most important, at what point you have to start calling third-party people (customers, vendors, app support), how much data you can afford to lose, and at what point you should stop fixing the broken server and fail over to other hardware. I know, I know... no one likes giving these answers. My experience normally goes like this.
Question: How long can the database be down?
Answers: It should never be down. As little as possible. Why? Are we having issues?
Question: Which one is most important?
Answers: They're all important. Can't you bring them all up at once?
Question: What point do I start calling 3rd party people?
Answers: What's wrong now? That's case by case, call me first.
Question: How much data can we afford to lose?
Answers: None. None. None. Why? What have you done? None. We should never lose data.
Question: What point should I just stand up a new server?
Answers: We don't have spare servers. Why would you need a new one?
What can I do now?
Well, we can take some preventive action. Some of this is harder than you'd expect without first knowing what your SLAs truly are. Here are a few things you can do today to help until you get these answers.
Find where your backups are stored.
Make sure the backups are stored on a different physical medium than the databases.
Make sure you test your backups occasionally to see if they're even good.
Make sure you have all your drivers for anything that's not standard.
Keep a log of what all databases are on a particular server.
Keep a log of the average size (uncompressed) of your databases per server.
Keep a log of the standard settings you might use for that server (RAM, drive structure, version number).
Update the phone logs or at least your personal contacts with everyone you need to call if a 2AM incident happens.
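Several of the checklist items above can be scripted. Here's a minimal T-SQL sketch using the msdb backup history views that ship with SQL Server; the backup file path in the last line is a placeholder you'd swap for one of your own files:

```sql
-- When was each database last backed up in full?
SELECT d.name AS database_name,
       MAX(b.backup_finish_date) AS last_full_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
       ON b.database_name = d.name
      AND b.type = 'D'                 -- 'D' = full database backup
GROUP BY d.name
ORDER BY last_full_backup;

-- Spot-check that a backup file is at least readable (path is hypothetical):
RESTORE VERIFYONLY FROM DISK = N'X:\Backups\MyDatabase.bak';
```

Keep in mind VERIFYONLY only proves the file is readable; the real test is actually restoring it to a scratch server now and then.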
Is there some sort of form that can be used?
My next post will include a list of the questions I'd want answered for each database as well as a short list of questions I need to ask myself. Having a printed list for each database, or set of databases if they have the same requirement, can be a career saver.
I plan on making a form to make this a bit easier. I will, at the very least, create an Excel or Word list with examples. I think this is good to have for everyone from your highest Sr. DBA to your multi-hat Network Admin who's being forced to manage a rogue database. Having this signed off by your boss may make the difference in keeping your job during a major outage. A little CYA never hurt anyone.
28.4.14
14.4.14
5 Years Running! OKCSQL!
I want to congratulate OKCSQL on a 5-year run. I personally wasn't here for that whole duration, but we have leaders and members who have been. We're still growing and have plenty of room for new members. We have an awesome group! We even have some fantastic sponsors. GDH has been a sponsor for our group the entire 5 years. This month we have additional sponsors as well: Principal Technologies, Redgate, and Dell.
Jen and Sean McCown are speaking in person for us! They're even doing a special doubleheader for us. Here is a quick copy and paste from the OKCSQL site:
"Sean McCown is our first in-person speaker. The title of his talk is DIY Performance Reporting; it will start at approximately 6:15 and will cover:
Stop relying on vendors to provide you with performance data. Between Windows and SQL Server you've already got all you need to collect and report on server performance. And it's far more flexible than you'll ever get from a vendor. This is often called a poor man's method, but it's so much more than that. I'm going to show you what your options are for collecting performance data for free, and you'll even walk away with a framework you can plug into your own environment and start using tomorrow with very little effort.
Jen McCown is our next in-person speaker. The title of her talk is How to Build a SQL Solution; it will start at approximately 7:15 and will cover:
In this session, you'll learn about SQL Server stored procedures (SPs): what they are, when and why you'd use them, and how you'd go about developing a solution with one. We will address common SP myths and learn about planning for performance. Most of all, we'll walk through examples to explore the process of solution building in SQL Server."
This is a completely free event. We have prizes, pizza, and great training. Everyone is invited. Please come visit us and get some free training. We meet every second Monday of the month. I hope to see you there!
7.4.14
I'm Speaking at SQL Saturday Houston, May 10th.
I will be speaking at my first SQL Saturday. I'll be in Houston May 10th. I'm rather surprised, to be honest. No, I don't have that sinking feeling that my presentation isn't good enough or that I'll show up at the event in my underwear with everyone laughing... those fears will come the few nights before. Honestly, it's just because this is the first SQL Saturday I've ever submitted to. I expected to see rejections the first few times.
I have submitted to 4 SQL Saturdays this year. I will be submitting to a 5th once they solidify the date. My hope was to speak at at least one event, so I'm rather ecstatic. I would be happy to present at each and every one of them. (I'm sure the shine will wear off in time.) My presentation will be on Fill Factor: Performance or Nuisance? The premise behind this presentation is not a deep dive into how the internals work. It's built around why the changes matter, how they impact the system, and how they can or cannot help.
I'd be happy to see you all out there! I'm hoping to get out and meet more people. So far my interaction with SQL Saturdays has been making sure the rooms are ready, tables are set up, cookies are in the speakers' room... that sort of thing. It'll be a little strange seeing it from the other side.
I've written before about the list of things I'm taking with me to make sure everything works. My list has both grown and shrunk. Below is what I'm planning on taking. If you have any suggestions, I'd be more than happy to listen!
Presentation Items:
Laptop and power cord.
Spare laptop (eventually something more like the Surface Pro... but for now just a spare laptop)
DVI to VGA converter
USB to DVI/HDMI converter
USB backup of my presentations, databases and installs
External mouse/Presentation mouse (I still need to pick one up... and soon)
Wireless Hotspot (Just in case)
Print outs of the slides in case the projector has issues( Thanks Andy Yun!)
Travel Items:
Extra complete change of clothes including shoes. Changes of clothes = Days Traveling +1
Travel toothpaste/brush
Headphones
Power inverter (car)
Refrigerated cooler (car)
Travel pillow (plane)
Backpack
Bathing suit
Cash: Tolls + 1 Tank of gas
I think that about covers everything. I'm not entirely sure. Any other suggestions?
31.3.14
SQL 2014 Release Date, April 1st
I'm not a fan of anything releasing on April Fools' Day, but that's not for me to decide. I personally cannot wait to upgrade to 2014. I want to see the full power behind PCIe-based SSDs pushing extended RAM. I want the security of having a server go down without it taking out both sides of the Availability Group; if you haven't heard, when one goes off, it shuts them both down. I want to rebuild my partitioned indexes online. I have lots of wants, but how long will I have to wait to install?
I'm torn here. An old theory was to wait for Service Pack 1 before installing anything by Microsoft. Did we replace Service Packs with new versions? How long will we have to wait for the next "version"? I hope I'm wrong here. I do love new features. I love upgrades. I do not like upgrading every year.
I have this feeling that we're about to see releases every other year at most. We may quite possibly see a release per year. I do not want to see this happen. Larger agencies stay on older tech longer. I think that, coupled with the increased prices for SQL Server under core licensing, may in fact push people off of the Microsoft stack. They may not lose enough to hurt them badly, but they will lose some.
What other database platforms would you all consider? I personally hope we get back to seeing service packs and keep this stack strong. We have too strong of a community to see it split.
This is a quote from their SQL Saturday posting: "We are encouraging participation in our food drive to benefit the Westside Cares food pantry located at the facility where the SQL Saturday event will take place. Toward the end of the event, we plan to give people additional chances to win SWAG based on the amount of items each person brings for donation to the food pantry."
I really hope we can get enough people to bring food to make a large difference.
If you disagree, please let me know. I'd love to hear other opinions.
24.3.14
SQL Server Enterprise is Cheaper than Standard
Free Month of Pluralsight
I'm interested in seeing your responses. Any relevant response will be accepted and a free month of Pluralsight will be given away. You do not have to agree with me to be relevant. I want to hear what you think.
This is a bit long winded for me.
First we will address the minimum requirements. We are comparing core licensing only. You must purchase at least 4 cores. Enterprise Edition costs $6,874 per core. That's a starting price of $27,496. Standard Edition costs $1,793 per core. That's a starting price of $7,172.
Now I know what you're thinking, a $20,324 difference seems pretty open and shut against this. This is where I ask that you hear me out. It gets a bit tricky here. Let's start comparing the benefits of upgrading.
Limits | Standard | Enterprise |
RAM | 64 GB | Unlimited |
Indexing | Offline | Online |
Table Compression | No | Yes |
Fast Recovery | No | Yes |
Table Partitioning | No | Yes |
Resource Governor | No | Yes |
I know there are a lot more differences relating to BI, AS, RS, and many other aspects. Let's just get enough out there to prove the point. I just told you that a $20,000 cost was a savings, yes? How can we save $20,000 by spending it?
How much do you make a year?
How about your other DBA, or the JR you're about to hire? How many Developers do you have on staff? How many of them are overworked trying to keep your old Standard server running? Look at your databases closely. Let's answer some questions; we'll come back to this one last.
Do you have the maximum ram that your server can support in it?
That may be 192 GB, 384 GB, or any other number. Unless your server is older, it should support more than the 64 GB of RAM that Standard does. I know, I know, Windows Server Standard only supported 32 GB of RAM, but that changed in Server 2012; 2 TB is the RAM limit now. I know SQL 2014 allows 128 GB of RAM in Standard, but Enterprise still means more room to grow.
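If you do add RAM, remember to raise SQL Server's cap to match. A minimal sketch; the value is an example for a 128 GB box, leaving headroom for the OS:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Value is in MB; 122880 MB = 120 GB, leaving ~8 GB for Windows (example number).
EXEC sp_configure 'max server memory (MB)', 122880;
RECONFIGURE;
```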
Do you have processor cores just going idle most times?
Just because you have 12 cores doesn't mean you need to license 12. You can limit SQL Server to use only the cores you license. Only license what you need.
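One way to restrict the cores an instance uses (SQL 2008 R2 and later) is processor affinity. A sketch that pins the instance to the first four cores:

```sql
-- Restrict SQL Server's scheduler threads to CPUs 0 through 3.
ALTER SERVER CONFIGURATION SET PROCESS AFFINITY CPU = 0 TO 3;
```

Affinity alone doesn't settle what you owe; check the licensing rules for your scenario, since physical boxes are often licensed by the cores present, not the cores used.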
Do you have SLA's to meet that have been difficult due to maintenance windows?
Online index rebuilds let you rebuild indexes just about any time. You still take a brief lock at the start and end of an online operation, but that's far better than holding one for the whole process.
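An online rebuild, sketched with made-up index and table names:

```sql
-- Enterprise-only: rebuild while readers and writers keep using the index.
ALTER INDEX IX_Orders_OrderDate ON dbo.Orders
REBUILD WITH (ONLINE = ON);
```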
Do you have issues with archiving those massive tables?
Is their performance falling behind? Partitioning can help you swap parts of the table in and out while being minimally intrusive. You can even address fragmentation per partition instead of hitting that 10-billion-row table all at once. In 2014 you can even do that rebuild online!
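Both moves, sketched with hypothetical names (dbo.Sales partitioned by date, dbo.SalesArchive an empty table with a matching structure):

```sql
-- Defragment only the hot partition instead of the whole table.
ALTER INDEX IX_Sales_SaleDate ON dbo.Sales
REBUILD PARTITION = 3;        -- add WITH (ONLINE = ON) on SQL 2014+

-- Age out the oldest partition: a metadata-only switch, near instant.
ALTER TABLE dbo.Sales SWITCH PARTITION 1 TO dbo.SalesArchive;
```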
Do you have multiple databases on the same server fighting for resources?
Well now you can split them up logically instead of having that same conversation about splitting them up physically.
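That logical split is Resource Governor. A minimal sketch with assumed pool and group names; a classifier function (not shown) is still needed to route logins into the group:

```sql
-- Cap a noisy reporting workload at 30% CPU.
CREATE RESOURCE POOL ReportingPool WITH (MAX_CPU_PERCENT = 30);
CREATE WORKLOAD GROUP ReportingGroup USING ReportingPool;
ALTER RESOURCE GOVERNOR RECONFIGURE;
```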
Are you fighting for more space or even considering moving to an Enterprise SAN?
Page compression is a beautiful thing. It should pose no problems on archived tables. If CPU is not a bottleneck currently, you can expand the window of what you compress. Heavily used tables may not benefit as much... but here's where partitioning can work with this. Page compression saves quite a bit of space. This may be just enough of a space saver to let you request those SSDs you've been wanting.
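A sketch against a hypothetical archive table; the estimate procedure ships with SQL Server, so you can see the payoff before you commit:

```sql
-- How much would page compression save on this table?
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'SalesArchive',
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = 'PAGE';

-- Happy with the estimate? Compress it.
ALTER TABLE dbo.SalesArchive REBUILD WITH (DATA_COMPRESSION = PAGE);
```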
Let's add this all together.
Yes, there is a cost up front, but now you no longer need to hire a third full-time DBA or Developer. If you get that system on SSDs, with page compression keeping sizes small and partitioning keeping archived data on slow disks, response time is faster. We were previously running at the edge of our RAM at all times; now we have static data from a month ago sitting in cache, and the disks are only being accessed to retrieve changes. Our DBAs are getting more sleep now that their fragmentation jobs aren't blocking all night long. Our maintenance windows are getting shorter and our SLAs have more room to breathe.
This view will not fit all organizations. As always, the phrase "It depends" will fit in this scenario as well. Think carefully about all these issues and the time you've burned fixing them. You could be working on that next project to make your company even more money.
I mentioned Developers in this post; I'll explain why now. With compression, partitioning, more caching, and faster access to those tables, you can hide a lot of "quick" coding behind that much raw power. I'm not saying we should code poorly because we can. I'm saying that we can code how we need to because we can. Once it's up and working, you can then go back and fine-tune.
Enterprise Edition has a higher cost. Asking for more SSDs and more RAM has an additional cost. But having your talented, knowledgeable DBAs and Developers quit due to long hours, continually fighting uphill battles, and being denied tools or extra personnel will cost you a lot more in the end. Training your next DBA alone may make up this cost. Losing customers due to the inability to meet SLA requirements has a large cost, both monetarily and to your reputation.
Let's make the world happier, one Database shop at a time.
17.3.14
My Favorite Free Scripts
Ola's maintenance scripts cover your backups and index maintenance, verify the integrity of your databases, and log the results of all of this for you. You can set up one part or all of it. This covers all of your DBA 101 requirements to keep your servers running in a safe fashion. The scripts even give examples of what you might want to throw into them! It doesn't get much easier than this.
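Once the Maintenance Solution is installed, a typical nightly full-backup call looks something like this (parameter names from Ola's documentation; the directory is a placeholder):

```sql
EXECUTE dbo.DatabaseBackup
        @Databases   = 'USER_DATABASES',
        @Directory   = N'X:\Backup',   -- placeholder path
        @BackupType  = 'FULL',
        @Verify      = 'Y',            -- run RESTORE VERIFYONLY afterward
        @CleanupTime = 72;             -- delete backups older than 72 hours
```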
Adam Machanic wrote a great script called "Who is Active". He even has a 30-part blog series on the ins and outs of this procedure, found here. Just about anything you want to know can be found there. It's a fantastic script. I would familiarize yourself with it prior to relying on it for day-to-day operations.
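A typical call, using two parameters the procedure supports:

```sql
-- Current activity, with query plans and the sessions leading any blocking chains.
EXEC sp_WhoIsActive @get_plans = 1, @find_block_leaders = 1;
```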
Kendra Little has a great video and some sample scripts to run to view what indexes your server wants. As she will stress, do not just put them all into your database. Too much of a good thing can be really, really bad. This is where we look for all that beautiful low-hanging fruit.
On the same site as the Missing Indexes video by Kendra Little, there is a great script called sp_BlitzIndex, written by the same group. It's the step beyond just looking for missing indexes. Take the time to look through it when you have some time. This is not a place I'd look with a fire to put out... at first anyway. Learn about it in detail before assuming too much and jumping into it.
sp_Blitz was written by the Brent Ozar group. It helps you identify many pain points really quickly and even includes a link to what each one means and suggestions on how to handle it. Honestly, how nice is it that these exist for us?!
What Can We Do?
So here we have 5 fantastic scripts. What will these accomplish?
1) We start off with Ola's script and get our backups and index fragmentation under control.
2) We find out from our users if anything in particular is slow or below SLA requirements.
3) We run the SP_Blitz and see what shows up as a major issue.
4) We get a quick break down of what the Missing Indexes are suggesting.
5) We put all this together in a solid actionable list.
6) We present our findings and come up with a solution to work on.
Yes, I use the word "we" a lot. I do this on purpose. Correcting this many problems isn't a one-person operation. You want the other product heads involved. You need to know what this could break, or whether there's a problem they're currently trying to fix. If they need a report to respond in under a minute, spending a week trying to get it from 45 seconds to 5 seconds isn't where the focus should be unless everything else is working perfectly.
There Is More Out There!
These are just 5 of the many, many free scripts out there that have been provided just to make your job easier. I've said it before and I'll say it again: MS SQL Server has a fantastic community around it. Few come even close. These 5 scripts alone can help a DBA sustain a shop with minimal effort. Use them not as a crutch, but as a starting point to make everything even better.
If you all know any other great scripts that can be added to this collection, I would love to hear it!
3.3.14
Issues Installing SQL 2008 R2 on Server 2012 R2 Clustering
Oh so the nightmare begins.
At the bottom I've included a set of links where I found my answers. What I'm doing here is just giving a condensed list of what I had to do to get it to work. I ran into a few issues almost immediately.
I got an error showing that a 2003 patch was not installed.
Windows Server 2003 FILESTREAM Hotfix Check failed
This required a workaround: you need to slipstream-install SP1 or SP2. These steps are not mine; I used what I found here, written by Peter Saddow.
1. Copy your original SQL Server 2008 R2 source media to C:\SQLServer2008R2_SP1
2. Download the SQL Server 2008 R2 SP1 packages from here. You need to download all Service Pack 1 architecture packages:
•SQLServer2008R2SP1-KB2528583-IA64-ENU.exe
•SQLServer2008R2SP1-KB2528583-x64-ENU.exe
•SQLServer2008R2SP1-KB2528583-x86-ENU.exe
3. Extract each of the SQL Server 2008 R2 SP1 packages to C:\SQLServer2008R2_SP1\SP as follows:
•SQLServer2008R2SP1-KB2528583-IA64-ENU.exe /x:C:\SQLServer2008R2_SP1\SP
•SQLServer2008R2SP1-KB2528583-x64-ENU.exe /x:C:\SQLServer2008R2_SP1\SP
•SQLServer2008R2SP1-KB2528583-x86-ENU.exe /x:C:\SQLServer2008R2_SP1\SP
Ensure you complete this step for all architectures to ensure the original media is updated correctly.
4. Copy Setup.exe from the SP extracted location to the original source media location. Here is the robocopy command:
•robocopy C:\SQLServer2008R2_SP1\SP C:\SQLServer2008R2_SP1 Setup.exe
6. Copy all files (not the folders), except Microsoft.SQL.Chainer.PackageData.dll, from C:\SQLServer2008R2_SP1\SP\ to C:\SQLServer2008R2_SP1\ to update the original files. Here are the robocopy commands:
•robocopy C:\SQLServer2008R2_SP1\SP\x86 C:\SQLServer2008R2_SP1\x86 /XF Microsoft.SQL.Chainer.PackageData.dll
•robocopy C:\SQLServer2008R2_SP1\SP\x64 C:\SQLServer2008R2_SP1\x64 /XF Microsoft.SQL.Chainer.PackageData.dll
•robocopy C:\SQLServer2008R2_SP1\SP\ia64 C:\SQLServer2008R2_SP1\ia64 /XF Microsoft.SQL.Chainer.PackageData.dll
7. Determine if you have a DefaultSetup.INI at the following locations:
•C:\SQLServer2008R2_SP1\x86
•C:\SQLServer2008R2_SP1\x64
•C:\SQLServer2008R2_SP1\ia64
If you have a DefaultSetup.INI at the above locations, add the following lines to each DefaultSetup.INI:
PCUSOURCE=".\SP"
If you do NOT have a DefaultSetup.INI, create one with the following content:
;SQLSERVER2008 R2 Configuration File
[SQLSERVER2008]
PCUSOURCE=".\SP"
and copy to the following locations
•C:\SQLServer2008R2_SP1\x86
•C:\SQLServer2008R2_SP1\x64
•C:\SQLServer2008R2_SP1\ia64
This file will tell the setup program where to locate the SP source media that you previously extracted.
8. Run setup.exe as you normally would.
To recap, this post is just meant to give a good central location for how to get SQL 2008 R2 installed on Server 2012 R2. The original post is here.
There was a second error showing that the cluster service verification failed. You can fix this via the GUI, but it is much easier to open a PowerShell window and just copy and paste: "Add-WindowsFeature RSAT-Clustering-AutomationServer". This was written by Emilson Barbosa Bispo on this page.
All in all it was a good learning experience. I know that it’s normally best to upgrade your OS when you upgrade SQL…. But those licenses are expensive. This goes double if it’s an enterprise license and you have a few servers that don’t need to be up to date just yet.
I really hope this is helpful for someone. I know finding these saved me a lot of time. What's the worst thing you've had to work on? I'd love to hear other stories of interesting fixes. If you run into any other errors while doing this, let me know. I may have come across it and have an easy answer for you.
Good starting post.
http://social.technet.microsoft.com/Forums/windowsserver/en-US/9cb1308d-edd0-44e5-a10e-dfb9f10a1ef3/sql-server-2008-r2-failed-on-windows-server-2012
How to Slip Stream
25.2.14
Free Month To PluralSight!
Free Month of PluralSight!
They have said that they only have a limited supply. I submitted my own results and received the code within about 2 minutes of sending the e-mail. I personally got a subscription to Pluralsight during Black Friday, when they had a rather large discount for a year. I have one single code that I do not need, and I'd like to hear from someone learning who doesn't have a server they can use to take hold of this great offer SQLskills is hosting.
I'd like to donate my key to you.
I'll draw the name out of a hat before the end of the day. I'm looking for people who need a key and don't have a server to use.
I think this is great and would like to commend all of our SQL bloggers who give away training for free. They've gone a step further and given away access to training that includes other people's training as well.
24.2.14
SQL Saturday OKC 309!
SQL Saturday OKC is coming up!
The event will be August 23rd, 2014!
Now is the time to get signed up. We already have three great precons ready.
BECOME AN ENTERPRISE DBA WITH SEAN AND JEN MCCOWN
REAL WORLD SSIS: A SURVIVAL GUIDE WITH TIM MITCHELL
PRACTICAL SELF-SERVICE BI WITH POWERPIVOT FOR EXCEL WITH WILLIAM E. PEARSON III
All three of these sessions are being given by well-respected authors. They are all on an early bird special until July 15th: the normal price is $120, but currently they're $99. We are excited to host this great event again. If you're not familiar with the SQL Saturday concept, SQLCenturion wrote a rather extensive blog post on his experiences running them here. I have my own take on them from an attendee perspective as well.
I will be submitting to OKC and I hope many other speakers do as well. Given the list we've had in the last few years, I think we'll have a fantastic event once again. I really hope to see you all there.
If you plan on coming out and would like any local information such as where to find a good restaurant, where is the good theater, or where can I play some putt-putt, feel free to drop a line. I will be at the event and the after party for it. I'd love to meet the people willing to endure this blog. ^.^
While we're on the subject of free training, don't forget to talk your boss into a free lunch or to work on the other free training sites available to you.
Never stop learning or growing.
17.2.14
First Presentation Ever And My 50th Post
Last Monday was the first time I've ever presented anything. Honestly, it was the first time really speaking in public in front of a large group. I learned a few things. This may seem obvious to anyone who did this even in high school... but I didn't really do that.
Brent Ozar has a good blog on how to start a blog right here. I decided after reading multiple blogs on SQLServerCentral and other sites that I wanted to start one. I started watching Sean and Jen McCown's Midnight web show, Friday at 11pm central, and going to multiple SQLSaturdays. I decided I wanted to present for the first time.
I did a presentation on fill factor. I chose this topic because I didn't see much around on it and I really wanted to explore what all it meant. I've come to understand a lot more while building the presentation. I'm going to share a few things that helped and didn't along the way.
I started off recording my presentation and giving it to just myself; then I made my poor wife listen to it a few times. It is surprisingly helpful to have someone with no knowledge of the subject sit there. They ask the questions you never expect. That added two slides, a bit better explanation, and a reminder that I repeat myself way too often. I noticed that I talk a little too fast and I do not transition well. I'm sure that will be easier when the nerves aren't acting up. I made an emergency run to Best Buy two days before the presentation because I didn't own a webcam. I plan on recording my presentation a few more times until I become a lot more comfortable with it before I take it into a SQL Saturday style setting.
I brought along a laser pointer presenter device. I was using SQLCenturion's. It does the job rather well; the timer alone makes it worth it, honestly. I could see that I was 20 minutes ahead of schedule. I also had to borrow my poor daughter's laptop. I have a lovely Alienware laptop but couldn't use it because I didn't own a DisplayPort-to-anything adapter. It has HDMI out, and that doesn't register with HDMI-to-VGA adapters. I have since purchased a converter, and it honestly works rather well.
I found out 5 minutes before my presentation that I had planned it on a 4th-gen i7 with 16 GB of RAM and a 1 TB SSD, and was going to go live on a 2nd-gen i3 with 4 GB of RAM and a copy of Windows 8 that I finished installing on the way to the event. I was not prepared nearly well enough. I am very happy I had a backup at all.
Now this all sounds like a bad experience... but it was quite the opposite. I had a fantastic time. I received great feedback that I can actually work on, and ideas on how to do so. I felt like I was finally giving back to a community that I've used to train myself for so long. I would do it again in a heartbeat.
I'm way off to the left outside this picture... I'm fine with that. This is the room that Dell has given us to use once a month for free. GDH brings us the pizza, and all of these people came to listen to my first-time presentation with snow in the forecast. I think it was amazing. First time presenting, on a subject that most don't really use on a daily basis, and we had a rather good turnout.
If anyone else in the OKC area wants to come to any of these meetings, they're free. Check us out at http://www.okcsql.org/. We would be excited to see you all there.
Below are the links to the three things that I referred to above. There may be cheaper and better versions of these... If you know of them, please let me know. I'm still building my travel bag so I can be better prepared. So far it includes:
1) Logitech HD Pro Webcam C920
2) Logitech Professional Presenter
3) USB 3.0 to HDMI and DVI Dual Screen Adapter
4) A spare laptop cord
5) A tablet with internet access (through my phone works for now)
6) My presentation setup on Amazon's web service... Just in case
7) A spare mouse
and the more obvious
8) My laptop
9) (days of travel + 10% rounded up) changes of clothes.
10) Spare shoes
11) Enough cash to use a taxi
This is all I have so far. I have been bitten by the presenting bug and hope to do this a lot more.
Brent Ozar has a good blog on how to start a blog right here. I decided after reading multiple blogs on SQLServerCentral and other sites that I wanted to start one. I started watching Sean and Jen McCown's Midnight web show, Friday at 11pm central, and going to multiple SQLSaturdays. I decided I wanted to present for the first time.
I did a presentation on fill factor. I chose this topic because I didn't see much around on it and I really wanted to explore what all it meant. I've come to understand a lot more while building the presentation. I'm going to share a few things that helped and didn't along the way.
I started off recording my presentation and giving it to just myself, then I made my poor wife listen to it a few times. It is surprisingly helpful to have someone with no knowledge on the subject sit there. They ask the questions you never expect. That added two slides a bit better explanation and a reminder that I repeat my self way too often. I noticed that I talk a little to fast and I do not transition well. I'm sure that will be easier when the nerves aren't acting up. I made an emergency run to best buy two days before the presentation because I didn't own a web cam so I made an emergency run to best buy. I plan on going back and recording my presentation a few more times until I become a lot more comfortable with it before I try to take this into a SQL Saturday style setting.
I brought along laser pointer presenter device. I was using SQLCenturions. It does the job rather well. The timer alone makes it worth it honestly. I could see that I was 20 minutes ahead of schedule. I also had to borrow my poor daughters laptop. I have a lovely alienware laptop and couldn't use it due to not owning a Display port to anything adapter. It has HDMI out and that doesn't register for the HDMI to VGA adapters. I have recently purchased a converter. It honestly works rather well.
I found out 5 minutes before my presentation that I had planned it on a 4th gen i7 with 16 GB of ram and a 1TB SSD and was going to go live with a 2nd gen i3, 4 gigs of ram and windows 8 that I finished installing on the way to the event. I was not prepared near well enough. I am very happy I had a backup at all.
Now this all sounds bad and like a bad experience... but it was quite the opposite. I had a fantastic time. I received great feed back that I can actually work on and ideas on how to do so. I felt like I was finally giving back to a community that for so long I've used to train. I would do it again in a heart beat.
I'm way to the left out side of this picture... I'm fine with that. This is the Dell room that Dell has given us to use once a month for free. GDH brings us the pizza and all of these people came to listen to my first time presentation with snow in the forecast. I think it was amazing. First time presenting, on a subject that most don't really use on a daily basis, and we had a rather good turn out.
If anyone else in the OKC area want's to come to any of these meetings, they're free. Check us out at http://www.okcsql.org/.We would be excited to see you all there.
Below are the links to the three things that I referred to above. There may be cheaper and better versions of these... If you know of them, please let me know. I'm still building my travel bag so I can be better prepared. So far it includes:
1) Logitech HD Pro Webcam C920
2) Logitech Professional Presenter
3) USB 3.0 to HDMI and DVI Dual Screen Adapter
4) A spare laptop cord
5) A tablet with internet access (through my phone works for now)
6) My presentation setup on Amazon's web service... Just in case
7) A spare mouse
and the more obvious
8) My laptop
9) (days of travel + 10% rounded up) changes of clothes.
10) Spare shoes
11) Enough cash to use a taxi
This is all I have so far. I have been bitten by the presenting bug and hope to do this a lot more.
10.2.14
Do Not Set Maximum Ram To 0 Ever - Mistakes We Make
I'm going down memory lane here, remembering one of the first things I did as a DBA. I was told by one of our other DBAs that if you have 40 GB of RAM for a database server and it's not working very well, the fix is to modify the max RAM from 40,960 MB to 1,024 MB and then back to 40,960 MB after it has cleared everything out. I understand now that this isn't the best way to handle it: queries may fail, everything that was cached will have to be cached again, and you invite a slew of other performance issues and possibly failed reports.
Here comes the bad part. I had the bright idea at the time that if setting it to 1 GB fixed the issue... why not set it to zero? Well, SQL doesn't accept zero with any grace. It modifies the setting to 16 MB of RAM, and SQL just will not run with that little RAM. SSMS wouldn't load, the website went down, nothing was working. We had to stop all services, start SQL in single-user mode with the minimal-configuration switch, and log in through SQLCMD.
Now we had downtime in the middle of production hours. I will say there are better ways to fix this, but the way we fixed it was restoring master from a backup taken earlier that day that still had the correct setting. I wasn't really familiar with working with SQL from a DOS or PowerShell window at the time. The outage obviously didn't help my reputation any.
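Today I'd fix this in place instead of restoring master. This is just a sketch of the approach, assuming you start the instance with the -f (minimal configuration) startup flag and connect with sqlcmd; the 40,960 value is simply the original setting from the story above.

```sql
-- Connect with: sqlcmd -S ServerName -E  (after starting the instance with -f)

-- Make the advanced options visible so max server memory can be changed.
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Put max server memory (in MB) back to its intended value.
EXEC sys.sp_configure 'max server memory (MB)', 40960;
RECONFIGURE;
```

After that, restart the service normally and the instance should come up with its memory setting intact, with no master restore needed.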
We All Make Mistakes
I now know what not to do. I know to verify things I do not completely understand and cannot logically pick apart. I know how to research more efficiently and I have a better Disaster Recovery plan. We all make mistakes, really we do. No one is perfect. The thing is though, what are we doing to get better?
This is my request to all of you. Post a story about a mistake you've made and what you did to overcome it. Tell me what you've learned and how you plan on preventing it in the future. I think we could all simply learn from our mistakes or we can help others by letting them learn from ours.
I'm sure someone out there has a good story about deleting a table because they forgot the WHERE clause and didn't wrap the statement in a transaction.
3.2.14
What Tables Are In My Filegroups
One issue I've found a bit troublesome is trying to find out what's in a specific filegroup. Let's say you're trying to clear off a LUN or just a drive, and you see a file labeled "iudexes7.ndf". Unless you built this and have a steel trap for a memory... or are just fantastic at documentation... you probably have no clue what's in this file. If you target the offending database, you can either run the query below as-is to gather this data, or add the WHERE clause and target just that file.
Below is a good script for exploring and cleanup.
select sch.name as SchemaName, tbl.name as TableName, idx.name as IndexName, ds.name as Filegroup,
    p.data_compression_desc, au.total_pages, au.total_pages * 8 / 1024 as SizeInMB,
    tbl.max_column_id_used, idx.fill_factor
from sys.partitions p
    inner join sys.allocation_units au on au.container_id = p.hobt_id
    inner join sys.filegroups fg on fg.data_space_id = au.data_space_id
    inner join sys.tables tbl on tbl.object_id = p.object_id
    inner join sys.indexes idx on idx.object_id = p.object_id
        and idx.index_id = p.index_id
    inner join sys.schemas sch on sch.schema_id = tbl.schema_id
    inner join sys.data_spaces ds on ds.data_space_id = au.data_space_id
--where ds.name = 'primary'
order by ds.name, idx.name
This is a good way to move files off a specific drive, clean up wasted space or even just help with some space issues related to a specific file. Happy hunting!
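Once you know which tables live in the file, the usual way to empty a filegroup is to rebuild each table's clustered index onto a different filegroup. This is a sketch with hypothetical names (dbo.Orders, PK_Orders, OrderID are stand-ins, not from the query above):

```sql
-- Rebuilding the clustered index ON a different filegroup moves the
-- table's data out of the old one; DROP_EXISTING avoids a drop/recreate.
CREATE UNIQUE CLUSTERED INDEX PK_Orders
ON dbo.Orders (OrderID)
WITH (DROP_EXISTING = ON)
ON [PRIMARY];
```

Nonclustered indexes in the file need the same treatment, each rebuilt onto the target filegroup. Once the file is empty, it can be removed with ALTER DATABASE ... REMOVE FILE.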
27.1.14
Forcing Results To Conform For Exports
This one is a bit off for me. I've had to make SSIS exports kick out in specific ways, such as how a number is formatted, or forcing a column to return the same number of characters every time regardless of how long the actual value was. Here are a few things I've put to use that seemed to work out well in this situation. If you all have any others that are more common or a better fit, I'm always interested in adding something new to my list of tools.
Two of these were created by Igor Nikiforov. They are included at the bottom of this page and need to be added before this script will work. If you take nothing else from this post, please visit Igor's page and look at a few of his user-defined functions. They are very useful if your background isn't strong in coding.
The original query is a simple select from AdventureWorks.
select
addressid, addressline1, addressline2, city, StateProvinceID, postalcode, modifieddate
from AdventureWorks2012.Person.Address
These are a few of the conversions we used to get the outputs to fit as we needed to match an older method.
select
dbo.padl(addressid,10,'0') as AddressID
,left(addressline1 + space (40),40) as AddressLine1
,Case when addressline2 is null then '' else addressline2 end as AddressLine2
,isnull(city,'No City Listed') as City
,dbo.padl(StateProvinceID,3,'0') as StateProvinceID
,dbo.padr(convert(char(15),postalcode), 15, ' ') as ZipCode
,convert(varchar,ModifiedDate,110) as Date
,convert(varchar,ModifiedDate,108) as Time
from AdventureWorks2012.Person.Address
order by convert(varchar,ModifiedDate,112) desc, convert(varchar,ModifiedDate,108) desc
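If you'd rather not install the UDFs, the same left/right padding can be done with built-in functions. A sketch against the same AdventureWorks columns:

```sql
-- right(replicate(...) + value, n) pads on the left, like dbo.PADL;
-- left(value + space(n), n) pads on the right, like dbo.PADR.
select right(replicate('0', 10) + convert(varchar(10), AddressID), 10) as AddressID,
       left(AddressLine1 + space(40), 40) as AddressLine1
from AdventureWorks2012.Person.Address;
```

The UDFs are still handy when you pad with multi-character strings or want the Oracle-style LPAD/RPAD behavior in one call.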
/****** Object: UserDefinedFunction [dbo].[PADR] Script Date: 01/26/2014 23:30:33 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- Author: Igor Nikiforov, Montreal, EMail: udfs@sympatico.ca
-- PADL(), PADR(), PADC() User-Defined Functions
-- Returns a string from an expression, padded with spaces or characters to a specified length on the left or right sides, or both.
-- PADR similar to the Oracle function PL/SQL RPAD
Create function [dbo].[PADR] (@cString nvarchar(4000), @nLen smallint, @cPadCharacter nvarchar(4000) = ' ' )
returns nvarchar(4000)
as
begin
declare @length smallint, @lengthPadCharacter smallint
select @length = datalength(@cString)/(case SQL_VARIANT_PROPERTY(@cString,'BaseType') when 'nvarchar' then 2 else 1 end) -- for unicode
select @lengthPadCharacter = datalength(@cPadCharacter)/(case SQL_VARIANT_PROPERTY(@cPadCharacter,'BaseType') when 'nvarchar' then 2 else 1 end) -- for unicode
if @length >= @nLen
set @cString = left(@cString, @nLen)
else
begin
declare @nRightLen smallint
set @nRightLen = @nLen - @length -- Quantity of characters, added on the right
set @cString = @cString + left(replicate(@cPadCharacter, ceiling(@nRightLen/@lengthPadCharacter) + 2), @nRightLen)
end
return (@cString)
end
/****** Object: UserDefinedFunction [dbo].[PADL] Script Date: 01/26/2014 23:30:21 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- Author: Igor Nikiforov, Montreal, EMail: udfs@sympatico.ca
-- PADL(), PADR(), PADC() User-Defined Functions
-- Returns a string from an expression, padded with spaces or characters to a specified length on the left or right sides, or both.
-- PADL similar to the Oracle function PL/SQL LPAD
Create function [dbo].[PADL] (@cString nvarchar(4000), @nLen smallint, @cPadCharacter nvarchar(4000) = ' ' )
returns nvarchar(4000)
as
begin
declare @length smallint, @lengthPadCharacter smallint
select @length = datalength(@cString)/(case SQL_VARIANT_PROPERTY(@cString,'BaseType') when 'nvarchar' then 2 else 1 end) -- for unicode
select @lengthPadCharacter = datalength(@cPadCharacter)/(case SQL_VARIANT_PROPERTY(@cPadCharacter,'BaseType') when 'nvarchar' then 2 else 1 end) -- for unicode
if @length >= @nLen
set @cString = left(@cString, @nLen)
else
begin
declare @nLeftLen smallint, @nRightLen smallint
set @nLeftLen = @nLen - @length -- Quantity of characters, added at the left
set @cString = left(replicate(@cPadCharacter, ceiling(@nLeftLen/@lengthPadCharacter) + 2), @nLeftLen)+ @cString
end
return (@cString)
end
22.1.14
PowerShell Remote Commands
This is something I've found useful recently when I had a server acting up. It's a simple way to send commands to a remote server, such as 'shutdown -r'.
Enable-PSRemoting -Force
This enables you to actually remote in through PowerShell.
Test-WSMan Server1
This just tests that you can reach the server through this method.
Invoke-Command -ComputerName Server1 -ScriptBlock { Get-ChildItem C:\ } -credential Domain\UsrID
If you're only sending a single command, you'd replace Get-ChildItem C:\ with whatever you're wanting to run.
Enter-PSSession -ComputerName Server1 -Credential Domain\UsrID
This command lets me interact on a longer-term basis. Basically, if I'm sending more than a single command, this works better.
I'm sure there are other methods. I like short concise code. How do you all connect?
20.1.14
Who Do You Rely On?
We tend to think of ourselves as self-reliant. Sit and think for a second: how many people do you rely on?
Help Desk
Our angry callers hit them first. They let us know if something's down and our monitoring doesn't see it. The more information we can give them, the easier they can make our lives. If you know you have a planned outage, give them a heads up. They will appreciate it immensely.
Developers
These are the people who build the applications your business is making its money from. They connect customers with the data you're protecting. We need to get along with them. If we put ourselves on an even level and keep a good balance, we can get so much more done. Protect the data, but not at the cost of the business.
Network Admin
When you try to hit a server and can't because the routes were updated again, instead of just being mad at the Network Admin, consider this: all those times you had no problems? It's because they had it working. They're in the same boat as us. No one notices how good a job we're doing until it all catches fire.
System Admin
I haven't built a production system from the ground up in years. I love my Sys Admins. They get tasked to set up 50 new SQL boxes, 100 new IIS servers, and half a dozen DCs for various projects. Once they do all the heavy lifting, we can use a single PS script to install SQL for us across all the new servers and just go to lunch. Give them credit for working on all the things they do so we can concentrate on what we actually want to.
Management
Good managers go to bat for you. They are the ones that argue for new budgets to get more tools, people, hardware and even that soda fountain in the break room. They're the ones trying to make sure no one calls you while you're on vacation or tries to get the SQL Saturday hotel room paid for when you sneak off on the weekends. The more you work for them, the more they'll work for you. (Bad managers excluded, terms and conditions apply, not available at all jobs, see supervisor for details)
Mend The Fence!
Take the time now. Think of some way you can make it easier to work with them and get the projects completed that need to be done. The easier we make it to work together, the better we can make our companies. I hate seeing posts of an SA or NetAdmin trying to lock down the DBA just as much as I hate seeing the DBA lock the Developers out of even staging. There are times when it's required, but other times it simply isn't. I'm not saying give the DBAs Domain Admin or the password to the proxy servers, or even give the Developers SA on production... but sometimes a little read-only access helps a lot when isolating a problem at 2 AM on Christmas morning.
Final Thought
I'll end with this. Where I work now, our Network Admins are great. They go the extra mile to take care of us. Our first line support staff goes out of their way to get all the information they can. Our SA does what he can to make our lives easier and our Developers talk with us when they have a question. This may sound like everyone doing their job, but let's be honest. This is the exception. This helps raise retention. Get your people involved, make them feel equal and make sure to take a day and let them enjoy it. Maybe a little Dave and Busters with a long lunch. You'd be surprised what wonders that'll do for your staff.
15.1.14
Restore Master From Backup
There are a few posts running around talking about restoring master from backup. Thomas LaRock has a fantastic "How To: Restore The Master Database In SQL Server 2012" post. It's very well organized and has a lot of great examples and directions. SQLskills also has a survey up about restoring versus rebuilding master. I'm really curious what their post will entail.
I'm going to throw something in the hat here. It's not as detailed as what you'd see from SQLRockstar or SQLSkills. This is a document I put together about a year ago for my group so that when we had to restore from master, someone had a step by step on how to do so.
The word doc is located here.
This is not a training item or a deep dive into how the rebuilds are done and how you can map the stars with them. This is just simply a doc that you throw in your Disaster Recovery Plan folder and let it collect dust until something major happens. If your people know where you keep this... hopefully the instructions are simple enough that they can follow it. You may modify it some to fit your org better.
This is simply a run doc that sits and collects dust until a fire brews. Have your people do the training; have them read through SQLRockstar's post. It's great. Keep this document on the file share so that if they forget, you won't need to search through websites trying to remember something you've done once or twice in a career.
13.1.14
What Tables Are Wasted?
Wasted space still seems to be on everyone's mind to some extent. I've wondered: what tables do we actually use? We have our offenders; those are easy to track. But what about that little database that's been around for 4 years, where no one left at the company even really knows what it does? I have a set of scripts that may help some with that. I'll include a download link as well.
It's noted in the script comments, but I'll mention it here as well: this list is only accurate back to the last restart of services. Do not base your delete decisions solely on this. Track it over time, review the object before removing it, and only remove it after taking a good backup so you can restore it if needed. Take caution any time you remove parts of your database.
You can download the full script Here. Any comments or suggestions are appreciated. Thanks again!
/*
Author Bill Barnes
Created 01/10/2014
Use: The purpose of this script is to show indexes and tables that have not been used
since the last restart of SQL services. This script can be quickly modified to show what
tables have been used and provide more useful data. This will ignore all system schema
based tables.
Note: an update or scan is counted per operation, not per row affected. If you update a row once,
you get one addition to the update count. If you update a table and change 50,000 records, that is
still only 1 update. Keep that in mind when reading the numbers provided.
*/
--This Version will only pull a list of tables that have shown no use.
select sch.name as SchemaName, obj.name as TableName, idx.name as IndexName, obj.object_id,
usage.user_lookups,usage.user_scans, usage.user_seeks, usage.user_updates, usage.system_lookups,
usage.system_scans, usage.system_seeks, usage.system_updates, usage.last_user_lookup, usage.last_user_scan,
usage.last_user_update, usage.last_system_scan, usage.last_system_seek, usage.last_system_update
from sys.indexes idx
full outer join sys.dm_db_index_usage_stats as usage on idx.object_id = usage.object_id
and idx.index_id = usage.index_id
inner join sys.objects as obj on idx.object_id = obj.object_id
inner join sys.schemas as sch on sch.schema_id = obj.schema_id
where usage.database_id is null
and sch.schema_id <> 4
and obj.object_id is not null
order by obj.name
-- This version provides a list of all tables that are in use.
select sch.name as SchemaName, obj.name as TableName, idx.name as IndexName, obj.object_id,
usage.user_lookups,usage.user_scans, usage.user_seeks, usage.user_updates, usage.system_lookups,
usage.system_scans, usage.system_seeks, usage.system_updates, usage.last_user_lookup, usage.last_user_scan,
usage.last_user_update, usage.last_system_scan, usage.last_system_seek, usage.last_system_update
from sys.indexes idx
full outer join sys.dm_db_index_usage_stats as usage on idx.object_id = usage.object_id
and idx.index_id = usage.index_id
inner join sys.objects as obj on idx.object_id = obj.object_id
inner join sys.schemas as sch on sch.schema_id = obj.schema_id
where usage.database_id is not null
and sch.schema_id <> 4
and obj.object_id is not null
order by obj.name
--This version shows a sum of all activity on these tables.
select sch.name as SchemaName, obj.name as TableName, idx.name as IndexName, obj.object_id,
sum(usage.user_lookups) UserLookups,sum(usage.user_scans) UserScans, sum(usage.user_seeks) UserSeeks,
sum(usage.user_updates) UserUpdates, sum(usage.system_lookups) SystemLookups,
sum (usage.system_scans) SystemScans, sum(usage.system_seeks) SystemSeeks, sum(usage.system_updates) SystemUpdates
from sys.indexes idx
full outer join sys.dm_db_index_usage_stats as usage on idx.object_id = usage.object_id
and idx.index_id = usage.index_id
inner join sys.objects as obj on idx.object_id = obj.object_id
inner join sys.schemas as sch on sch.schema_id = obj.schema_id
where usage.database_id is not null
and sch.schema_id <> 4
and obj.object_id is not null
group by sch.name, obj.name, idx.name, obj.object_id
order by obj.name
9.1.14
New Baby Girl!
On the 7th of January, 2014 my baby girl Kaitlyn was born. She came in about a month early.
This is my mother and my wife using face time so that my sick grandmother can see the new grand baby. It wasn't an intentional 4 generation picture.
Sleep exists... but only for the one who cries the loudest. I swear she can cry and snore at the same time. She's positively adorable. She's got two older sisters who seem both interested in and afraid of the "crying burrito".
A lot of my friends have said that you cannot be prepared for a newborn, you can only try. I was also told to sleep any chance I got just in case. So far though... she's fussy a little on eating, but she's sleeping well and not overly crying.
This year is starting off rather well so far. She's healthy and happy. The wife is doing well now. We didn't get attacked by snowmageddon. Lots of good things so far. I'll be back with more posts once things calm down. For now though, I'm going to take a short break. If she's realllly good, I'll be back next week. ^.^
6.1.14
Posts I've Found Interesting:
This is a question I had until I came across a great script by The Scripting Guys. The script below is unaltered and entirely their creation. I just found it useful enough to forward it along. This is a short post. Prepping for a baby on the way has cut into my sleep. ^.^
select 'table_name' = object_name(i.id),
    i.indid,
    'index_name' = i.name,
    i.groupid,
    'filegroup' = f.name,
    'file_name' = d.physical_name,
    'dataspace' = s.name
from sys.sysindexes i, sys.filegroups f, sys.database_files d, sys.data_spaces s
where objectproperty(i.id, 'IsUserTable') = 1
    and f.data_space_id = i.groupid
    and f.data_space_id = d.data_space_id
    and f.data_space_id = s.data_space_id
order by f.name, object_name(i.id), groupid
go
SQLSoldier wrote a post recently on how DMVs cost him his job, by helping him find a better one. I found it rather interesting and thought I should forward it along. A similar situation happened when the company I was with showed no interest in providing better training, adjusting pay, and was overall just difficult to get a hold of. I was a contractor working as a DBA. A good friend of mine had an opening where he worked, and I ran for happier times. Sometimes the grass really is greener.
There's an older post (2007) about joins. Let me say first, I'm not a coder. I've worked with some Java, C++, VB, and obviously T-SQL. I always prefer a visual I can work with or semi-working code to build off of. Coding Horror had a post explaining SQL joins visually. It's worth looking into.
1.1.14
Final Giveaway And Some Stats
Today I'm honored to give the final prize in a month of free giveaways here, the book; Professional SQL Server 2012 Internals and Troubleshooting to Robert! I have been lucky enough to give away 4 months of PluralSight to Komal, Annapu Reddy Gayathri, Aadhar Joshi, and Daniel.
Things I've Learned Blogging
This is my 40th post so far. I've noticed that most of my audience is coming from India at almost a 2 to 1 ratio over America. I think that's amazing. I honestly thought it would take months or even a year for this to be read out of the states.
Most of my readers come on around 0600 Central time and again around 1400 Central time. I'm not sure if it's just that time of day or what. I normally schedule my post to show up at 0300 or 0500. I only post after that on days I'm announcing something such as a winner.
I don't think I can maintain a post a day and have anything worth reading. I plan to do a Monday/Wednesday release with an optional Friday. I'm the sole poster on this site and that takes a toll. I know the wife is tired of hearing, "I plan on working some tonight". I'd like to get a second person who's interested in posting on here sooner or later. Maybe even guest posts. I wouldn't mind hosting a few posts for someone who wants to tell the world something interesting but doesn't want to start a blog.
Commenters and people in general are not that vicious. I was honestly nervous when this blog started getting posted to SQL Server Central. I was expecting some negative feedback, such as my information being wrong, too low-level, basic stuff, or even just not needed. Instead I've gotten a rather good response and helpful advice. Anyone who's considering writing... go for it.
Thank you all for your support. Next week I'll get back to more technical posts. I really appreciate the people who have come here, who have decided to start following the site and those that commented. Even if it was just to win something. ^.^