Using PowerShell to Generate a Transaction Log Restore Script

This is a really basic PowerShell post—those of you who do lots of PowerShell automation can skip it, and I know I could have taken this three steps further and automated the whole process. Thanks, Mike Fal (b|t)! (Seriously, if you want to read some great SQL PowerShell tips, hit Mike's blog.) Anyway, I'm building an Availability Group for a client with a very large database. I had taken a full backup and was going to take a differential so I didn't have to apply a full day's worth of transaction log backups to the database. Well, it turned out the workload consisted largely of bulk insert operations, which are minimally logged—that meant my differential backup was huge, but my transaction log backups were quite manageable. Except I had about 100 of them, and I didn't want to hand-enter all of the file names.

One of my favorite techniques for automation is to use the concatenation operator in T-SQL (+) to dynamically create SQL statements (note: if you're reading this and you're a database developer, don't do this—write stored procedures. It's a valid technique for running one-off queries, but it doesn't scale well). We can do the same thing in PowerShell. But first things first—let's figure out how to list the files.



    $backupPath = "D:\Backups\"    # the path here is just a placeholder for this example
    Get-ChildItem $backupPath

What we are doing with those two commands is defining a variable ($backupPath) and then doing a file listing on it. That listing isn't useful to us yet—it has too much information.

What we want to do is parse the name out of that data and concatenate a restore statement.


Get-ChildItem $backupPath | Sort-Object -Property LastWriteTime | Select-Object {"restore log dataBaseName from disk='" + $backupPath + $_.Name + "' with norecovery"}

We are doing a few things here—we're doing our original listing of files, but we are now piping it (the |) to the next command, and from there we are taking advantage of PowerShell's object model. You'll notice the curly brackets after the select—those specify a script block, which you need in order to concatenate text with the piped-in values. Unlike in dynamic SQL, there is no need to escape the single quotes that our T-SQL command will ultimately need; a string in double quotes in PowerShell is treated just as a string and not evaluated, so that single quote prints as a single quote. You'll also note that we are selecting $_.Name—Name is one of the properties on the objects that the Get-ChildItem cmdlet returns, and we need the $_. in front of "Name" because we are inside a script block. For example, if you executed the following code:


Get-ChildItem $backupPath | Select-Object Name


You will see this:


So what do we get when we run our statement?

I only had one backup here—on my client's system I had over 200. I was able to just copy the code, paste it into Management Studio, and run the restores. You will note that this is the bare minimum of what you can do with this—you could use the file timestamps to restore to a specific point in time, and you could definitely automate the restores themselves with the SQL Server PowerShell cmdlets. I was keeping it simple and fast, as this was a simple scenario. One thing I probably would do that I didn't is pipe the output to a .sql file rather than copying it from the console. Happy Thanksgiving and Happy Coding!
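If you want that last step spelled out, here is a minimal sketch of writing the generated commands to a .sql file with Out-File. The path, output file name, and database name are placeholders for illustration, not anything from my client's environment.

# Sketch: write one restore statement per log backup, oldest first, to a .sql file
$backupPath = "D:\Backups\"                     # placeholder path
$scriptFile = "D:\Backups\restore_logs.sql"     # placeholder output file

Get-ChildItem $backupPath |
    Sort-Object -Property LastWriteTime |
    ForEach-Object { "restore log dataBaseName from disk='" + $backupPath + $_.Name + "' with norecovery" } |
    Out-File -FilePath $scriptFile -Encoding ascii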



A Long Day’s Journey to a Saints Game or #joeystravelstories


When I noticed the PASS Summit and MVP Summit were on back-to-back weekends, that there was a New Orleans Saints home game on All Saints' Day (1 November), and that Alaska Airlines had introduced a new nonstop flight to Seattle that would let me watch that Saints game and make it back in time for MVP Summit, I jumped at the chance. (Apologies for the absurdly long sentence—it's late.) So after a fantastic PASS Summit I went to Seattle's airport early Saturday morning. For reasons I don't remember now, I had booked myself on an American Airlines flight through Dallas (I do have status with them) rather than the inbound nonstop Alaska flight.


The Saints

In case you were wondering about the significance of All Saints' Day in relation to my favorite American football team, the New Orleans Saints: the franchise was granted to the city 49 years ago today, on November 1, 1966. I've been to Saints games all over the United States, and even one in London. My dad keeps season tickets, and I usually travel in for several games a year. It's a fun bonding experience, and I've seen all sorts of miraculous things happen, like Steve Gleason blocking the punt the night the Superdome reopened, Garrett Hartley kicking the Saints into the Super Bowl, and Tracy Porter closing the deal on that Super Bowl.


In general, I have pretty good travel luck. I suppose it helps that I live in a hub city (Philadelphia), so most of my flights are nonstop. Unlike my good friend Karen Lopez, who has her own hashtag (#karenstravelstories), I mostly have an occasional delay here or there, but I usually get where I am going without issue.

Today, however, that would not be the case. We were a few minutes late leaving Dallas, and I knew there were some pretty bad thunderstorms around the New Orleans area. As we made our approach into Louis Armstrong airport the turbulence was incredible, and I was genuinely concerned about our ability to make a safe landing. As it turns out, so was the captain of the plane. We aborted the landing and circled briefly before heading to scenic Shreveport, Louisiana (which is famous for Hope Foley having grown up there) to refuel and hopefully come back to New Orleans.

Oh, but American…

So we landed in Shreveport, and the pilot mentioned a minor mechanical issue that had to be checked out by a mechanic in addition to our refueling, so to his credit he let everyone off of the plane. The flight crew was absolutely fantastic in general, but there was one problem: we were basically in a complete information blackout. I was chatting with the flight attendants and they were in the dark as well. After a couple of hours of this, I really just wanted them to cancel the flight so I could get my checked bags (yes, slap me now) and get a rental car. The on-call mechanic (who was a contractor and not an AA employee) was apparently trick-or-treating with his kid and couldn't make it.

American didn't officially cancel the flight until shortly after 10 PM (we had landed in Shreveport at 6:30).

My DR Plan, or Why I’m Blogging This..

As a business traveler, you learn to plan for adversity and always have three fallback plans. It's just like dealing with a major database issue—you need to have a plan of attack. As soon as I heard we were landing in Shreveport, I knew I was pretty screwed. It's a tiny airport with next to no service to anywhere that's not a hub. In addition to the Saints game, I needed to make my Alaska flight the next night at 7 so I could get to MVP Summit. So I had a couple of options:

  • Wait out the delay—this was problematic, but a plane to where you are going is always the best bet
  • Get a rental car and drive to New Orleans—it’s a 4-5 hour drive, but I couldn’t get my baggage
  • Get a rental car and drive to Dallas and get AA to book me on the first flight to New Orleans tomorrow morning
  • Stand in line with the other people at the airport and try to get rebooked (LOL, don’t ever do this—you’ll get to where you’re going next Thursday)

After a certain point in time, I decided to book the rental car. I knew we weren't going to get out on that plane until really late, and I started hearing murmurs of cancellation from the flight crew (pro tip—always stand next to the flight crew). As soon as the flight was officially cancelled, I took off to the rental car counter and got a car reserved before the horde got there. I was in my car with my bags within 30 minutes.

Never Outdrive Your Headlights

This is a quote from "The Art of Racing in the Rain" that I used both metaphorically and literally tonight. It was late, but I had the benefit of being on west coast time and of napping on my flight from Seattle. So I didn't feel tired at all, but my fallback plan was to stop and get a hotel if needed. It was also raining the whole way, which made the level of concentration required a little extreme, but I think it helped me stay awake. For example, I was mentally composing this post in my last hour on the road. The same theory applies to solving a database problem—always look at what's in front of you before trying to chase down other problems that might not be real, or that you can't yet see.

It’s Not Always a SAN Problem

Tomorrow I’ll be doing a new webinar on how to identify performance problems related to storage in SQL Server. Sometimes, what looks like a SAN problem to SQL Server may just be related to missing indexes, or not enough memory, or something else that’s forcing SQL Server to do way too much I/O. In this session you’ll learn how to:

  • Identify when SQL Server is having storage-related issues
  • Understand what's happening from the perspective of the OS
  • Know which performance counters are useless when you are on a SAN
  • Talk to your SAN admin about performance

Join me and my friends at Embarcadero for this great webcast at 11:00am Pacific / 1:00pm Central / 2:00pm Eastern on Wednesday September 16.
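As a small preview of the first bullet above, here is a minimal sketch of one way to see whether SQL Server is actually waiting on storage: per-file read latency from sys.dm_io_virtual_file_stats. The instance name is a placeholder, and it assumes Invoke-Sqlcmd from the SQL Server PowerShell module is available.

# Sketch: average read latency (ms) per database file. High values point at storage;
# low latency with huge read counts points back at indexes or memory instead.
$query = @"
SELECT DB_NAME(vfs.database_id) AS database_name,
       vfs.file_id,
       vfs.num_of_reads,
       vfs.io_stall_read_ms / NULLIF(vfs.num_of_reads, 0) AS avg_read_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
ORDER BY avg_read_latency_ms DESC;
"@

Invoke-Sqlcmd -ServerInstance "MyServer" -Query $query | Format-Table -AutoSize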

SQL Saturdays are Not Your Personal PBS Pledge Drive

I'm a big fan of PBS (and NPR) and a member of our local station, but I am still annoyed as hell when I watch something that purports to be a real show on PBS and it turns out to be a pledge drive. I can almost always guess which ones, too—it's never the MacNeil/Lehrer NewsHour; it's usually a concert from some pretty solid artist who has broad appeal across the PBS viewing audience. Anyway, I'm rambling. Let's get back to SQL Saturday—most of the speakers at SQL Saturdays, Code Camps, and to an extent the PASS Summit are volunteers; we pay our own travel expenses and sometimes take time off of real jobs to be there. Sure, some people work for vendors that fund their travel, and others (I fall into this category) have their own companies set up to make travel deductible, but no one is paying us to be there. I'm not complaining about that—it's part of what makes SQL Saturday a great event. It's an event run for the community by the community, and I hope it always stays that way.

I was at a recent SQL Saturday where I happen to be extremely close with the organizer, and there was a speaker (from outside of the SQL community) who was making a ton of demands on the organizer: wanted to be in a special room, wanted peak time slots (not for travel reasons, just for a larger audience), special equipment, etc. Listen, if there's one person who has it much harder than the speakers, it's the SQL Saturday organizer. You're trying to manage 30 speakers, some of the most interesting (could you please read your GD email) questions from attendees, and dealings with venue people and caterers, and you just don't have time to deal with the needs of a single prima donna speaker. If you want a good way to never get invited to speak at that SQL Saturday again, I recommend submitting a list of demands to the event organizer. No, seriously, don't do that, ever. If you do it in Philadelphia, one of our rooms will likely be under construction, but only during the hour you were scheduled to talk. But wait, there's more.

Figure 1 My SQL Saturday Rider

So after all of this happened before the event, I really wanted to see this session. After all, if this speaker had all of these special demands, they must be really amazing, right? The session was on professional development—and while most of the advice was reasonably sound, there was a disturbing trend of the speaker answering questions with "it's in my book" or "it's in the DVD on my site." My personal favorite was when someone asked for a copy of the slides (which speakers generally upload as a PDF or even PPT to the SQL Saturday website) and the speaker responded, "You can buy the DVD, they're on there; they don't pay us to come speak at these events, so I have to make some money somehow." Yeah, buddy, we all paid to be here, you jerk. There were a couple of other things that bothered me, like some mild insulting of the audience, but that's beyond the scope of this post. This felt like the infomercial where they give you a taste of a product, but if you want the real deal you have to call 1-800-SCAM-ME, give a credit card number, and wait 4-6 weeks for your delivery. But wait, there's more.

So I left a few minutes early, but it turns out the speaker was conducting cash sales of his book and DVD from the podium after his session, to the extent that it delayed the next speaker. That is always the classiest thing you can do as a speaker at a conference (it's a good way to show people how important you are). At SQL Saturdays we have these people that let us run the event—they are called sponsors, and they don't appreciate it when someone who hasn't paid for the right to sell things at the event does so anyway.

Those are the words of my friend Vicky, who works for a frequent SQL Saturday sponsor (and full disclosure, client of mine) Idera.


If you want to make sure you aren't doing it wrong, just be sure to follow the Denny Cherry and Associates' HR policy: Wheaton's Law.

Why The Query Store is Awesome and You Should Immediately Use It

I haven't been blogging enough recently—I've been busy writing, traveling, and speaking all around the world, and trying to get ready for my PASS PreCon on Power BI. It's been a busy year—it's barely September, and I've already requalified for my frequent flier status. Anyway, you don't care about me. Let's talk about SQL 2016—there are a myriad of new features and enhancements that are truly awesome. And for those of you who are all "old man yells at cloud," most of the new features and enhancements will help you in your on-premises environment. Anyway, there's more time for that between now and whenever 2016 gets released. Let's talk about the Query Store.

I got an email from one of my clients (these guys are brave and already live on 2016—they, along with Microsoft, have that much confidence in the code) about a query that was running poorly over the weekend (worse than in the older environment). A little bit about their environment—it's largely a data warehouse type solution, with the goal of delivering data sets to their clients. In the upgrade to 2016, we did a rearchitecture that heavily leveraged clustered columnstore indexes and took advantage of Availability Groups for scale-out reads. So remember when one of your customers would email you about something that was slow over the weekend, and you would desperately scour the plan cache, possibly writing XQuery, to look for the needle in the haystack of a query that was performing poorly?


Figure 1 Query Tuning in SQL 2014


Enter the Query Store. I opened up the database and found the Query Store node in Object Explorer.
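One quick note: the Query Store is off by default in SQL Server 2016, so if you don't see any data under that node, it has to be enabled first. A minimal sketch, with placeholder server and database names:

# Sketch: turn on the Query Store for a (placeholder) database with its default settings
Invoke-Sqlcmd -ServerInstance "MyServer" -Query "ALTER DATABASE [MyDatabase] SET QUERY_STORE = ON;"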


I can see "Top Resource Consuming Queries"—if I click on that, a report will launch. (It will default to the last hour—click Configure in the top right to change the time interval and the resource you are curious about.)



Since I wanted to see queries with heavy logical reads from last weekend, I changed the resource type to logical reads and the duration to "Last Week". And then I get this:


Well, that was easy—I see the clear outlier of a query, and I see the index recommendation. I also get the execution plan, where I note the biggest expense is that SQL Server is doing a full scan of a fact table. Writing this post took me longer than it did to isolate that query and propose a couple of tuning options to the client. The Query Store is awesome.
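If you'd rather query than click, the same information is available from the Query Store catalog views. Here is a hedged sketch that pulls the top queries by average logical reads over the last seven days; the server and database names are placeholders, and again it assumes Invoke-Sqlcmd from the SQL Server PowerShell module.

# Sketch: top 10 queries by average logical reads over the last 7 days,
# straight from the Query Store catalog views. Names are placeholders.
$qsQuery = @"
SELECT TOP (10)
       qt.query_sql_text,
       AVG(rs.avg_logical_io_reads) AS avg_logical_reads,
       SUM(rs.count_executions)     AS executions
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
JOIN sys.query_store_runtime_stats_interval AS i
    ON i.runtime_stats_interval_id = rs.runtime_stats_interval_id
WHERE i.start_time >= DATEADD(DAY, -7, SYSUTCDATETIME())
GROUP BY qt.query_sql_text
ORDER BY avg_logical_reads DESC;
"@

Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDatabase" -Query $qsQuery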





Picking the Right Fights With Your Server Team—Embarcadero Webinar

Today I'm going to be doing a webinar with my friends at Embarcadero entitled "Picking the Right Fights with Your Server Team". I got to do this presentation in Indianapolis recently, and I was really happy with it. You'll learn what to ask the sysadmins for on the following topics:

  • SAN Configuration
  • Server Configuration
  • Windows Settings
  • Virtualization

These are topics a lot of DBAs typically don't have a lot of insight into, so in this session you'll learn how to speak the language of the other teams: IOPS, thin provisioning, and OUs. Join us at 11 AM PDT/2 PM EDT/7 PM GMT here.

Idera Geek Sync—Dealing with Bad Roommates: SQL Server Resource Governor

Next Monday at 1000 CDT/1100 EDT/1500 GMT I will be doing a webinar with Idera on SQL Server Resource Governor. In my time as a SQL Server DBA, it has been rare, especially in small to medium-sized companies, to have only one application's databases running on a given SQL Server. SQL Server licensing is expensive—frequently far more expensive than the cost of server hardware, so workloads get combined. It is not uncommon to see an application like SharePoint running on the same server that hosts a data warehouse. Since these applications have really diverse requirements for their system configurations, this can lead to performance problems and arguments between application teams. So what is an overworked DBA to do? Starting with SQL Server 2008, Microsoft introduced Resource Governor, a feature that lets you control CPU and memory for a given workload. This was great for controlling applications like SharePoint that don't want parallelism, but what about that report your sales manager runs that overwhelms the storage array with its 15 million reads? Well, there's an answer for that too—in SQL 2014 Microsoft introduced an I/O limit for Resource Governor.
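To make that I/O piece concrete, here is a hedged sketch of what a capped pool might look like in SQL Server 2014 and later. The pool, workload group, and server names are made up for illustration, and you would still need a classifier function to route sessions into the new group; I'm leaving that out for brevity.

# Sketch: a Resource Governor pool that caps CPU and, in SQL Server 2014 and later,
# IOPS per volume for a hypothetical reporting workload. All names are illustrative.
$rgScript = @"
CREATE RESOURCE POOL ReportingPool
    WITH (MAX_CPU_PERCENT = 30, MAX_IOPS_PER_VOLUME = 500);

CREATE WORKLOAD GROUP ReportingGroup
    USING ReportingPool;

ALTER RESOURCE GOVERNOR RECONFIGURE;
"@

Invoke-Sqlcmd -ServerInstance "MyServer" -Database "master" -Query $rgScript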

In this one hour webinar, you will learn about:

  • Why you want to use resource governor
  • How resource governor works inside of SQL Server
  • How to implement resource governor in your environment
  • Why Microsoft will continue to enhance resource governor

I really like this feature and feel that it is underutilized—please join me next Monday. You can register here.

Webinar—Managing a Multi-Platform Environment

Today, I'll be doing a webinar for Embarcadero on managing a multi-platform database environment. As some of you know, I've worked on both Oracle and SQL Server throughout my career—each RDBMS has its nuances, and they are just different enough to make transitioning to a new platform quite challenging.

In this webinar, you will learn about:

  • Challenges to working in a multi-platform environment
  • How Oracle compares and contrasts with SQL Server, and vice versa
  • Benefits of tools to simplify management

I came across an interesting study recently showing that over 70% of organizations support more than one database platform. I see this in many organizations that I consult with; however, I don't see nearly as much crossover between DBAs. You will also learn how to build these skills for yourself or for your organization.

I hope you can join us on Wednesday June 17th at 11 AM PDT, 2 PM EDT, 1800 GMT. You can register at the link above.


The Great Brain Robbery

Originally posted on Jacquie Phelan's Weblog:

Teatime at Offhand Manor. Photo: Carl Gooding


In 1999, Trek bike company parasitized my years of hard work, good will, and brand management to sell to women. I used to be able to produce and sell out women's camps…now I barely exist. Women's Mountain Bike & Tea Society™ was fifteen years old when someone at Trek woke up and found out that half the population is not young white men.

Here's a 1999 WOMBATS home page (above) and Trek's rip-off ad (below).

Our web site was designed by New York artist Hadley Taylor, who saw an opportunity to showcase her humor and talent for organizing information in an easy-to-follow, engaging way. The WOMBATS website won a "25 best websites" award from Wired Magazine that year (1997). Hadley made a Wombats Art Gallery, complete with "virtual cheese" and "virtual wine", galleries to stroll through featuring black and white photos of women…


Webinar: Becoming a Top DBA—Learning Automation in SQL Server

I'm doing a webinar this Wednesday for Embarcadero (at 11 AM PDT / 1 PM CDT / 2 PM EDT / 6 PM GMT) on a topic I'm quite passionate about—automation. The value you truly bring to your job as a database professional isn't in doing mundane, repeatable tasks like installing SQL Server, running T-SQL scripts, or even putting out minor fires like a drive running out of space. The real value added by an outstanding DBA is the skill to manage far more servers by automating the mundane (potentially saving the firm the cost of a full-time employee), to tune performance on critical business systems, and most importantly to recover critical data in the event of failures.

In consulting, I see a lot of organizations where the DBAs are really under a ton of pressure—there are never enough resources (have you tried to hire a DBA lately? It's really hard), there's more and more data coming from a wider variety of systems, and at the same time the DBA has new skills to try to learn, like cloud and virtualization. So how do you get yourself out of this endless cycle and raise your profile both in your company and in the community? Automation—if you can minimize the amount of time you have to spend on low-value tasks, you can maximize, and even free up, time to learn new skills.
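As one small example of what I mean, here is a hedged sketch that checks free disk space across a handful of servers before anyone gets the 2 AM page. The server names are placeholders, and it assumes you have CIM/WinRM connectivity and permissions to those machines.

# Sketch: free space (GB and %) for fixed disks on a few placeholder servers
$servers = "SQL01", "SQL02", "SQL03"

Get-CimInstance -ClassName Win32_LogicalDisk -ComputerName $servers -Filter "DriveType=3" |
    Select-Object PSComputerName, DeviceID,
        @{Name = "FreeGB";  Expression = { [math]::Round($_.FreeSpace / 1GB, 1) }},
        @{Name = "PctFree"; Expression = { [math]::Round(100 * $_.FreeSpace / $_.Size, 1) }} |
    Sort-Object PctFree |
    Format-Table -AutoSize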

In this one hour webinar, I’m going to talk about a few things:

  • Tools used for automation in SQL Server
  • Examples of automation for the good
  • Techniques that you can use to automate nearly any task
  • I'm going to mention the phrase DevOps

I hope you can join me—I think a lot of DBAs can get a great deal of benefit from learning these techniques and applying them on a day-to-day basis.

