Challenges of Speaking—Being Ready for Anything

I have the good fortune to have a job that allows me to travel around the world speaking about technology. I love this—it gives me a way to see new places, experience other cultures, and meet a lot of wonderful people who have similar interests to mine. I always say about half of my job is working on computers and databases, and the other half is talking and writing about them. I’ve been speaking professionally for almost 10 years now, and I’ve run into a lot of unique situations.

One of the biggest challenges is that I speak a lot about cloud computing, which generally involves connecting over a myriad of ports, on internet connections that may or may not be blocked. Here are some tips I’ve used to get to the cloud from various bad network configurations.

  • Use a VPN to remote desktop to another machine that can get out
  • When you realize the network is blocking RDP packets, use your phone to tether
  • Set up a VM in Azure with RDP listening on port 443 (the HTTPS port) and RDP to that
  • If all else fails, use screenshots
All of these are complicated by foreign countries, different networks, different network policies, etc.

The one that takes the cake for challenges, though, is losing the projector. This happened to me for the first time at a SQL Saturday in Houston—I was doing a session on SQL Server High Availability and Disaster Recovery and the projector locked up. It was a session I easily could have done on a whiteboard. However, the staff was able to get me up and running again.

Recently, my machine locked up during a session at SQL Saturday Cleveland. While I was trying to get it working, I calmly pulled out my second machine, fired it up, and kept going; when I had another failure, I restarted on my DR machine. I’m not saying you need to have two laptops in your bag, but it is a nice thing to have.

Finally, this week while I was teaching in India, I lost video again—and this time there weren’t good whiteboards. My topic was networking in Azure, so I worked with another presenter to do a live session with each of us representing parts of the network. The session went over quite well.

So what are the tips you need to know?

  • Always have a DR plan (this goes for travel, presenting, IT, and life in general)
  • Stay calm; getting overly nervous does neither you nor the audience any good
  • If things go really sideways, don’t be scared to co-present with someone. It’s a good rescue mechanism
  • Your audience wants you to succeed; they aren’t against you. So relax.

When all else fails, scotch.

Getting Started with Polybase—Incorrect Syntax near ‘EXTERNAL’ Error

I was playing around with configuring Polybase this morning. Nothing too exotic, just installing the feature and getting started. I got everything installed and was about ready to create my data source. The command to do that is as follows:

CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasb://polybase@mystorageaccountnamehere.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);

Unfortunately, it returned the following error:

Msg 102, Level 15, State 1, Line 1

Incorrect syntax near 'EXTERNAL'.

So what was the fix? A full reboot of the server after installing Polybase. The official documentation refers to restarting SQL Server, which I did, but nothing warned me that I needed to reboot the whole machine. Rebooting fixed the error.
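
As an aside, the AzureStorageCredential referenced in the data source above is a database scoped credential you have to create first. Roughly, it looks like the sketch below; the identity and secret values are placeholders for your storage account name and access key.

-- A database scoped credential requires a master key in the database
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'SomeStrongPassword!1';
GO
-- Placeholder values: use your storage account name and access key
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH IDENTITY = 'mystorageaccountnamehere',
SECRET = '<your storage account access key>';
GO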

Using SQL Sentry Plan Explorer to Shed Light on Estimated Execution Plans

Note: While I do have business relationships with many vendors, I do not have one with SQL Sentry aside from attending their great annual party at the PASS Summit.

SQL Server gives us access to two types of execution plans: estimated and actual. Actual execution plans show the real row counts and statistics the optimizer observed when executing the query, while the estimated plan shows the estimates the SQL Server optimizer made, based on statistics, when it generated the plan. The ideal scenario for performance tuning is to have the actual plan, but when you are dealing with a long-running query that takes hours, and a busy procedure cache, that can be a real challenge to get.
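
As a quick refresher, here is roughly how you capture each kind of plan from T-SQL (the SELECT here is just a stand-in for whatever query you are tuning):

-- Estimated plan: returns the plan XML without actually running the query
SET SHOWPLAN_XML ON;
GO
SELECT COUNT(*) FROM dbo.MyBigTable;  -- placeholder query
GO
SET SHOWPLAN_XML OFF;
GO

-- Actual plan: runs the query and returns the plan with runtime row counts
SET STATISTICS XML ON;
GO
SELECT COUNT(*) FROM dbo.MyBigTable;  -- placeholder query
GO
SET STATISTICS XML OFF;
GO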

So yesterday, I was helping a friend tune a couple of queries, one of which was running for hours, and the other in minutes. I got my hands on the estimated execution plans and loaded them quickly into SQL Sentry Plan Explorer (which has a free version). I can’t recommend this tool enough—for simple query plans Management Studio is OK, but the second you have any level of complexity, Plan Explorer lets you quickly break the plan down and identify where the problems are. As in this case:

Figure 1: The Good Query

Figure 2: The Bad Query

I quickly honed in on the fact that the bad query was doing a lazy spool with 6 BILLION rows, versus the 229 million in the good query. However, my friend, who was looking at Management Studio, asked where I was getting those numbers. This is what the XML from the original plan showed:

<RelOp AvgRowSize="51" EstimateCPU="0.00889634" EstimateIO="0.01" EstimateRebinds="0" EstimateRewinds="139581" EstimatedExecutionMode="Row" EstimateRows="48868"
LogicalOp="Lazy Spool" NodeId="55" Parallel="true" PhysicalOp="Table Spool" EstimatedTotalSubtreeCost="1242.86">

I made the assumption that Plan Explorer was doing some math on the execution plan, so I contacted my good friend Aaron Bertrand (b|t), who works at SQL Sentry, and asked him about it. His response was that “Plan Explorer will sometimes adjust the displayed row counts for certain operators where we know execution count is greater than one”. This is a really great use case for getting better data out of estimated execution plans when that is all you have to work with.
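
My guess at the math, and it is only a guess, is that Plan Explorer multiplies the operator’s EstimateRows by its estimated number of executions (rebinds plus rewinds plus the initial execution). Plugging in the values from the XML above:

48,868 rows × (0 rebinds + 139,581 rewinds + 1) ≈ 6.8 billion rows

which lands in the same ballpark as the number Plan Explorer showed for the spool.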

Are You Still Installing SQL Server from the GUI?

This post was inspired by a tweet from Robert Davis.

A little bit of a story—I once had a client who was insanely worried about renaming SQL Servers for a migration. Note: it’s no big deal to rename an offline SQL Server—the ONLY thing you need to change is the internal server name, by running sp_dropserver and sp_addserver. There is a single registry key, “ORIGINAL_SERVER_NAME”, that stays the same across multiple server renames; every other SQL registry key changes when you rename the server. So there is nothing to worry about in the registry if you do a name change. This name change process was part of our automation strategy for cloud at Comcast, and if I had to guess, it’s also part of what Microsoft is doing in Azure.
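
If you haven’t done the rename before, it’s roughly the sketch below (MYOLDSERVER and MYNEWSERVER are placeholder names), followed by a restart of the SQL Server service:

-- Update the name SQL Server reports internally (run after the Windows rename)
EXEC sp_dropserver 'MYOLDSERVER';
GO
EXEC sp_addserver 'MYNEWSERVER', 'local';
GO
-- After the service restart, verify the change took
SELECT @@SERVERNAME;
GO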

Anyway, on to my original topic—I know most shops aren’t Comcast, where we were deploying multiple servers a week. However, automating your installation process has a lot of other benefits—you don’t have to worry about misconfigurations, unless you put them into your script. It also forces you to adopt a standard drive letter approach across all of your servers, so when something breaks, you know exactly where to go. And, more importantly, you save 15 minutes of clicking Next.

Courtesy of XKCD

I love this comic from XKCD. It doesn’t take a lot of time to automate this process. So how do you do it?

It’s easy—run through the installer, using all of your best practices and service accounts like you would for any new install. When you get to the end, right before you click Install, you’ll notice this:

See that file? Save it somewhere handy and then cancel out of the install. There are a couple of edits you’ll need to make to get it automation ready; the biggest is that you want your install to run in either quiet or quiet simple mode (I prefer quiet simple—it shows the GUI, but doesn’t allow any interaction with it). From there it’s as simple as running the following command (this assumes you have the SQL media mounted under D:):

D:\setup.exe /ConfigurationFile=C:\temp\GoodConfigFile.ini
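
For reference, the edits I’m talking about look roughly like the excerpt below. This is a hypothetical slice of the saved ConfigurationFile.ini, not the whole file, so check the parameter names against your SQL Server version.

; Run unattended, but show setup progress (quiet simple mode, no prompts)
QUIETSIMPLE="True"
QUIET="False"
; UIMODE cannot be combined with a quiet install, so remove or comment it out
; UIMODE="Normal"
; Lets an unattended install get past the license screen without prompting
IACCEPTSQLSERVERLICENSETERMS="True"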
