The Great Brain Robbery

Originally posted on Jacquie Phelan's Weblog:

Teatime at Offhand Manor. Photo: Carl Gooding


In 1999, Trek bike company parasitized my years of hard work, good will, and brand management to sell to women. I used to be able to produce and sell out women's camps…now I barely exist. Women's Mountain Bike & Tea Society™ was fifteen years old when someone at Trek woke up and found out that half the population is not young white men.

Here’s a 1999 WOMBATS home page (above) and  Trek’s rip-off ad (below).

Our web site was designed by New York artist Hadley Taylor, who saw an opportunity to showcase her humor and talent for organizing information in an easy-to-follow, engaging way. The WOMBATS website won a "25 best websites" award from Wired Magazine in 1997. Hadley made a Wombats Art Gallery, complete with "virtual cheese" and "virtual wine", galleries to stroll through featuring black and white photos of women…


Webinar: Becoming a Top DBA—Learning Automation in SQL Server

I’m doing a webinar this Wednesday for Embarcadero (11 AM/1100 PDT, 1 PM/1300 CDT, 2 PM/1400 EDT, 6 PM/1800 GMT) on a topic I’m quite passionate about—automation. The value you truly bring to your job as a database professional isn’t in doing mundane, repeatable tasks like installing SQL Server, running T-SQL scripts, or even putting out minor fires like a drive running out of space. The real value added by an outstanding DBA is the ability to manage far more servers by automating the mundane (potentially saving the firm the cost of a full-time employee), to tune performance on critical business systems, and most importantly, to recover critical data in the event of failures.

In consulting, I see a lot of organizations where the DBAs are under a ton of pressure—there are never enough resources (have you tried to hire a DBA lately? It’s really hard), there’s more and more data coming from a wider variety of systems, and at the same time the DBA has new skills to learn, like cloud and virtualization. So how do you get yourself out of this endless cycle and raise your profile, both in your company and in the community? Automation—if you can minimize the amount of time you spend on low-value tasks, you can free up time to learn new skills.

In this one hour webinar, I’m going to talk about a few things:

  • Tools used for automation in SQL Server
  • Examples of automation for the good
  • Techniques that you can use to automate nearly any task (there’s a small taste in the sketch below)
  • I’m going to mention the phrase DevOps
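
To give a flavor of what I mean, here is a minimal T-SQL sketch of "automating the mundane" (the backup path and WITH options are placeholders; adjust for your environment). It builds and runs a backup command for every online user database, so adding a database never means editing a job step:

    -- Build one BACKUP DATABASE statement per online user database.
    -- X:\Backups\ is a placeholder path; point it at real backup storage.
    DECLARE @sql nvarchar(max) = N'';

    SELECT @sql += N'BACKUP DATABASE ' + QUOTENAME(name)
                 + N' TO DISK = ''X:\Backups\' + name + N'.bak'''
                 + N' WITH COMPRESSION, CHECKSUM;' + CHAR(13)
    FROM sys.databases
    WHERE database_id > 4            -- skip the system databases
      AND state_desc = 'ONLINE';

    EXEC sys.sp_executesql @sql;

Wrap something like that in an Agent job and it scales from five databases to five hundred without another edit.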

I hope you can join me—I think a lot of DBAs can get a great deal of benefit from learning these techniques and applying them on a day-to-day basis.

Don’t Shoot Yourself in the Foot–Azure Hybrid Scenarios

I’m working with a great company named Opsgility to produce some training around using SQL Server and Azure SQL Database. Some of the demos I’ve been producing are around using hybrid scenarios without Active Directory and/or a VPN connection to Azure. Some people may unfortunately encounter these scenarios and have to use them. I’m sorry!

I know this is a SQL Server focused blog, and most of my readers aren’t AD or networking experts (heaven knows, I’m not). But let me state this—if you are building a hybrid environment, follow these two steps:

GET A $%^&ing VPN CONFIGURED BETWEEN YOUR ON-PREM SITE AND AZURE

This will make everything way easier—no firewalls (except for the pesky Windows one) to worry about, no opening SQL Server to the public internet, none of that nonsense. You’ll be able to ping and connect to your servers just as if they were in your data center. A network gateway costs roughly $30/month; my rate is way higher than that, and without a VPN it took me most of the day yesterday to get log shipping working in this scenario.

USE ACTIVE DIRECTORY AND HAVE A SECOND DC IN AZURE

Not having AD makes everything really painful. In order to make log shipping work, I had to set both of my SQL Server service accounts to the same user id and password—I don’t think that configuration is inherently insecure, but it just feels dirty. It also makes mirroring setup a lot more painful than it needs to be (see the sketch below). If you are using Availability Groups, of course, you’ll have AD, as it’s a requirement of clustering, but some of these other scenarios (replication, log shipping, mirroring) don’t require it. Set it up anyway.
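
For what it’s worth, the usual workaround for mirroring without AD is certificate authentication on the database mirroring endpoints. Here is a minimal sketch of the local half of that setup (the names, port, and passwords are placeholders; the certificate still has to be exported to the partner and mapped to a login there, which is exactly the kind of ceremony AD spares you):

    USE master;
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE CERTIFICATE Mirroring_Cert
        WITH SUBJECT = 'Endpoint authentication for mirroring';

    CREATE ENDPOINT Mirroring_Endpoint
        STATE = STARTED
        AS TCP (LISTENER_PORT = 5022)
        FOR DATABASE_MIRRORING (
            AUTHENTICATION = CERTIFICATE Mirroring_Cert,
            ENCRYPTION = REQUIRED ALGORITHM AES,
            ROLE = ALL);

    -- Export the public key so the partner instance can authenticate this one.
    BACKUP CERTIFICATE Mirroring_Cert TO FILE = 'C:\certs\Mirroring_Cert.cer';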

When you have AD and a VPN, Azure feels like an extension of your data center. When you don’t have these things, everything feels like a painful kludge. Don’t build a Rube Goldberg machine.

How Much Do SQL Server Licenses Cost?

So I’m being a little bit of an SEO whore in this post, because it took way too many Google searches to get this data.

 

  • SQL Server 2014 Enterprise Edition—core licensing only: $27,495 for 4 cores, $13,747.50 per additional 2-core pack.
  • SQL Server 2014 Standard Edition—core licensing: $7,171 for 4 cores, $3,585.50 per additional 2-core pack; server plus CAL licensing: $898/server (CALs are $170/named user).
  • SQL Server 2014 BI Edition—server plus CAL licensing only: $8,592/server (CALs are $170/named user).
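
For example (my arithmetic, using the list prices above): core-licensing a single 16-core Enterprise Edition server comes to $27,495 for the first 4 cores plus six additional 2-core packs at $13,747.50 each, or $109,980 total, before any volume discounts.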


Managed Backup to Azure—Configuring on Your Server

Anything that simplifies backup is something I’m a big fan of—as a DBA, it’s your job to make sure your company can restore its critical data in the event of hardware failure, user error, or natural disaster. So making the process simple and easy is a good thing, right? Another important thing is sending your backups to a second location—in the event of a natural disaster or storage failure (please don’t tell me you are backing up your databases to the same SAN they live on), it’s nice to have a DR site. In the old days, we would back up to tape and then ship the tapes off to a secure location. In doing so, we opened the potential for a lot more people to steal our data, and managing tapes is a nightmare. Many companies now are taking advantage of the cloud to archive their backups, or in some cases to be their primary backup location. Both Amazon and Microsoft have really good (and really cheap—as low as a penny per gigabyte per month) options for doing this. As you might expect, Microsoft’s solutions are tightly integrated with SQL Server, and pretty easy to use.

In SQL Server 2014, Microsoft introduced two features that bring this cloud backup solution to the fore—backup encryption (independent of transparent data encryption (TDE)) and managed backup to Azure. Managed backup to Azure is also known internally as smart_admin (that’s the schema that owns the objects which run this feature). You simply specify a retention period (in days) for your backups, supply a credential, and SQL Server does the rest. There are a few limits—system databases aren’t currently supported, there is a maximum database size of 1 terabyte, and your databases have to be in the full or bulk-logged recovery model. Aside from that, SQL Server determines the frequency and the type of the backups based on the workload of your database. You can find the full details of how SQL Server makes these decisions here.

You’ll need a couple of things to configure managed backup:

  • A SQL Server instance with databases (if you don’t have those, why are you here? 🙂)
  • A SQL credential
  • An Azure storage account
  • An Azure management certificate (this comes in the publish settings file we’ll download below)

The first thing you’ll need to do is create your credential—this is where you’ll need your Azure publishing profile. You’ll need to work with whoever manages your Azure account to get permissions for this task, but to do it we’ll break out PowerShell (Microsoft Azure PowerShell, which you’ll need to download here). Run the command Get-AzurePublishSettingsFile.

 

From there, Azure will launch a webpage (and may prompt you to log in with your credentials). In my case, I was logged into my account, so my file was automatically downloaded. Now to Management Studio. Note—if you want to encrypt your backups, you’ll want to create a master key and a certificate for database backup. Please do me a favor and back these up someplace safe, immediately. Better yet—put them in a container in your Azure storage account. You won’t be able to restore your encrypted backups without these, so it’s pretty important.
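
In case it saves you a search, creating and backing up those objects looks something like this (the names, paths, and passwords are placeholders):

    USE master;
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE CERTIFICATE ManagedBackupCert
        WITH SUBJECT = 'Managed backup encryption certificate';

    -- Back these up immediately; encrypted backups cannot be restored without them.
    BACKUP CERTIFICATE ManagedBackupCert
        TO FILE = 'C:\keys\ManagedBackupCert.cer'
        WITH PRIVATE KEY (
            FILE = 'C:\keys\ManagedBackupCert.pvk',
            ENCRYPTION BY PASSWORD = '<another strong password>');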

OK—so now you have a publish settings file and a certificate, and you are ready to set up managed backup. First, find it by expanding the Management node in SSMS; right-click on Managed Backup and select Configure.

 

You’ll notice that this reports that you have selected an invalid credential, so go ahead and click “Create”. Another screen will pop up here:

Browse to where you downloaded your publish settings file. Next, click the drop-down box and select the storage account you want to use; the drop-down will enumerate any Azure blob storage accounts to which your account has access. The naming convention here is account:container—your container name will be $SERVERNAME-$INSTANCENAME. Go ahead and click Create.
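
Under the covers, the dialog is creating an ordinary SQL Server credential that points at your storage account. If you’d rather script that step, it’s something like this (the account name and key are placeholders):

    CREATE CREDENTIAL ManagedBackupCredential
        WITH IDENTITY = 'mystorageaccount',        -- Azure storage account name
        SECRET = '<storage account access key>';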

Now you can check the box to “Enable Managed Backup” and the “Encrypt Backup” checkbox. Select the certificate you created above and choose your encryption algorithm; I’m using AES 256 here.

 

 

Now that we have managed backup enabled, we need to set up a retention period. This is a T-SQL operation, and we have the option to set retention either at the instance level or at the individual database level.
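
At the database level, the call looks roughly like this; the database, credential, and certificate names below are placeholders for the ones you created above, and you should double-check the parameter list against the smart_admin documentation for your build:

    EXEC msdb.smart_admin.sp_set_db_backup
        @database_name = 'AdventureWorks2012',
        @enable_backup = 1,
        @retention_days = 30,
        @credential_name = 'ManagedBackupCredential',
        @encryption_algorithm = 'AES_256',
        @encryptor_type = 'Certificate',
        @encryptor_name = 'ManagedBackupCert';

The instance-level counterpart is smart_admin.sp_set_instance_backup, which newly created databases then pick up.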

 

You’ll need the name of your certificate and the name of your credential. Your backups are now officially starting. One last thing—configure monitoring, so you can get notified when your backups are taking place.
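
The monitoring hook is a smart_admin parameter. Assuming Database Mail is already configured, something like this (the address is obviously a placeholder) turns on email notification for managed backup errors and warnings:

    EXEC msdb.smart_admin.sp_set_parameter
        @parameter_name = 'SSMBackup2WANotificationEmailIds',
        @parameter_value = 'dba@yourcompany.com';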

 

That procedure, with that specific parameter and input, will set up your notifications, and you are now good to go. My backup started.
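
If you want to check the configuration from T-SQL, the smart_admin schema also exposes a function for that; passing NULL should return the settings for every database (column list per the SQL Server 2014 documentation, so verify on your build):

    SELECT db_name, is_managed_backup_enabled, retention_days, credential_name
    FROM msdb.smart_admin.fn_backup_db_config(NULL);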


Troubleshooting Power BI Preview Tabular Data Refresh

Just before Christmas, Microsoft delivered a present to business intelligence (BI) geeks everywhere—we got access to the Power BI Dashboard preview, along with a few new toys: a dedicated client that integrates all of the tools in the Power suite, and a new data gateway that allows us to refresh Power BI charts from an on-premises tabular model. In addition to tabular, Microsoft gives us the option to connect to a wide variety of cloud data sources (see Figure 1), along with a new API for Power BI that enables developers to connect their own custom data sources. All of these are great features.

Figure 1 Get Data Screen in Power BI Dashboard

Since Stacia Misner (b|t) and I are doing a precon on hybrid Power BI in Nashville in a couple of weeks, I wanted to set up a demo environment. I thought this would be pretty easy—and the initial setup was. The first thing you need to do is download the Analysis Services connector from the Power BI website.

Figure 2 Download Analysis Services Connector

At first, I didn’t think too much about this. I had a small tabular database on my laptop, and I tried to connect to it. I’m not going to walk through the entire setup here (it’s pretty basic). When I went into Power BI to try to get data from my on-premises source, I got the following error:

SQL Server Analysis Services

  

Test12

The Analysis Services you are trying to reach (or the gateway to it) appears to be down or your access is denied; please contact the publisher of the server! If this is an unexpected error, please refer to the technical information below!

Technical details:

Activity Id: be9f376f-ef6c-a916-054b-525699aaf4ef

Request Id: f986e722-308a-55ec-6377-173d6f5d5c88

Date: 2015-01-07 14:03:47Z (UTC)

 

I ran Profiler against AS (which, along with extended events, is really the only way to get information from a Tabular instance)—and I got the following:

Figure 3 Error in SQL Server Profiler

I knew my password was correct, as the client-side tool makes a connection to AS and validates it. This initial configuration was running on my domain-joined laptop, with single sign-on (SSO) configured on the Office 365 side. I wasn’t responsible for the full setup of SSO and directory integration, so I decided to build out a test environment in my lab. This involved building a new Office 365 tenant, then configuring Active Directory Certificate Services (ADCS) and Active Directory Federation Services (ADFS) to perform directory sync with Office 365. That’s beyond the scope of this post, but for more detail see the white paper Stacia and I wrote on the topic. So I did all of that—and then I got the following:

 

Figure 4 The Point When Joey punched his desk

At this point I started digging around the Power BI help—it’s pretty good, but it’s not very search-engine savvy, so you’ll have to search within the site on your own. I came across a blog post from Greg Galloway (b) where I realized the problem. Remember, Analysis Services only uses domain authentication. And even though our directories were synced, I didn’t have single sign-on configured, so instead of coming in as username@corp.contoso.com, users were coming in as username@nashprecon.onmicrosoft.com.

Figure 5 Users Synced with AD

Greg proposes a solid workaround here—add an alternative UPN suffix and add those users to AS. This allowed me to create users in the corp.contoso.com domain whose UPNs match the nashprecon.onmicrosoft.com suffix that Power BI presents.

Figure 6 Alternative UPN Suffix Creation

So I created a user in that domain:

Figure 7 User Creation

And then, everything worked.

Figure 8 Connection to AS Tabular in Power BI

There are a couple of things I’d like to mention here (and I’ll pass them along to the product team):

  • There’s no way to edit, remove, or add the connection from the Power BI site—it’s only available through the connector. This is inconsistent with the behavior in legacy Power BI sites, and it makes it difficult to verify the properties of your connection (the client lets you change a few things, but not the server name), so there is effectively a 1:1 relationship between gateways and tabular instances.
  • The error messages could be way more helpful—I realize this is challenging in a hybrid world, but links to troubleshooting guidance would help.
  • After I federated AD with Office 365, I no longer had the option to log in to my Office 365 account in the initial setup process for the connector. As far as I know, this isn’t documented anywhere.

Good luck in your testing!


SQL Saturday Nashville Precon

On Friday, January 16th, I will be doing a precon for SQL Saturday Nashville with Stacia Misner (b|t) entitled “Finding Your Balance: BI in the Cloud, On Premises, or Both”. Stacia and I wrote a white paper for Microsoft last year on this topic, and we’ve continued to evolve the course material. This class is aimed at a variety of IT folks—I’d really recommend it for IT managers, as we’ll cover a good overview of most of the cloud services available from Microsoft that are applicable to business intelligence (BI). For BI practitioners, in addition to the overview, we will demo Power BI, including the all-new Power BI designer and dashboard features.

In addition to covering the BI side of data warehousing, we will also cover what you need to know from a network, Active Directory, and performance perspective to make cloud or hybrid data warehouse solutions work for you.

In this full-day workshop, you will learn:

  • How to use Azure components to build a BI infrastructure completely or partially in the cloud
  • What changes to your solution architecture are necessary when you move components to the cloud
  • Best practices for configuring and networking Azure resources
  • How to integrate your cloud-based solution with your on-premises Active Directory (both Power BI and Azure IaaS and PaaS)
  • How to securely connect Power BI to your Azure and on-premises data sources
  • How to configure supported data sources for scheduled refresh
  • Best practices for administering Power BI components in Office 365

I hope you can join us in Nashville.

PASS Summit 2014 Speaker Idol—A Judge’s Perspective

One of the most interesting events I got to take part in during last week’s PASS Summit was being on the panel of judges for Speaker Idol. Denny Cherry (b|w) borrowed an idea from TechEd and sold it to PASS—basically, speakers who had never presented at PASS Summit before would get the opportunity to give a five-minute talk on a topic of their choosing, and the top three speakers plus one wildcard would advance to the final round on Friday, with the winner receiving a guaranteed slot to speak at the 2015 PASS Summit. Going into this as a judge, I wasn’t quite sure what to expect, but we saw some really amazing stuff.

As IT consultants and professionals, public speaking is just a part of our full-time jobs, but it’s something to keep working on as part of overall career development. As a speaker, you will have good days and bad days (sometimes on the same day—this happened to me last Wednesday at Summit), but you learn to roll with the punches and recover when things go bad. The five-minute lightning talk format of Speaker Idol magnifies this—any mistakes tend to be glaring.

Keep Your Legs Together

As a regular speaker and a judge, I think you immediately tend to home in on the mistakes you find yourself making. In my case, almost every time I speak, within the first 30 or so seconds I find myself either moving just a little bit or rocking back and forth. When I took a class in public speaking during my MBA program, it was hammered into me to keep my feet together, which gives your body a more stable platform and prevents rocking. Movement in general isn’t bad—but you want to make movements that emphasize your points or engage the audience, not small movements that are distracting.

Jazz Hands

Being raised in an Italian family, I’m totally guilty of over-gesticulating when I speak. Your hands are powerful and a fantastic way of emphasizing a key point, but at the same time they need to be controlled. Another thing to note is that what feels like a very large hand movement to you on stage can look small to a large audience. Big hand movements emphasize your point—small ones are just distracting.

Why the Winners Won

Our top two speakers (and it was really close), Pieter Vanhove and Rob Volk, were both really amazing. Both of them took some pretty big risks by leaving the stage to engage the audience. This can be a very powerful move, or it can flop. They both nailed it. Additionally, they were both spot on in timing, which, while important in a regular talk, is critical in a lightning talk. Pieter in particular had extremely beautiful slides—while using some of the conference template, he didn’t let his slides be constrained by it, adding in some excellent images. Additionally, both talks managed to compress a lot of information into five minutes—I think most of the judges’ panel did not know that SSMS had the ability to do regular expression replacement. Pieter gave a great talk on the benefits of using SQL Server Central Management Server to manage a large environment (I think that’s a very underutilized feature).

 

Denny’s post is here.

Karen’s post is here.

Are There Data Egress Charges for Using Power BI Against AzureDB?

At PASS Summit last week (and if you weren’t there, you missed an amazing conference, as usual), Stacia Misner (b|t), with whom I co-authored the white paper “Using Power BI in a Hybrid Environment”, passed along a question she was asked in her session: if a user is using Microsoft Azure SQL Database as a data source in Power BI, does that data source incur data egress charges? A couple of background notes here—generally speaking, in most cloud computing scenarios data ingress (that is, loading your data) is free, whereas data egress (retrieving your data into, say, a reporting solution) costs money (pricing details are here; it’s not terribly expensive). Another generality: in most cloud scenarios, data transfer within the same data center does not count against your data egress charges.

Another thing to note is that Office 365/Power BI and Microsoft Azure are two completely different services—I tried to be as clear about this as I could in the white paper, and it does sometimes strike me as odd that two services, quite possibly living in the same data center, are not aware of each other, but that does again seem to be the case. So let’s walk through the scenario.

Power BI and Microsoft Azure SQL DB

Data refresh in Power BI directly supports only two data sources, SQL Server and Oracle (either on-premises or running in Azure). If you want to connect to Azure DB (or a vast array of other data sources), the method to use is to bring the data into your Excel data model using Power Query, and then pass the connection string into the data source settings within Power BI. (Note—for full details on this, read the above white paper.) So to test this, I built a very simple Azure DB using the Adventure Works (AW) 2012 database for Azure (currently, Azure DB is not fully feature-compatible with in-box SQL Server, so a special edition of AW is needed). I then generated a data model against two tables, just enough that data refresh would work normally.

Figure 1 Workbook in Power BI

In case you were wondering why I called the workbook “Bingo”: I was having some issues related to the build of Power Query I was on, which may have caused a little bit of frustration (or maybe that was just Drew Brees’ performance in Sunday’s Saints game). Bingo was the workbook that finally worked correctly. So I go to the data refresh screen and execute “refresh report now”:

 

Figure 2 Data Refresh Screen

From here, you can execute a refresh on demand. There is no concept of an incremental refresh—it seems all data is refreshed when a data refresh happens. So how do we see whether this incurs data egress?

Azure DB and DMVs

There is a DMV in Azure DB named sys.bandwidth_usage that tracks bandwidth usage for a given database by the hour; an entry is made for each hour in which there is data usage against a given DB. I haven’t figured out an easy way to show the space used by a given table in Azure DB, but from looking at my on-premises copy of AW, I can see there is about 12 MB of data in the tables I am using in this workbook.

So let’s check the DMV.
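
Here is the sort of query I ran (the database name is a placeholder, the column names are per the sys.bandwidth_usage documentation, and quantity is reported in KB):

    SELECT [time], database_name, direction, class, time_period, quantity
    FROM sys.bandwidth_usage
    WHERE database_name = 'AdventureWorks2012'
    ORDER BY [time] DESC;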

Figure 3 Query Results

As you can see in the results, refreshing my workbook seems to “cost” about 17 MB. From what I can tell, there is no differentiation between an Azure data source and an on-premises one (some of that data use, particularly against master, was from an on-premises SSMS session). I was hoping to provide a little more detail, but DMVs and netstat let me down—any help in the comments would be greatly appreciated.

Conclusion

In the grand scheme of things this isn’t a big deal (you get 5 GB of free egress per month), but it is something to think about when designing BI solutions in the cloud. A couple of things that would be great from a Power BI perspective: the ability to incrementally refresh data, and some sort of direct Azure connection that didn’t incur data charges.

 

PASS Summit Surprise—Career Management Session

In a last-minute replacement, Karen Lopez (twitter) and I are presenting a session today at 3 PM at the PASS Summit, in rooms 615-617. This is a really fun session we’ve given at a number of SQL Saturdays and other events, on managing your career.

 

You Wouldn’t Let HR Manage Your DBs…

…So don’t let them manage your career.

Do you know that you may have left tens of thousands of dollars on the table during your last negotiation? Do you know that you can ask for more than money when negotiating salary? Are you taking vacation just to be here at the PASS Summit?

In this session, we will share our experiences working in a range of organizations, from giant corporations with very formal HR departments to government agencies and small tech startups. You will learn how your HR organization works, what salary levels and midpoints are, negotiation strategies, when to say “no”, and how to say “yes”.

 

We’ll talk about negotiation, career paths, and making the right career choices for you—come join us at 3 PM.
