
Month: January 2019

SQL Server Container Instances via Cloud Shell

Every now and then (actually, quite often now that I think of it) I need to test code on a bunch of different versions of SQL Server. And I prefer those SQL Server instances to be fresh every time. The quickest way that works for me: fire up my Azure Cloud Shell and start a script. The script creates all containers asynchronously and takes about 15 minutes because it needs to pull the images.

First things first, make sure you have your Azure Cloud Shell set up.
Then make sure you have an empty resource group.
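
If you still need one, a resource group is quickly created from the shell; a minimal example (the location here is just an example, pick one near you):

# create an empty resource group for the containers
az group create --name my-containers-test-rg --location westeurope
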
My little solution consists of 3 files:

sqlcontainers.config, which holds the name of the resource group where the containers should land, the dns suffix that will be appended to my container names to make sure they are unique, and the sa password that they will have.

resource_group=my-containers-test-rg
dnssuffix=mygeneratedsuffix
sapwd="mygeneratedpassword"

Tip: Generate the suffix and the sa password with a password manager; don’t use known simple passwords and/or prefixes that have a meaning and would perhaps be easy to guess. These SQL instances will be available on the internet and accessible by anyone if you don’t choose a complex password. Also, don’t use them to store sensitive data, this should be used for testing purposes only!
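
If you’d rather generate them in the shell than in a password manager, openssl can help (just a suggestion):

# a random hex string is dns-safe, so it works well for the suffix...
openssl rand -hex 8
# ...and a longer base64 string for the sa password, which normally
# satisfies SQL Server's complexity rules
openssl rand -base64 24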

sqlcontainers.csv, which contains a comma-separated list of the publicly available container images with SQL Server that I’m using, the name each container should get (the suffix from the config file will be appended to it) and the operating system type (windows or linux).

dbafromthecold/sqlserver2012dev:sp4,sql2012,windows
dbafromthecold/sqlserver2014dev:sp2,sql2014,windows
dbafromthecold/sqlserver2016dev:sp2,sql2016,windows
mcr.microsoft.com/mssql/server:2017-latest-ubuntu,sql2017,linux
mcr.microsoft.com/mssql/server:vNext-CTP2.0-ubuntu,sql2019,linux

Microsoft provides us with the (Linux) containers for SQL Server 2017 and 2019. The containers for the older versions were created by Andrew Pruski (b/t). Andrew blogs a lot about containers and SQL Server, do take a look if you want to know more about it!

sqlcontainers.sh, the bash script that loads the config parameters and then loops through the items in the csv file to create the containers.

#!/bin/bash

# load resource_group, dnssuffix and sapwd from the config file
# that sits next to this script
. "${BASH_SOURCE%/*}/sqlcontainers.config"

# every csv line holds: image, container name, os type (windows or linux)
while IFS=, read -r image name ostype
do
  dnsname="${name}-${dnssuffix}"
  echo "Creating container from: ${image}"
  # start each creation in the background and discard the output,
  # so all containers are created asynchronously
  az container create --resource-group "${resource_group}" --image "${image}" \
    --name "${name}" --ports 1433 --dns-name-label "${dnsname}" \
    --os-type "${ostype}" \
    --environment-variables ACCEPT_EULA=Y SA_PASSWORD="${sapwd}" &>/dev/null &
done < "${BASH_SOURCE%/*}/sqlcontainers.csv"

For us SQL Server people, writing a bash script is not something we do every day, so I’m not ashamed to admit that I googled quite a bit before I figured out all the moving parts 😉 It basically loads the config parameters, then the while loop runs through the items in the csv file and creates the containers (the “&>/dev/null &” part makes sure each creation happens in the background, asynchronously).

Just create a new directory in the Cloud Shell and upload your files. (If you created them in a Windows editor, make sure they have Linux line endings and not the Windows CR/LF; the Cloud Shell does not like that and will give a cryptic error message. The snippet below strips them if needed.)
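
A quick way to strip those Windows line endings from within the shell itself (this rewrites the files in place):

# remove the trailing carriage return from every line
sed -i 's/\r$//' sqlcontainers.config sqlcontainers.csv sqlcontainers.sh
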
Then all you need to do is start the script and wait around 15 minutes for the containers to be downloaded and installed.

A quick way to start a bunch of SQL Server versions via cloud shell using Azure Container Instances

After that, a good idea would be to create registered servers so you can use them easily from Management Studio.

Don’t worry about the name suffix that you can see, I’m not using that anymore after posting this image 😉
Creating a “New Query” window from the group makes it possible to query all of them in one go.
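
By the way, Management Studio is not required. Azure Container Instances publishes each container at <dns-name-label>.<region>.azurecontainer.io, so a quick smoke test with sqlcmd looks something like this (the suffix and password come from the config file; the region is wherever your resource group lives, westeurope here is just an example):

sqlcmd -S sql2017-mygeneratedsuffix.westeurope.azurecontainer.io -U sa -P "mygeneratedpassword" -Q "SELECT @@VERSION"
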

Azure Container Instances are more expensive than running a VM with the same resources, so make sure you delete the containers when you’re done!
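
Deleting them can be scripted as well; a minimal sketch that removes every container in the resource group (using the resource group name from the config file):

# list all container names in the resource group and delete them one by one
for name in $(az container list --resource-group my-containers-test-rg --query "[].name" -o tsv)
do
  az container delete --resource-group my-containers-test-rg --name "${name}" --yes
done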


Invoke-DbaDiagnosticQuery


Gary Bargsley (b|t) is the host of this month’s
T-SQL Tuesday #110, “Automate All the Things”.
The question he asks is easy:

What do you want to automate or what automation are you proud of completing?

My answer to that is simple: I’m a major contributor to the awesome PowerShell library dbatools. What I’ve contributed to that library are commands that help automate the running and processing of queries from the DMV library of Glenn Berry.
At some point in the life of a DBA we’ve all come across his scripts. For the longest time I would advise people to google “Glenn Berry DMV”; it will be the top result.
The scripts, however, come in a single file per SQL Server version and you can’t run them all in one go. You would have to select a script, run it, and paste the result from Management Studio into an Excel sheet. Glenn provides an empty sheet with tabs ready to paste the various result sets into. I’ve automated this part, hope you like it!

Let’s get started with something simple and run a single script.
First install dbatools though, and to make things more interesting later on, install a module called ImportExcel while you’re at it.
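Both install straight from the PowerShell Gallery (the CurrentUser scope avoids needing admin rights):

Install-Module dbatools -Scope CurrentUser
Install-Module ImportExcel -Scope CurrentUser
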
All you need to do after that is open a PowerShell window and type something like the following:

Invoke-DbaDiagnosticQuery -SqlInstance localhost -QueryName "Configuration Values" | Select -ExpandProperty result | Format-Table -AutoSize

Want to know which scripts are available?

Invoke-DbaDiagnosticQuery -SqlInstance localhost -UseSelectionHelper
Just select one or more scripts and hit “OK” to run them.

So, where is this Excel sheet you promised us?

Invoke-DbaDiagnosticQuery -SqlInstance localhost | Export-DbaDiagnosticQuery -ConvertTo Excel -Path c:\demo
Besides the instance scripts, there are also scripts that run for every database you have!
One Excel sheet for the instance and one for every database.
(My laptop is called XPS; that name shows up even though I specified “localhost” in the query.)
Presto! How cool is this!

The directory will be filled with a lot of other files besides the Excel sheets. Some of Glenn’s scripts have columns which contain the query or even the query plan. That doesn’t play nice with an Excel sheet so they will be saved in separate files.

Good to know:

  • There are options available to exclude the .sql and .sqlplan files from the export
  • You can include specific scripts, but also exclude specific scripts
  • Want a more portable format? Pipe the result to Export-CliXml. That compresses nicely if you 7-zip it, and it can later be used as input for Export-DbaDiagnosticQuery (see the sketch after this list).
  • The latest version of Glenn Berry’s DMV scripts is always included in dbatools, you don’t have to download them separately. (But you can certainly do that if you want to, for whatever reason.)
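
A minimal sketch of that roundtrip (the file name is just an example):

# Save the raw results to a portable file...
Invoke-DbaDiagnosticQuery -SqlInstance localhost | Export-Clixml -Path c:\demo\diag.xml
# ...and later turn that file into the Excel sheets, possibly on another machine.
Import-Clixml -Path c:\demo\diag.xml | Export-DbaDiagnosticQuery -ConvertTo Excel -Path c:\demo
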
Haven’t seen dbatools yet? You’re in for a treat! Check out not just my little contribution but literally hundreds of other amazing features!

It’s been 30 years!

January 1st, 1989 was my first day at work for my first job in IT.

You would think that everything was completely different 30 years ago. The advances in sheer computing power and the evolution of languages and frameworks had me thinking for a bit that this was the case. But actually, the difference is not that big!

A “Point 4 – Mark 12”; these don’t exist anymore.
Those keys are not for opening the front: they could literally be used to start and stop the thing.

My work consisted of operating one of these things in the picture above. They were called “minicomputers”, but at the size of a fridge they were not really what we consider “mini” these days.

What was different in those days?

So, this really happened. Debugging in the early 90’s:
A client calls because he has an error on his screen: “I have a stop at 5654.”
I would walk to a closet and get that customer’s printed source code, so I could look at the line number he just gave me and see what that line of code does!
Turns out he was printing invoices, and that specific line of code would get a customer’s information from the system. Unfortunately, he had deleted that customer that morning, while there were still sold goods in the system that had not been invoiced yet.
The client would now have to go to a different terminal to re-enter the customer into the system with the same number. After that he would change a few lines of code in such a way that processing the invoices would restart, but only actually start printing at the exact line where it broke (and the client would have to move the printer back exactly 3 clicks so printing would start at the right place).

A line printer: a box of paper would go in the front and invoices would come out the back. We had three of these, filling the room with a constant rattling noise. Think “dot matrix”, but a whole line in one go.

The client would run the program (without saving the code, of course) and then we would wait until we heard the printer. I asked the client if he could check whether it started printing at the right place with the right data, but he could no longer hear me, having already dropped the phone to run to the printer and see if all was OK.

If you were at all impressed with me using printed source code, never mind that, check out this amazing woman. Margaret Hamilton is standing next to the code of NASA’s Apollo project that she and her colleagues created, a whopping 20 years before my first day at work. We have her to thank for naming our work “software engineering”.

What was similar to today?

The software was written in something called Business Basic. This was old-school stuff with line numbers. Every program that could print something would have a “Printer Choice” menu. The code for this was never exactly the same and always lived in a different part of the software. At some point that printer choice had to be changed (because we got a new printer!). I decided to create a single set of code to be used as the standard printer choice code for all programs. I picked a set of line numbers that was not in use in any program and, after thoroughly testing it, loaded it into everything.
Today re-using code by creating modules, functions or whatever your language of choice supports is a common thing.

and

A specific program was slow and would be in the way of daily work; we couldn’t start it earlier because the data would not have arrived yet. I discovered that finding the data took too long. So I created an extra dataset containing only the account number we needed and the data we used to look that account number up. Searching through this much smaller dataset was way quicker than using the main dataset. After that we still had to go to the main dataset to get the rest of the information, but now we already had the key, so this could be a direct lookup.
Database people will immediately recognize the “non-clustered index” I made to get rid of the “table scan”. (I’m putting the terms in quotes because what we were using could hardly be called a database.)

Am I trying to say that I was born ready as a database tuner? Far from it! I’m only saying that there’s no magic involved in what we all do. Things haven’t changed much, not really. Common sense and experience are good skills to have!

The most important thing that hasn’t changed at all, though, is that people keep telling me that in a few years our work will be fully automated and we will be out of a job. I have never been busier! As long as we keep learning and keep investing in our (soft) skills, we will have no problem staying ahead and staying relevant and busy!
