
Dutch Geek Posts

SQL Server Container Instances via Cloudshell

Every now and then (actually, quite often now I think of it) I need to test code on a bunch of different versions of SQL Server. And I prefer those SQL Server instances to be fresh every time. The quickest way that works for me: fire up my Azure Cloud Shell and start a script. The script creates all the containers asynchronously and takes about 15 minutes, because the images need to be pulled.

First things first, make sure you have your Azure Cloud Shell set up.
Then make sure you have an empty resource group.
My little solution consists of 3 files:

sqlcontainers.config, which holds the name of the resource group where the containers should land, the DNS suffix that my container names will get to make sure they are unique, and the sa password that they will have.

resource_group=my-containers-test-rg
dnssuffix=mygeneratedsuffix
sapwd="mygeneratedpassword"

Tip: Generate the suffix and the sa password with a password manager; don’t use known simple passwords and/or prefixes that have a meaning and would perhaps be easy to guess. These SQL instances will be available on the internet and accessible by anyone if you don’t choose a complex password. Also, don’t use them to store sensitive data, this should be used for testing purposes only!
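
If you don’t have a password manager at hand, the Cloud Shell itself can generate something random for you; this is just one of many ways:

openssl rand -base64 24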

sqlcontainers.csv, which contains a comma-separated list of the publicly available container images with SQL Server that I’m using, the name each container should have (which will also get the suffix that’s in the config file) and the operating system type (windows or linux).

dbafromthecold/sqlserver2012dev:sp4,sql2012,windows
dbafromthecold/sqlserver2014dev:sp2,sql2014,windows
dbafromthecold/sqlserver2016dev:sp2,sql2016,windows
mcr.microsoft.com/mssql/server:2017-latest-ubuntu,sql2017,linux
mcr.microsoft.com/mssql/server:vNext-CTP2.0-ubuntu,sql2019,linux

Microsoft provides us with the (linux) containers for sql2017 and 2019. The older version containers have been created by Andrew Pruski (b/t). Andrew blogs a lot about containers and SQL Server, do take a look if you want to know more about it!

sqlcontainers.sh, the bash script that loads the config parameters and then loops through the entries in the csv file to create the containers.

#!/bin/bash

# load the config (resource group, dns suffix, sa password)
. "${BASH_SOURCE%/*}/sqlcontainers.config"

# create one container per csv line, each in the background
while IFS=, read -r image name ostype
do
  dnsname="${name}-${dnssuffix}"
  echo "Creating container from: ${image}"
  az container create --resource-group "${resource_group}" \
    --image "${image}" --name "${name}" --ports 1433 \
    --dns-name-label "${dnsname}" --os-type "${ostype}" \
    --environment-variables ACCEPT_EULA=Y SA_PASSWORD="${sapwd}" \
    &>/dev/null &
done < "${BASH_SOURCE%/*}/sqlcontainers.csv"

For us SQL Server people, writing a bash script is not something we do every day, so I’m not ashamed to admit that I googled quite a bit before I figured out all the moving parts 😉 It basically loads the config parameters, then the while loop runs through the entries in the csv file and creates the containers. (the “&>/dev/null &” part makes sure each create happens in the background, so they all run asynchronously)

Just create a new directory in the cloud shell and upload your files. (If you created them in a Windows editor, make sure they have Linux line endings and not the Windows CR/LF; the cloud shell does not like that and will give a cryptic error message.)
Then all you need to do is start the script and wait around 15 minutes for the containers to be downloaded and installed.
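
To keep an eye on the progress, you can list the containers and their status; something like this works (using the resource group name from the config example above):

az container list --resource-group my-containers-test-rg --output table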

A quick way to start a bunch of SQL Server versions via cloud shell using Azure Container Instances

After that, a good idea would be to create registered servers so you can use them easily from Management Studio.

Don’t worry about the name suffix that you can see, I’m not using that anymore after posting this image 😉
Creating a “New Query” window from the group makes it possible to query all of them in one go.

Azure Container Instances are more expensive than running a VM with the same resources, so make sure you delete the containers when you’re done!
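
Since the resource group holds nothing but these containers, the quickest cleanup I know of is deleting the whole group (again assuming the resource group name from the config example):

az group delete --name my-containers-test-rg --yes --no-wait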


Invoke-DbaDiagnosticQuery


Gary Bargsley (b|t) is the host of this month’s
T-SQL Tuesday #110 – “Automate All the Things”
The question he asks is easy:

What do you want to automate or what automation are you proud of completing?

My answer to that is simple: I’m a major contributor to the awesome Powershell library dbatools. What I’ve contributed to that library are commands that help automate the running and processing of queries from the DMV library of Glenn Berry.
At some point in the life of a DBA we’ve all come across his scripts. For the longest time I would advise people to google “Glenn Berry DMV”, and it would be the top result.
The scripts, however, come in a single file per SQL Server version and you can’t run them all in one go. You would have to select a script, run it, and paste the result from Management Studio into an Excel sheet. Glenn provides an empty sheet with tabs ready to paste the various result sets in. I’ve automated this part, hope you like it!

Let’s get started with something simple and run a single script.
First install dbatools though, and to make things more interesting for later on, install a module called ImportExcel while you’re at it.
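
If you haven’t got them yet, something like this should do the trick (assuming a recent Powershell with PowerShellGet available):

Install-Module -Name dbatools, ImportExcel -Scope CurrentUser
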
All you need to do after that is open a powershell window and type something like the following:

Invoke-DbaDiagnosticQuery -SqlInstance localhost -QueryName "Configuration Values" | Select -ExpandProperty result | Format-Table -AutoSize

Want to know which scripts are available?

Invoke-DbaDiagnosticQuery -SqlInstance localhost -UseSelectionHelper
Just select one or more scripts and hit “ok” to run them

So, where is this Excel sheet you promised us?

Invoke-DbaDiagnosticQuery -SqlInstance localhost | Export-DbaDiagnosticQuery -ConvertTo Excel -Path c:\demo
Besides the instance scripts, there are also scripts that run for every database you have!
One Excel sheet for the instance and one for every database.
(My laptop is called XPS; the sheets show that name even though I specified “localhost” in the query)
Presto! How cool is this!

The directory will be filled with a lot of other files besides the Excel sheets. Some of Glenn’s scripts have columns which contain the query or even the query plan. That doesn’t play nice with an Excel sheet so they will be saved in separate files.

Good to know:

  • There are options available to exclude the .sql and .sqlplan files from the export
  • You can include specific scripts, but also exclude specific scripts
  • Want a more portable format? Pipe the result to Export-CliXml. That compresses nicely if you 7zip it and can be used as input for Export-DbaDiagnosticQuery later on. (see the sketch after this list)
  • The latest version of Glenn Berry’s DMV scripts are always included in dbatools, you don’t have to download them separately. (but you can certainly do that if you want to for whatever reason)
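
A quick sketch of that round trip (the file name is just an example):

Invoke-DbaDiagnosticQuery -SqlInstance localhost | Export-Clixml -Path c:\demo\diag.xml
Import-Clixml -Path c:\demo\diag.xml | Export-DbaDiagnosticQuery -ConvertTo Excel -Path c:\demo
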
Haven’t seen dbatools yet? You’re in for a treat! Check out not just my little contribution but the literally hundreds of other amazing features!

It’s been 30 years!

January 1st, 1989 was my first day at work for my first job in IT.

You would think that everything was completely different 30 years ago. The advance in sheer computing power and the evolution of languages and frameworks had me thinking for a bit that this was the case. But actually, the difference is not that big!

A “Point 4 – Mark 12”, this doesn’t exist anymore.
Those keys are not for opening the front; they could literally be used to start and stop the thing.

My work consisted of operating one of these things in the picture above. They were called “mini computers”, sized like a fridge, so not really what we consider “mini” these days.

What was different in those days?

So, this really happened. Debugging in the early 90’s:
A client calls because he has an error on his screen: “I have a stop at 5654”
I would walk to a closet, get that customer’s printed source code and look at the line number he just gave me to see what that line of code does!
Turns out he was printing invoices, and that specific line of code would get a customer’s information from the system. Unfortunately he had deleted that customer that morning, while there were still sold goods in the system that had not been invoiced yet.
The client would now have to go to a different terminal to re-enter the customer into the system with the same number. After that he would change a few lines of code in such a way that processing the invoices would restart, but only actually start printing at the exact line where it broke. (and the client would have to move the printer back exactly three clicks so printing would start at the right place)

A line printer; a box of paper would go in the front and invoices would come out the back. We had 3 of these, filling the room with a constant rattling noise. Think “dot matrix”, but a whole line in one go.

The client would run the program (without saving the code of course) and then we would wait until we heard the printer. I asked the client to check whether it started printing at the right place with the right data, but he couldn’t hear me anymore, having already dropped the phone to quickly run to the printer to see if all was ok.

If you were at all impressed with me using printed source code, never mind that; check out this amazing woman. Margaret Hamilton is standing next to the printed code of NASA’s Apollo project, which she and her colleagues created a whopping 20 years before my first day at work. We have her to thank for naming our work “Software Engineering”.

What was similar to today?

The software was written in something called Business Basic. This was old-school stuff with line numbers. Every program that could print something would have a “Printer Choice” menu. The code for this was never exactly the same and always in a different part of the software. At some point that printer choice had to be changed (because we got a new printer!). I decided to create a single set of code to be used as the standard printer choice code for all programs. I picked a set of line numbers that was not in use in any program and, after thoroughly testing it, loaded it into everything.
Today re-using code by creating modules, functions or whatever your language of choice supports is a common thing.

and

A specific program was slow and would be in the way of daily work, we couldn’t start it earlier because the data would not have arrived yet. I discovered that finding data took too long. So I created an extra dataset containing only the account number we needed to have and the data we used to look for the account number. Searching through this much smaller dataset was way quicker than using the main dataset. After that we still had to go to the main dataset to get the rest of the information, but now we had the key already, so this could be a direct lookup.
Database people will immediately recognize the “non-clustered index” I made to get rid of the “table scan”. (I’m putting the terms in quotes because you couldn’t call it a database what we were using)

Am I trying to say that I was born ready as a database tuner? Far from it! I’m only saying that there’s no magic involved in what we all do. Things haven’t changed much, not really. Common sense and experience are good skills to have!

The most important thing that hasn’t changed at all, though, is that people keep telling me that in a few years our work will be fully automated and we will be out of a job. I have never been busier! As long as we keep learning and keep investing in our (soft) skills, we will have no problem staying ahead and staying relevant and busy!


SQL Saturday Holland 2016

I’m a big fan of SQL Saturdays. I frequently speak at them and I’m part of the team behind SQL Saturday Holland.
This year (2016), the Data Discovery team from InfoSupport made a great video about our event.
I’m so proud to be a part of our event, can’t wait until we start planning the 2017 edition!


Managing Ola Hallengrens scripts with Powershell

I presented for the PASS Powershell Virtual Chapter on June 15th, 2016.
It was an online session and it was recorded; the youtube video of it is here.

I figured out what was wrong with the two little demo fails.
I was pressing F8 to only execute my selection, but because of a CLS I didn’t see the error message.
The function that I use to load a script from a relative path doesn’t work when you press F8, only when you run the whole script with F5.
Apparently F8 gives you a different execution context.
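
I don’t remember the exact mechanism my helper used, but assuming it relied on something like $PSScriptRoot, the behaviour is easy to reproduce (the dot-sourced file name is hypothetical):

# $PSScriptRoot is only filled in when the saved script file itself runs (F5).
# Run just this line as a selection with F8 and the variable is empty,
# so the relative dot-source fails.
. (Join-Path $PSScriptRoot 'demo-helpers.ps1')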

And the reason why the Excel demo didn’t produce 2 paths is that I specified the -Show option in the first one; the second export failed because the Excel file was still in use by Excel itself.

The slides and scripts can be downloaded here


A fun way to learn about Database Corruption

For 10 weeks in a row now I’ve been participating in Steve Stedman’s “Database Corruption Challenge”.

I’ve learned a ton of stuff and I was able to add some new scripts to my toolbelt because I needed them to solve the challenges.

In the coming months I will be blogging about some of the interesting things I learned. I can’t do that just yet, because the final week is still running. I look forward to meeting Steve in Seattle, where I’ll hopefully get a t-shirt for ending in the top 10.
(Which depends on my final submission, which is underway as I write this.)

Congratulations to my fellow participants; there are just a few who have been able to solve all of them.
And of course congrats to Rob Farley for winning!

Interested in trying this out yourself? The weekly challenge may be over but they are all still available for you to try out, here is the link: http://stevestedman.com/server-health/database-corruption-challenge/


Building a Monitoring System with Powershell Remoting

For people who have seen my presentation and are interested in playing with the demo code:
please find the PDF of the presentation and the demo scripts here

Pre-Requisites:

  • Create a database called BlitzOutput and run sp_blitz, making sure it saves the result in a table called spBlitz, like this:

exec sp_blitz
    @OutputDatabaseName = 'BlitzOutput',
    @OutputSchemaName = 'dbo',
    @OutputTableName = 'spBlitz'

You can get sp_blitz and lots of other cool stuff from http://brentozar.com/blitz/

  • Unpack the 7z file to c:\remoting
  • Create the DBA database with tables, demo entries and stored procedures with the db_database.sql script
  • Run Enable-PSRemoting in a Powershell window under admin rights to be able to remote to the localhost (a quick self-test follows below)
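
To verify that remoting works before diving into the demos, a minimal self-test could look like this (run from an elevated Powershell window; the script block is just an example):

Enable-PSRemoting -Force
Invoke-Command -ComputerName localhost -ScriptBlock { $env:COMPUTERNAME }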

Now you should be able to go through the 4 demo scripts that I showed, which are in the c:\remoting directory.

Let me know if you have any questions; I welcome any feedback…
My contact details are in the PDF of the presentation, which is in the zip file.


SQL Server 2014 PreCon at SQLSaturday Exeter 2014

With just a couple of weeks to go, I wanted to emphasise something that is not completely clear from the title and abstract.
It says that the session is for DBAs, but that title is not complicated enough; it should read something like:

“What’s new in SQL Server 2014 for (Involuntary) DBAs, (DevOps type) Developers, Large Site DBAs and ‘Every Other Size Site, for instance Standard Edition only with just a few servers’ DBAs, basically everyone except for BI people who want to see lots of BI content”

But I assume that this title would have been deemed too long…
So, why cast such a wide net?

Well, that’s because I love discussions between the various types of people in the roles I just mentioned, and I respectfully think that my upcoming pre-con is perfect for that. I think DBAs should be interested in Dev features and vice versa, and while it’s good to know what the power of Enterprise Edition brings, with the price difference these days I see more and more sites trying to get the most out of Standard Edition. During the day I will show what’s advertised, namely what is new in SQL Server 2014, but I will make sure that the material also teaches developers about the use of certain DBA features, and so on.

What will we be talking about ?

Always On – Failover Clustering & Availability Groups
What’s the difference? Can you combine them? Are AGs the ultimate scale-out mechanism for developers?
I will show you how it works and we’ll talk about the good stuff and the interesting challenges. I’m also going to explain a bit about what a quorum is and what has changed in this area, because I see it causing a lot of confusion sometimes.

In-Memory OLTP, aka Hekaton.
If you’re thinking that SQL Server has had a caching mechanism forever and your hot pages should be in memory already, then you are right. This feature is a lot (a whole lot) more than that and not to be mistaken for a re-incarnation of DBCC PINTABLE.

Updatable Clustered Columnstore Index
If you have studied how the compression mechanism behind columnstore works, you are probably wondering how it can suddenly be updatable.

Buffer Pool Extensions
Have a part of your buffer pool on fast disks. 
We will take a look at how this works and we will talk about why you would use this.
(You’re probably thinking “Why would you not just buy more memory?”)

Resource Governor for IO.
With RG you can control CPU, memory and now also IO. I’ll demo how this works, and we’ll talk about the various scenarios where this would be useful. We’ll also discuss where you could be shooting yourself in the foot.

Improved Cardinality Estimator.
This will make better query plans for you. I have a few samples and will explain why poor cardinality estimations could result in a poor query plan. Don’t expect me to explain how the engine was improved under the covers.

Separation of Duties enhancements.
I’ve heard this so many times: “can we prevent the DBAs from seeing the data?” BOL seems to suggest that this should finally be possible. We will talk about whether this is actually the case. (the short answer is no!) It is a step in the right direction though, and combining it with some other settings could give you what you are looking for.

Sysprep Enhancements.
I have yet to see a site where Sysprep was actively used to roll out servers. These enhancements might just do the trick though. 
Lazy developers should also pay attention to this one 🙂

Windows Azure related features.
We’re going to have a lot of laughs with these ones.
Would you consider running your instance on your own in-house server while placing the data and log files in Azure? Me neither, but why does this feature exist? We’ll look into it and talk about use cases where it does make sense.
There’s more: Backup to Azure and even “fully automagically backing up whenever needed to Azure” are also interesting features, to say the least.

I want to get to my personal favourites without dragging this blog post on for too long, so here’s what else I’ve got something to say about:

  • sys.dm_exec_query_profiles
  • parallel select into
  • Delayed Durability
  • Backup Encryption
  • Rebuilding a single partition online
  • Managing the lock priority of online operations
  • Incremental Statistics

Last but not least: Windows Server 2012 R2!
Huh, I thought this was a day about SQL Server?
Well, as it turns out, some of my potential favourite features of SQL Server are actually Windows Server 2012 R2 features. If you don’t have Enterprise Edition and/or an expensive SAN, you may be somewhat jealous of all the cool stuff you can do regarding HA & DR, for instance. Let me show you how Storage Spaces in combination with Cluster Shared Volumes and Storage Tiering lets you store any SQL Server version on highly available network shares (!) and how it automatically places the most used portions of your data files on the fastest storage (!!!)
Needless to say, I leave it up to you if you really want to be the first one to use this in production, but that’s not what this day is about…
If we have time we will look at some other R2 features that can be useful in combination with SQL Server.

And of course, we’ll end the day just as last year, if the hotel will let me, with a Q&A where a round of pints is on me! 😉
Oh, and did I mention that you can win an exam voucher and other cool stuff? Sign up here, do it now.
