Search Results

When to use CmdletBinding in PowerShell?

Clean Code. At Sabin.IO we are big proponents of clean code. We use PowerShell a lot for automation, and we want our code to be clean. You are automating everything, right? If not, please see a slide from a recent meetup. For me, clean code in PowerShell means (though is not limited to): small, self-contained functions that have a single responsibility; the number of arguments to a function kept as small as possible; consistent formatting; no duplication of code; and modules that hide internal functions and only expose what’s needed. Common Parameters. One way to make code a bit cleaner is to make use
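
For illustration, a minimal sketch of a function that opts in to the common parameters via CmdletBinding (the function name and parameter are placeholders, not from the post):

    function Get-ServerInfo {
        [CmdletBinding()]
        param (
            [Parameter(Mandatory)]
            [string] $ServerName
        )
        # Because of [CmdletBinding()], the common parameters (-Verbose,
        # -ErrorAction, and so on; -WhatIf too if you add SupportsShouldProcess)
        # come for free.
        Write-Verbose "Querying $ServerName"
    }

Calling it as Get-ServerInfo -ServerName 'sql01' -Verbose then surfaces the Write-Verbose output without any extra plumbing.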

SqlServer PowerShell Modules NuGet Package Now Available

Hello! Back in the July Update of SSMS 2016, a bunch of new SQL PowerShell functions were added, plus two neat additions to Invoke-Sqlcmd: -OutputAs, which allows you to output the result set as a data object (e.g. data row, data table, etc.), and -ConnectionString, which allows you to pass in a connection string instead of using the pre-defined parameters. All very useful stuff, go and have a read. However, this update has two issues: firstly, it's not updating the classic sqlps module, but rather has created a new module: sqlserver. This new module will be regularly
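
A quick sketch of the two new parameters in use (server, database, and query are placeholders):

    # -OutputAs returns a data object instead of formatted text.
    $tables = Invoke-Sqlcmd -ServerInstance 'localhost' `
                            -Query 'SELECT name FROM sys.databases' `
                            -OutputAs DataTables

    # -ConnectionString replaces the individual connection parameters.
    $rows = Invoke-Sqlcmd -ConnectionString 'Server=localhost;Database=master;Integrated Security=True' `
                          -Query 'SELECT 1 AS x'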

Create an Azure Active Directory Application and Key using PowerShell

I’ve been a SQL developer for a good few years now, and have also developed numerous web applications, web services and various console apps. However, lately I find myself getting into the world of DevOps, Azure, and necessarily, PowerShell. Whilst familiar with PowerShell to a degree, I’ve learnt a lot over the past few weeks about the Azure PowerShell module, and how we can use it to script tasks that you might not want to do manually in the Azure portal if you’re thinking about automation. This post should help if you want to create an Azure Active Directory application
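
The broad shape of it with the AzureRM cmdlets looks something like this (display name, URIs, and the secret are placeholders, and the credential parameter sets varied between AzureRM versions, so treat this as a sketch):

    # Log in and create the AAD application.
    Login-AzureRmAccount
    $app = New-AzureRmADApplication -DisplayName 'MyAutomationApp' `
               -HomePage 'https://myapp.example.com' `
               -IdentifierUris 'https://myapp.example.com'

    # Create a service principal for the application, then add a key (credential).
    New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
    $secret = ConvertTo-SecureString 'use-a-generated-secret' -AsPlainText -Force
    New-AzureRmADAppCredential -ApplicationId $app.ApplicationId -Password $secret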

Azure Powershell 4.0 may break your scripts

Ensuring backwards compatibility is something that one has to consider very carefully when doing continuous delivery. We are all to well aware of the challenges of this with database systems as, generally, the database lives much longer than the apps that interact with it and thus one has to maintain the data. SQL has far too many “legacy features” that can’t be changed due to potential breaking changes. Thankfully the SQL team now have a more robust way of managing change and that’s through the compatibility level for the database. This allows you to upgrade to the latest runtime but

Running a SQL Server workload using PowerShell

In February 2018, Paul Anderton and I gave a presentation on how to correlate database deployments with performance issues within the context of a DevOps pipeline. We used Sentry One as our monitoring tool in a Performance Test environment so that we could catch badly performing deployments before they got to production and caused havoc. If you would like to see the recorded video, then you can download it from here: http://info.sentryone.com/partner-webinar-performance-problems As part of this presentation we had a workload running on a workstation, which executed a couple of stored procedures repeatedly, and we’ve had some requests from people
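
The gist of such a workload runner, sketched with placeholder procedure and server names (not the actual scripts from the talk):

    # Repeatedly execute a couple of stored procedures to generate load.
    $conn = 'Server=PerfTestSql01;Database=WorkloadDb;Integrated Security=True'
    while ($true) {
        Invoke-Sqlcmd -ConnectionString $conn -Query 'EXEC dbo.usp_ReadOrders'
        Invoke-Sqlcmd -ConnectionString $conn -Query 'EXEC dbo.usp_WriteOrder'
        Start-Sleep -Milliseconds 200   # pacing between iterations
    }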

Uploading Files To Data Lake Storage With PowerShell Part Three

Picking up from where we left off last month, today we’re looking at setting the Azure Data Lake Storage account. This post is part of a series on automating the process of uploading files to Azure Data Lake Store. Although the entire script is available on Git (posted below), I’m going to go into one function per post so that I can go into greater depth. Part One of this blog series focused on logging in to an Azure Subscription. Part Two focused on setting the Resource Group. As mentioned, today’s function starts on row 74 and is
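
The idempotent check-then-create pattern for the account step presumably mirrors the Resource Group one; a hedged sketch with the AzureRM Data Lake cmdlets (names and location are placeholders, the author's actual function is in the linked script):

    # Create the Data Lake Store account only if it doesn't already exist.
    if (-not (Test-AzureRmDataLakeStoreAccount -Name 'mydatalake')) {
        New-AzureRmDataLakeStoreAccount -Name 'mydatalake' `
            -ResourceGroupName 'myResourceGroup' -Location 'North Europe'
    }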

Uploading Files To Data Lake Storage With PowerShell Part Two

Carrying on from our previous post on automating the process of uploading files to Azure Data Lake Store, we will check whether a Resource Group exists and, if it does not, create it. Although the entire script is available on Git (posted below), I’m going to go into one function per post so that I can go into greater depth. Part One of this blog series focused on logging in to an Azure Subscription. Today’s function starts on line 42 and is called Set-AzureResourceGroup. Before we go into it though, I want to take a moment
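
A minimal sketch of what a check-then-create function like Set-AzureResourceGroup might look like (the real function is in the linked script; this version is illustrative):

    function Set-AzureResourceGroup {
        param (
            [string] $ResourceGroupName,
            [string] $Location
        )
        # Get-AzureRmResourceGroup errors if the group is missing, so silence
        # that and treat $null as "does not exist".
        $rg = Get-AzureRmResourceGroup -Name $ResourceGroupName -ErrorAction SilentlyContinue
        if (-not $rg) {
            New-AzureRmResourceGroup -Name $ResourceGroupName -Location $Location
        }
    }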

Uploading Files To Data Lake Store With PowerShell Part One

Hello! I’ve recently been working on uploading files to Azure Data Lake Store. It’s quite straightforward and, I think, a decent introduction to automating a deployment with Azure, as well as a good example of writing scripts that are idempotent, so I’m going to go through them from beginning to end. I’m going to go into one function per day, so this will take 5 days to cover. But I’m hoping that by focusing a bit more in-depth, as opposed to trying to cram it all into one post, it will be more informative, and both yourselves and
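
For context, the login step with the AzureRM module boils down to something like this (subscription name is a placeholder; the series covers a more robust, idempotent version):

    # Interactive login, then pin the session to the right subscription.
    Login-AzureRmAccount
    Set-AzureRmContext -SubscriptionName 'My Subscription'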

PowerShell Workflow Script To Stop VMs In A Resource Group

Recently I needed to make sure that all the VMs in a given resource group were stopped, so I looked around the Runbooks available to download from the Azure Marketplace. Some of these were ridiculously complex: one was over 500 lines long! Just to stop a VM! Naturally there is some setup needed: we need to get the names of the VMs in the Resource Group and, if they are running, stop them. However the command to stop a VM is straightforward: “Stop-AzureRmVM”, followed by the name of the VM and the resource group. Quite frankly I’m
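
A short sketch of that logic with the AzureRM cmdlets (the resource group name is a placeholder):

    $rg = 'MyResourceGroup'
    foreach ($vm in Get-AzureRmVM -ResourceGroupName $rg) {
        # -Status exposes the PowerState, so we only stop running VMs.
        $state = (Get-AzureRmVM -ResourceGroupName $rg -Name $vm.Name -Status).Statuses |
                 Where-Object { $_.Code -like 'PowerState/*' }
        if ($state.Code -eq 'PowerState/running') {
            Stop-AzureRmVM -ResourceGroupName $rg -Name $vm.Name -Force
        }
    }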

Automating adding servers to Sentry One

Overview. Sentry One is a great tool for monitoring many servers. For new installations, it can be a bit of a bind to add your existing servers into the tool to be monitored. I have written a PowerShell module to make this much easier and to validate that servers you thought were being monitored are in fact monitored. There is full documentation for the module in the Sentry One user guide, which explains how to use the functions within it, but a brief explanation is shown below. It is worth mentioning that all the PowerShell cmdlets are doing is

VSTS Hosted Build Specs: The Script

Some months back, I published a post about the VSTS Hosted Build Agent’s specs. One thing I didn’t add was the PowerShell script that I used to get these details, mainly because I couldn’t find the script anymore… So, by popular demand, here is the script I used to get the build specs. I ran it as an in-line PowerShell script as part of a build that was being run on the Hosted Build Agent. Here is the output from the script: The CPU has changed since I last gathered the data about this: previously it
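
For reference, this is the kind of WMI query that produces output like the table in the specs post further down (a reconstruction, not necessarily the author's exact script):

    # CPU details: cores, logical processors, address width.
    Get-WmiObject -Class Win32_Processor |
        Select-Object SystemName, Name, DeviceID, NumberOfCores,
                      NumberOfLogicalProcessors, AddressWidth |
        Format-Table -AutoSize

    # Total physical memory, in MB.
    $mem = (Get-WmiObject -Class Win32_ComputerSystem).TotalPhysicalMemory / 1MB
    Write-Output "Total memory: $mem"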

How To Compile SQLProj Files Using Cmdline MSBuild... Errors Included!

I recently needed to build and deploy about 40 small database projects that were spread across 4 or 5 different database solutions. And I needed to do this several times a day, so compiling via Visual Studio would be a boring and tedious process. So to speed things up I decided to write the build process in an MSBuild targets file and initiate it through PowerShell. The targets file was simple enough to put together. This would be saved in the root location of all the solution folders as "BuildAllDBProjects.targets.xml". Then the PowerShell would be simple enough;
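
The PowerShell side can be as simple as shelling out to msbuild.exe against that targets file (the MSBuild path varies by Visual Studio / Build Tools version, so this is a hedged sketch):

    # Path to MSBuild; adjust for your installed version.
    $msbuild = 'C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe'

    # Build every database project via the shared targets file.
    & $msbuild 'BuildAllDBProjects.targets.xml' /p:Configuration=Release /v:minimal
    if ($LASTEXITCODE -ne 0) { throw 'Build failed' }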

Azure Automation Module Import: Sorry! You’re not my type.

I’ve just burnt several hours on an issue with importing PowerShell modules into an Azure Automation account using the New-AzureRmAutomationModule cmdlet. Although I could import the module with no issue using the portal, I was getting an error of “Import newer version failed” when trying to do it using the cmdlet, and when I checked the module in the portal the message was: “Error importing the module cAsosSQLAddons. Import failed with the following error: Orchestrator.Shared.AsyncModuleImport.ModuleImportException: No content was read from the supplied ContentLink. [ContentLink.Uri=https://xxxxx.blob.core.windows.net/dscresources/DSCResources.cXXXXSQLAddons/1.0.9/cXXXXSQLAddons.zip]” When importing a module using New-AzureRmAutomationModule, one of the parameters you must supply is
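
For orientation, the shape of the call looks like this (account names and the URI are placeholders; the error in the post came from the ContentLink URI):

    # The -ContentLink URI must point at a readable (e.g. SAS-tokened) zip of
    # the module, named to match the module itself.
    New-AzureRmAutomationModule -ResourceGroupName 'MyRg' `
        -AutomationAccountName 'MyAutomation' `
        -Name 'cMyModule' `
        -ContentLink 'https://mystorage.blob.core.windows.net/modules/cMyModule.zip'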

Notes From The Field: Using Invoke-Sqlcmd

Lately I’ve been working quite a bit with Invoke-Sqlcmd, and there are a few issues with how it handles errors that I feel make it a poor choice of tool for connecting and executing SQL. Let’s take a look at a script that will return an expected error: Funnily enough, this does not return an error. The “$?” automatic variable reports whether the last operation succeeded, and as it returned “True” (i.e. no issues), PowerShell considers the query to be a success. This is a real problem. Even if we add error handling to the script, we still see the same result.
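
To see the behaviour for yourself, try something along these lines (a sketch, not the post's actual script; exactly which errors get swallowed varies by module version):

    # A batch that raises an expected error...
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query "RAISERROR('Expected error', 16, 1)"
    $?   # ...can still come back True, so the failure goes unnoticed.

    # Promoting the error to a terminating one at least makes it catchable.
    try {
        Invoke-Sqlcmd -ServerInstance 'localhost' `
            -Query "RAISERROR('Expected error', 16, 1)" -ErrorAction Stop
    }
    catch {
        Write-Output "Caught: $($_.Exception.Message)"
    }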

How to install SQL Server on Windows Server Core?

As part of automation of database and application deployments, it makes sense to be able to create new SQL Server instances quickly and with minimal resources. I have already explored containers and written about them on this blog, but I’d like to turn your attention to setting up SQL Server on Windows Server Core for those of you that run SQL Server on-premise or within VMs in the cloud. In a domain environment it should be pretty simple: create a PowerShell session to your target Windows Server, where your account is a local administrator, and then simply run
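
In outline, that remote unattended install looks something like this (server name, media path, and admin group are placeholders; a hedged sketch rather than a full recipe):

    $session = New-PSSession -ComputerName 'CoreServer01'
    Invoke-Command -Session $session -ScriptBlock {
        # Quiet, unattended SQL Server setup from mounted or copied media.
        & 'D:\setup.exe' /Q /ACTION=Install /FEATURES=SQLEngine `
            /INSTANCENAME=MSSQLSERVER `
            /SQLSYSADMINACCOUNTS="MYDOMAIN\DBAdmins" `
            /IACCEPTSQLSERVERLICENSETERMS
    }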

Assist Deploy Is Available on GitHub

Hello! For some time now I have been working on automating SSIS deployments, and earlier this week I published my efforts on GitHub. But before I get into the what/how, let’s focus on the why, and let me catch you up on how I got here… The task of taking an ispac and deploying it is, in and of itself, quite straightforward, as there are multiple ways to do it. For those of you who want the abridged version of the linked post, the choices are as follows: the Integration Services Deploy Wizard; SSIS Catalog T-SQL; API; PowerShell

When To Use Octopus Deploy Script Modules

Hello! Lately I've been thinking a lot about script modules in Octopus Deploy. Script Modules, for those of you not ITK, are a collection of PowerShell cmdlets or functions that exist outside of a script step and can be used across multiple steps across different projects/environments. They don't have to contain functions exclusively; they can also contain code that will get executed upon the initial import of the module. The issue here is that these script modules are imported for every step in a process. So unless the code you write is idempotent, you'll probably cause an error

Feedback requests to Microsoft

If you didn’t know, Microsoft has a number of channels for providing feedback. The most historically used is Connect (connect.microsoft.com); it integrated with their internal bug-tracking systems and meant that items flowed from the users to engineering and back. Well, it was supposed to. The SQL product group still use Connect https://connect.microsoft.com/sql, with a few teams also using Trello https://trello.com/b/NEerYXUU/powershell-sql-client-tools-sqlps-ssms and/or Slack (sqlcommunity.slack.com). Visual Studio is moving from Connect to https://developercommunity.visualstudio.com/spaces/8/index.html and also has https://visualstudio.uservoice.com/forums/121579-visual-studio-ide for ideas. VSTS has great support, also uses MSDN, and takes requests on UserVoice https://visualstudio.uservoice.com/forums/330519-team-services. Power BI has forums and uses UserVoice

VSTS Hosted Build Agent Specs

I was interested to know just what the hardware specification of the hosted build agent is. So I added some PowerShell to read out the info below:

    2016-06-29T09:23:31.3935358Z systemname      Name                                      DeviceID NumberOfCores NumberOfLogicalProcessors AddressWidth
    2016-06-29T09:23:31.3935358Z ----------      ----                                      -------- ------------- ------------------------- ------------
    2016-06-29T09:23:31.3935358Z TASKAGENT5-0010 Intel(R) Xeon(R) CPU E5-2673 v3 @ 2.40GHz CPU0     2             2                         64
    2016-06-29T09:23:31.4095356Z Total memory: 7167.55078125

What piqued my interest more was that this is the exact same spec as a D2 v2 box that is available via Azure. Clearly, Microsoft have a build agent template which is built, stored in a pool, and provisioned whenever a build

SQL Supper Scripts

Hello! Thanks to everyone who turned up yesterday at SQL Supper: there was a good turnout of both new and familiar faces. The Demo Gods were with me, and I was able both to log on to my Azure VM and to deploy to SQL Azure. I’ve uploaded the scripts to gist and shared them below. I also spoke about raising a Connect Issue so that the Microsoft.Build.Utilities.Core NuGet package will work with Microsoft.Data.Tools.Msbuild. I’d like to see this so that we do not have to install the Microsoft Build Tools 2015 MSI on the box. And this is important because

AssistDeploy 1.2 Is Now Live

Evening! AssistDeploy, our attempt to fully automate SSIS deployments, is not yet a week old, yet we’re already on release 1.2, which is our 4th release. If you’re wondering how we can be on a 4th release when the number is .2, I suggest you have a read up on SemVer. Unlike the previous post, this will be brief. But like that post, I’m going to delve into why I’ve made the changes before what, so that the context is understood. What most IT projects attempt to achieve is to take some knowledge of a subject matter

Team City Meta Runner - Get Build Status

Hello! When building a deployment pipeline, the choice of tool is less important than the use of the tool: do you go for a tool that centrally controls the flow of a release, from build to running tests to actual deployment, or do you choose separate tools that are loosely hung together and execute a particular part of a release? From personal experience, I have preferred to use a tool to act as a control flow of the deployment pipeline, but leverage tools where there is clear sense to use them. A case in point is using TeamCity to run

SSDT 16.5 Released part 2: Using the DacFx API and Samples!

Hello! Yesterday I posted about the new release of SSDT from the SQL Tools Team at Microsoft. One of the big changes is the ability to create the deployment report, create the deployment script, and execute the deployment all in one command. The other change is that for Azure two scripts are now generated: one for any changes that need a connection to master, and another for changes to the user database. The samples yesterday showed how to execute the new method using SQLPackage, but a lot of people, myself included, have automated the deployment using the DacFx API through
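
As a refresher on the DacFx API route, a minimal PowerShell deployment via DacServices looks roughly like this (the DLL path, connection string, and names are placeholders; the new one-command report/script/deploy method is covered in the linked samples):

    # Load DacFx and deploy a dacpac to a target database.
    Add-Type -Path 'C:\DacFx\lib\Microsoft.SqlServer.Dac.dll'
    $svc = New-Object Microsoft.SqlServer.Dac.DacServices('Server=localhost;Integrated Security=True')
    $package = [Microsoft.SqlServer.Dac.DacPackage]::Load('C:\drops\MyDb.dacpac')
    $svc.Deploy($package, 'MyDb', $true)   # $true = upgrade an existing database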

Visual Studio Code Extensions and Settings

I primarily work in Visual Studio 2017 and Visual Studio Code, using VS2017 for SSDT work and VS Code for pretty much everything else. VS Code is highly configurable, and as it’s a rainy Sunday, I thought I’d share my settings with you in case you are interested. A few colleagues at work have asked me what extensions and settings I have, so here they are as of Feb 2018. Extensions. In a console session within VS Code, you can list them like this:

    PS C:\> code --list-extensions
    codezombiech.gitignore
    DotJoshJohnson.xml
    eamodio.gitlens
    gerane.Theme-Blackboard
    mohsen1.prettify-json
    ms-mssql.mssql
    ms-vscode.PowerShell
    ms-vsts.team
    PeterJausovec.vscode-docker
    secanis.jenkinsfile-support
    yzhang.markdown-all-in-one

Using a Cloud Witness for Clusters

On a client site recently a question was asked about the file share witness in a SQL Server failover cluster on-premise, and where to put it if you only have 2 sites. As always, it depends! Let’s look at some scenarios. Bear in mind that use of the Azure Cloud Witness requires Windows Server 2016 or later. Topology 1 Three node cluster with 2 nodes at primary and 1 at disaster recovery (DR). Most people want high availability at their “primary site” and are happy to have standalone capability at the business continuity (DR) site. To save storage costs, I
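
Configuring the cloud witness itself is a one-liner on Windows Server 2016 (storage account name and key are placeholders):

    # Point the cluster quorum at an Azure storage account as a cloud witness.
    Set-ClusterQuorum -CloudWitness -AccountName 'mystorageaccount' `
                      -AccessKey '<storage-account-key>'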

SQL Agent Deployment Tasks Out Now!

Hello! In my time since leaving university I’ve worked across the spectrum, from tester to DBA, and it has always been abundantly clear that SQL Agent Jobs are one of those things that are really difficult to get into source control and deploy. Sure, you can script them out, but that doesn’t really factor in changes. And in many places, the phrase “whatever is in prod” has often been the answer to the question “what are the SQL Agent Jobs supposed to look like?” And frequently, relying on msdb being backed up has been the backup process for SQL Agent Jobs.

Automating SQL Server Performance Testing

You run performance tests as well as functional tests when deploying new code changes to SQL Server, right? Not many people do. I think you should, and this article will show you how to do it by harnessing an existing performance tool, rather than writing your own monitoring infrastructure from scratch. Any good performance monitoring tool that records information to a database will do fine, and we prefer to use Sentry One. Here are the steps to accomplish this. Create a baseline database. When you release your database change, you want to have something to compare against as an

VMWare network performance bug - Getting a repro

If you’ve read my previous post about an issue with VMware ESX 6, connecting to SQL, and 500ms latency, you might be interested in the process we went through to get to the repro. Getting a repro (being able to reproduce a bug/feature) is often a complex and time-consuming task. The challenge is like being Sherlock Holmes, using your experience to focus on the aspects of the situation that are important. Without a repro, you can’t give anything to a supplier to enable them to triage and find a fix for it

Running SQL Server in an Azure Container Instance

Azure Container Instances are still in Preview and not officially available for Windows yet, which made me smile. It took me a while to figure out how to get this working so I thought I’d share what I’ve found. Containers are great for lightweight testing of code before deployment to production servers because they can be created so quickly and they give the same environment to test in very reliably. Now that Microsoft is offering container instances in Azure it means you don’t have to worry about provisioning and configuring your own docker host/cluster. The options for deploying SQL Server
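
At the time of writing, deploying the Windows SQL Server image to a container instance with the AzureRM module looked roughly like this (image, sizes, and the sa password are placeholders, and since ACI was in Preview the parameters may well have changed):

    New-AzureRmContainerGroup -ResourceGroupName 'MyRg' -Name 'sqltest' `
        -Image 'microsoft/mssql-server-windows-express' -OsType Windows `
        -Cpu 2 -MemoryInGB 7 -IpAddressType Public -Port 1433 `
        -EnvironmentVariable @{ ACCEPT_EULA = 'Y'; sa_password = 'Str0ng!Passw0rd' }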

VSTS Git Repos no longer show as connected in Visual Studio

I hit something this week where one of my Git repos showed as disconnected in Visual Studio, and Team Explorer was showing me the option to clone the repo, which clearly I didn’t want to do as it was already cloned. You can see in the picture that I have the repo locally (at the bottom), but in the list of projects I get the clone button. The root cause of this was that VSTS has been cleaning up an old artefact of its TFS history. For on-premise TFS you have the ability to have collections, and one
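
If the underlying cause is the repo's URL changing (for example, the old collection segment disappearing from the path), a common fix is simply to repoint the local remote (the URL below is illustrative):

    # Check what the local clone currently points at...
    git remote -v
    # ...and repoint it at the repo's new URL.
    git remote set-url origin https://myaccount.visualstudio.com/MyProject/_git/MyRepo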

Keeping The Database Dev Ops Overhead Lightweight

Hello! One very important aspect of Dev Ops that is perhaps overlooked is the overhead that comes with adopting Dev Ops practices. To help explain what I mean, let’s break that sentence down a bit. What Do I Mean By “Dev Ops Practices”? I have a strong suspicion that for each of the posts for this T-SQL Tuesday on Database Dev Ops, everyone will have a slightly different take. Or rather, they are going to articulate what Dev Ops means to them. And so here is my take: broadly speaking, Dev Ops is about increasing the cadence of a feature

Access Denied –How To Prevent a Failure Mid-Deploy

Hello! As part of any decent path-to-live, it is obviously crucial to deploy to other environments. This is crucial because not only do we need to test the changes being made but, just as importantly, to verify that the deployment will succeed. And ideally these environments should match production as closely as possible. The less difference there is between production and all environments up to production, the greater the chances a deployment will succeed. Sounds simple, and perhaps even trivial, but if there is one thing that always, and I mean always, catches deployments out, it’s permissions. Chances are production permissions

Getting started with Azure Policy

Recently, as part of a data classification implementation, certain aspects were implemented using Azure Policy, since it supports auditing of existing resources and can prevent non-compliant resources from being created in the first instance. Like most things, the documentation reads well and the samples seem useful, until one has to do something different and go off-piste. There are plenty of VM and tagging samples, though not so many for Azure SQL Database. Therefore, as Sabin is a data engineering consultancy, all examples given here will be Azure SQL Database focused, and not the canonical VM as found in many examples.
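
The general shape of defining and assigning a policy from PowerShell, with a deliberately simple Azure SQL Database-scoped rule (the rule JSON and scope are illustrative, not one of the post's examples):

    # A trivial rule: audit any Azure SQL Database resource.
    $rule = '{ "if": { "field": "type", "equals": "Microsoft.Sql/servers/databases" }, "then": { "effect": "audit" } }'

    $def = New-AzureRmPolicyDefinition -Name 'audit-sql-databases' -Policy $rule
    New-AzureRmPolicyAssignment -Name 'audit-sql-databases' `
        -PolicyDefinition $def `
        -Scope '/subscriptions/<subscription-id>/resourceGroups/MyRg'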

Continuous Integration with Jenkins, SQL Server and Windows Containers

Why use Windows Containers? When creating database applications we need consistency in all our environments to ensure quality releases. Traditionally developers might have their own instance of SQL Server on their workstation to develop against. Database projects would be created in SSDT and pushed to source control when ready for testing. If you’re not using SSDT for database development already, then you should seriously consider it to make your life easier and increase the quality of your releases. Ed Elliot explains why in this blog post . A problem with CI for databases is that databases are a shared resource
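
The per-build database instance boils down to a docker run against the Windows SQL Server image (image name and password are placeholders; run from PowerShell on a Windows container host):

    # Spin up a throwaway SQL Server instance for the CI run.
    docker run -d --name ci-sql -p 1433:1433 `
        -e ACCEPT_EULA=Y -e sa_password='Str0ng!Passw0rd' `
        microsoft/mssql-server-windows-developer

    # ...run the tests against it, then tear it down.
    docker rm -f ci-sql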

How to check Sentry One requirements

I was at a client site recently and implemented Sentry One for them, a great monitoring system for SQL Server. It proved challenging because some servers were in a DMZ on a separate network and domain, and some servers were in the same domain. All servers connected via a router and were firewalled off from each other, with only the minimum ports open required for them to fully function and communicate. Sentry One operates in two modes, Full and Limited. Full mode allows Sentry One to gather Windows metrics as well as SQL Server metrics. Limited mode does not allow
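
Test-NetConnection is handy for verifying the firewall side of those requirements (the ports listed are examples, not the full Sentry One list):

    # Check the ports a monitored server needs to be reachable on.
    foreach ($port in 1433, 135, 445) {
        Test-NetConnection -ComputerName 'SqlServer01' -Port $port |
            Select-Object ComputerName, RemotePort, TcpTestSucceeded
    }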

SQL Server Container Performance

Is SQL Server in a container faster than a VM? I briefly looked at SQL Server containers when Windows Server 2016 was released. Containers offer the ability for rapid provisioning and denser utilization of hardware, because the container shares the base OS’s kernel; there is no need for a hypervisor layer in between. As a recap for those that are not up to speed with containers, the traditional architecture of databases in a VM is like so: the hypervisor OS is installed onto the host hardware, a physical server in the data centre. Many VMs are created on the hypervisor

How to move a replication subscriber to a new server with no downtime to the publisher?

In a recent data centre migration for a client we had a problem where we needed to move a subscriber to a new data centre without incurring any downtime to the publisher or loss of data after the subscription migration. The application was sending hundreds of transactions per second to the publisher. An additional complication was an upgrade to SQL Server 2016 from SQL Server 2008 R2 on the subscriber. The first phase of the migration was to move the subscriber to a new server in a different domain, but without incurring any downtime to the publishing application. How to

Migrating SSIS Packages to SSIS Azure

Hello! In case you missed the announcement (and there were a lot of announcements during MSIgnite), SQL Server Integration Services is in Public Preview on Azure! I’ve written about it elsewhere in greater depth, but here are the headlines: It makes use of SSIS Scale Out, which was released as part of SQL Server 2017. Although it is based on SSIS Scale Out, you can’t actually configure SSIS Scale Out to run on the instance. If this confuses you then read my in-depth post. SSISDB is installed in either SQL Azure or on a Managed Instance. You

SSIS Package Execution In Azure Is Now Available

Well, it’s been some time coming, but SSIS packages are the latest product to make the move from on-premise to Azure. You can now take your SSIS projects and deploy them to the new Platform as a Service (PaaS) offering in Azure. The aim of the team at Microsoft was for users to take their current SSIS packages and just “lift and shift” them to Azure. So in development terms, that means there are minimal to no changes to be made in the solution, at least. But before we get into the deployment and running of SSIS packages