Search Results

Automating SQL Server Performance Testing

You run performance tests as well as functional tests when deploying new code changes to SQL Server, right? Not many people do, but I think you should, and this article will show you how, by harnessing an existing performance tool rather than writing your own monitoring infrastructure from scratch. Any good performance monitoring tool that records information to a database will do fine, and we prefer to use Sentry One. Here are the steps to accomplish this. Create a baseline database. When you release your database change, you want to have something to compare against as an
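
The article's approach is to reuse the data Sentry One already records rather than roll your own; purely to illustrate what a baseline looks like, a crude snapshot along these lines (the baseline database and table names are made up) would give you numbers to diff after a release.

    # Hedged sketch: snapshot procedure-level stats into a baseline table before a release.
    # BaselineDb and dbo.ProcStatsBaseline are hypothetical; a monitoring tool such as
    # Sentry One records this kind of data for you already.
    Import-Module SqlServer
    $query = @"
    INSERT INTO BaselineDb.dbo.ProcStatsBaseline
        (capture_time, object_name, execution_count, avg_elapsed_ms, avg_logical_reads)
    SELECT SYSUTCDATETIME(),
           OBJECT_NAME(ps.object_id, ps.database_id),
           ps.execution_count,
           (ps.total_elapsed_time / ps.execution_count) / 1000.0,   -- microseconds to ms
           ps.total_logical_reads / ps.execution_count
    FROM sys.dm_exec_procedure_stats AS ps
    WHERE ps.database_id = DB_ID('MyAppDb');
    "@
    Invoke-Sqlcmd -ServerInstance 'PerfTestSql01' -Query $query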

Overview of Azure Virtual Machine IO performance and throttles

In this post we are going to look at the IO performance of a Virtual Machine in Azure. We are specifically talking about the GS4 machines with premium managed disks. The theory should apply to all classes of machine, but some, such as the L series, have a different configuration for the temporary drive, which is important. The data in this post has been gathered using a mixture of this excellent post https://blogs.technet.microsoft.com/xiangwu/2017/05/14/azure-vm-storage-performance-and-throttling-demystify/ and of generating IO using diskspd and measuring with perfmon. Speed vs Throughput: it is worth pointing out that Azure specifies the performance of the Disks /
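
To reproduce that kind of measurement yourself, a diskspd run along the lines below (parameters are illustrative rather than the exact ones used for this post) generates a steady load you can watch in perfmon.

    # Illustrative diskspd invocation, not the exact parameters used in the post:
    # 64KB random IO, 30% writes, 8 threads with 8 outstanding IOs each, against a
    # 10GB test file on the premium disk for 60 seconds, caching disabled (-Sh),
    # latency statistics captured (-L).
    & 'C:\Tools\diskspd.exe' -c10G -d60 -r -w30 -t8 -o8 -b64K -Sh -L F:\io_test.dat
    # Meanwhile capture the relevant perfmon counters for the same drive:
    Get-Counter -Counter '\LogicalDisk(F:)\Disk Bytes/sec', '\LogicalDisk(F:)\Disk Transfers/sec' -SampleInterval 1 -MaxSamples 60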

SQL Server Container Performance

Is SQL Server in a container faster than a VM? I briefly looked at SQL Server containers when Windows Server 2016 was released. Containers offer rapid provisioning and denser utilization of hardware, because the container shares the base OS’s kernel; there is no need for a hypervisor layer in between. As a recap for those that are not up to speed with containers, the traditional architecture of databases in a VM is like so: the hypervisor OS is installed onto the host hardware, a physical server in the data centre. Many VMs are created on the hypervisor
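
For reference, spinning up a SQL Server Windows container looked roughly like this at the time; the image name, environment variable names and password below are as I recall them for the Windows Server 2016 era, so check the current documentation before relying on this.

    # Rough sketch of running SQL Server in a Windows container (Windows Server 2016 era).
    docker pull microsoft/mssql-server-windows-developer
    docker run -d -p 1433:1433 `
        -e ACCEPT_EULA=Y `
        -e sa_password='YourStrong!Passw0rd' `
        --name sql1 `
        microsoft/mssql-server-windows-developer
    # From another machine, connect to the mapped port (replace 'containerhost'):
    sqlcmd -S 'containerhost,1433' -U sa -P 'YourStrong!Passw0rd' -Q 'SELECT @@VERSION'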

VMWare network performance bug - Getting a repro

If you’ve read my previous post about an issue with VMware ESX 6 and connecting to SQL and 500ms latency, you might be interested in the process we went through to get to the repro. Getting a repro (being able to reproduce a bug/feature) is often a complex and time-consuming task. The challenge is like being Sherlock Holmes and using your experience to focus on the aspects of the situation that are important. The difficulty is that without a repro, you can’t give anything to a supplier to enable them to triage and find a fix for it

Webinar: 8th Feb 2018

Preventing SQL Server Performance Problems Before They Hit Production. Join Mark Allison, Paul Anderton and Kevin Kline at 3pm UTC on 8th Feb 2018. Wouldn’t it be great if performance problems in your SQL estate could be detected BEFORE they reach your production databases? In this demo-centred webinar we will review: how to detect and prevent releases of code that could reduce performance of your SQL Server database; ways to prevent the most common performance problems before they reach production (missing indexes, deadlocks and excessive key lookups); and how effective SentryOne can be in a DevOps pipeline, both on-premises and in

Running a SQL Server workload using PowerShell

In February 2018, Paul Anderton and I gave a presentation on how to correlate database deployments with performance issues within the context of a DevOps pipeline. We used Sentry One as our monitoring tool in a Performance Test environment so that we could catch badly performing deployments before they got to production and caused havoc. If you would like to see the recorded video, you can download it from here: http://info.sentryone.com/partner-webinar-performance-problems As part of this presentation we had a workload running on a workstation, which executed a couple of stored procedures repeatedly, and we’ve had some requests from people
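
The loop below is a minimal sketch of the same idea rather than the exact script from the webinar; the server, database and procedure names are placeholders.

    # Minimal sketch of a repeating workload, not the exact script from the webinar.
    # Requires the SqlServer PowerShell module for Invoke-Sqlcmd.
    Import-Module SqlServer
    $server   = 'PerfTestSql01'      # placeholder
    $database = 'AdventureWorks'     # placeholder
    $procs    = @('dbo.usp_GetOrders', 'dbo.usp_UpdateStock')   # placeholder procedure names
    while ($true) {
        foreach ($proc in $procs) {
            Invoke-Sqlcmd -ServerInstance $server -Database $database -Query "EXEC $proc;"
        }
        Start-Sleep -Milliseconds 200   # pace the loop so it can run indefinitely
    }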

Don’t use Storage Spaces…

… if you care about performance in the slightest bit. That’s it really. You don’t need to read any further. What are storage spaces? Have a read of this quick overview: https://www.windowscentral.com/how-use-storage-spaces-windows-10 I had some spare computer parts lying around, so I thought I’d rebuild my Windows 10 desktop at home. I have 4 x 4TB Hitachi SATA drives and a hardware RAID controller spare, so I decided to put them in my desktop. I had heard of storage spaces and wanted to try it out to see what performance would be like, considering there was no extra hardware involved in creating
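
If you do want to try it anyway, creating a simple mirrored space from PowerShell goes roughly like the sketch below (pool and disk friendly names are made up; the Windows 10 GUI does the same job).

    # Hedged sketch: pool the spare disks and carve out a mirrored, formatted volume.
    $disks = Get-PhysicalDisk -CanPool $true
    New-StoragePool -FriendlyName 'DataPool' `
        -StorageSubSystemFriendlyName 'Windows Storage*' `
        -PhysicalDisks $disks
    New-VirtualDisk -StoragePoolFriendlyName 'DataPool' -FriendlyName 'DataSpace' `
        -ResiliencySettingName Mirror -UseMaximumSize |
        Get-Disk | Initialize-Disk -PassThru |
        New-Partition -AssignDriveLetter -UseMaximumSize |
        Format-Volume -FileSystem NTFS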

Why is Sqlpackage Using All The Build Server Memory?

Sqlpackage can be particularly resource-intensive when scripting a database that has a considerable number of objects. In this post I'm going to discuss the options available when scripting out a database deployment file from a dacpac using sqlpackage.exe. I'm also going to investigate how resource-intensive they are, what we can do to limit the hardware resources used, and how much of an impact this has on our waiting times, with some interesting results on where we were taking the performance hit. Recently I've noticed that when we have more than one build running at the same time
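
For context, the scenario is a scripted deployment along the lines below, run on the build server; the sqlpackage path, server and database names are placeholders.

    # Typical sqlpackage.exe Script invocation of the kind being measured here.
    $sqlpackage = 'C:\Program Files\Microsoft SQL Server\140\DAC\bin\sqlpackage.exe'  # path varies by install
    & $sqlpackage '/Action:Script' `
        '/SourceFile:C:\Build\MyDatabase.dacpac' `
        '/TargetServerName:BuildSql01' `
        '/TargetDatabaseName:MyDatabase' `
        '/OutputPath:C:\Build\deploy.sql'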

Log Shipping: It's Better Than Bad It's Good!

I'm probably showing my age by quoting an old Ren and Stimpy cartoon here, but to be fair it probably sums up log shipping pretty well. This post focuses on using a read-only log-shipped database for reporting purposes, and the limitations of read-only log-shipped databases. I also share some monitoring scripts and a few ideas on how to improve restore performance without having to upgrade the hardware/software. Despite the development of AlwaysOn in recent releases of SQL Server, log shipping is still a great way to set up a copy of databases to be used for reporting.
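
As a taster of the monitoring side, the restore jobs keep the log shipping tables in msdb up to date, so a quick latency check on the secondary looks roughly like the sketch below; the scripts in the post go further than this, and the server name is a placeholder.

    # Hedged sketch: how far behind is the read-only reporting secondary?
    Import-Module SqlServer
    $query = @"
    SELECT secondary_database,
           last_restored_date,
           DATEDIFF(MINUTE, last_restored_date, GETDATE()) AS minutes_behind
    FROM msdb.dbo.log_shipping_monitor_secondary;
    "@
    Invoke-Sqlcmd -ServerInstance 'ReportingSql01' -Query $query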

Automatic Tuning Enabled By Default In Azure SQL Database Happens Today!

Hello! From January 15th (i.e. this Monday) Automatic Tuning will be enabled by default and gradually rolled out to ALL Azure subscriptions. If you are the owner of a subscription you will have received an email two weeks ago alerting you to this fact. However, if you’re not, and this has not been communicated to you, it may come as something of a surprise. With regard to how the rollout impacts you, this blog post states that “All servers that do not have automatic tuning explicitly configured will inherit Azure defaults, making automatic tuning enabled. Similarly, all databases that do
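
If you would rather check or pin the setting yourself than simply inherit the Azure defaults, the database-level state can be queried and overridden with T-SQL along these lines; the connection details below are placeholders.

    # Inspect the current automatic tuning state and explicitly pin FORCE_LAST_GOOD_PLAN,
    # so the database no longer just inherits the Azure defaults.
    Import-Module SqlServer
    $query = @"
    SELECT name, desired_state_desc, actual_state_desc
    FROM sys.database_automatic_tuning_options;

    ALTER DATABASE CURRENT SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);
    "@
    Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' -Database 'MyAppDb' `
        -Username 'admin_user' -Password '<placeholder>' -Query $query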

Are you running VMware ESX 6 Update 1 and connecting to SQL - be warned

If you are running any of your applications on Windows 2012 and up, on VMware ESX 6 with Update 1, and connecting to SQL, you must read on. We’ve been working at a client recently on an issue where requests to SQL have a 500ms latency between requests. What’s really odd is that the time seems to be lost between the client code making a request and the request arriving at SQL. Profiler shows a gap of 500ms between the end time of one request and the start time of another request. The issue would only affect some applications and

Home

Plan, Develop, Deploy, Measure. Sabin.io is a Data Engineering practice. Our focus is on helping companies deliver data systems. We help companies build sustainable applications that take support and ongoing development into account. The use of agile methodologies is fantastic, but it needs to be aligned with the engineering practices of testing, continuous integration/delivery and a feedback loop to enable continual improvement. Our experience stretches from Windows Server management and automation through to the support and management of BI systems. We have strong application development experience, which means we are able to deliver and support solutions that work with all those involved

In a partitioned world, don’t violate core directive

This is another short post stemming from a recent talk I gave on Azure Cosmos Db vs. SQL Database, and there will be more based on discussion and feedback I received and things I learnt along the way. The point I want to make is that when implementing scale-out data storage, regardless of whether you are considering Azure SQL Database, Cosmos Db or another storage engine, you have to think differently about your read and write patterns. To paraphrase Conor Cunningham (linkedin | blog) from his excellent OLTP Sharding Techniques for Massive Scale presentation at SQL PASS

How to check Sentry One requirements

I was at a client site recently and implemented Sentry One for them, a great monitoring system for SQL Server. It proved challenging because some servers were in a DMZ on a separate network and domain, and some servers were in the same domain. All servers connected via a router and were firewalled off from each other, with only the minimum ports open that were required for them to fully function and communicate. Sentry One operates in two modes, Full and Limited. Full mode allows Sentry One to gather Windows metrics as well as SQL Server metrics; Limited mode does not allow
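
Before installing anything it is worth confirming the firewall rules behave as intended; a quick sweep like the sketch below saves a lot of head-scratching later. The ports and server names shown are examples, not the definitive Sentry One list, so check the documentation for your version.

    # Quick connectivity sweep from the monitoring server to each target.
    # Example ports: SQL Server (1433), RPC endpoint mapper for WMI (135), SMB (445).
    $targets = @('DMZSQL01', 'DMZSQL02', 'CORPSQL01')   # placeholder server names
    $ports   = @(1433, 135, 445)
    foreach ($server in $targets) {
        foreach ($port in $ports) {
            $result = Test-NetConnection -ComputerName $server -Port $port -WarningAction SilentlyContinue
            '{0}:{1} reachable = {2}' -f $server, $port, $result.TcpTestSucceeded
        }
    }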

Do Azure SQL Database External Tables have a place in a micro-service?

I was recently in discussions on using External Tables to link Azure SQL Databases across micro-service boundaries. This has led to some challenging discussions with a client and unexpected opinions internally here at sabin.io. My simple view of a micro-service is of a data store fronted by code, which is in turn behind an API or message subscriber. Importantly, only this code accesses the store. I have arrived at this opinion through many (often heated) discussions with developers implementing services, and through working with teams breaking large services into micro-services to clarify ownership and responsibility, remove dependencies and simplify
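
For readers who haven't used them, an elastic-query external table is declared roughly as below; all names, endpoints and secrets are placeholders. It is precisely this ability to reach straight into another service's database that the discussion centred on.

    # Rough shape of an Azure SQL Database external table via elastic query (placeholders throughout).
    Import-Module SqlServer
    $query = @"
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<placeholder>';
    CREATE DATABASE SCOPED CREDENTIAL OrdersServiceCred
        WITH IDENTITY = 'orders_reader', SECRET = '<placeholder>';
    CREATE EXTERNAL DATA SOURCE OrdersServiceDb WITH (
        TYPE = RDBMS,
        LOCATION = 'ordersservice.database.windows.net',
        DATABASE_NAME = 'Orders',
        CREDENTIAL = OrdersServiceCred);
    CREATE EXTERNAL TABLE dbo.Orders (
        OrderId INT NOT NULL,
        CustomerId INT NOT NULL,
        OrderDate DATETIME2 NOT NULL)
        WITH (DATA_SOURCE = OrdersServiceDb);
    "@
    Invoke-Sqlcmd -ServerInstance 'catalogservice.database.windows.net' -Database 'Catalog' `
        -Username 'admin_user' -Password '<placeholder>' -Query $query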

CosmosDb and CRUD with a S

In preparation for my talk Azure SQL Database vs. CosmosDb - How do you choose? at SQLBits, I spent a lot of time thinking about create, read, update and delete (CRUD) operations, and concluded that with the advent of CosmosDb and its multi-model abstraction layer, CRUD needs an S. CRUD as an acronym has served software engineering very well by encapsulating the core actions of data persistence and retrieval. In the transition from the “ye olde” closed and monolithic systems to the current open, loosely coupled and distributed micro-service architectures via client-server and

Feedback requests to Microsoft

If you didn’t know, Microsoft has a number of channels for providing feedback. The most historical is Connect (connect.microsoft.com); it integrated with their internal bug-tracking systems and meant that items flowed from users to engineering and back. Well, it was supposed to. The SQL product group still use Connect https://connect.microsoft.com/sql with a few teams also using Trello https://trello.com/b/NEerYXUU/powershell-sql-client-tools-sqlps-ssms and/or Slack (sqlcommunity.slack.com). Visual Studio is moving from Connect to https://developercommunity.visualstudio.com/spaces/8/index.html and also has https://visualstudio.uservoice.com/forums/121579-visual-studio-ide for ideas. VSTS has great support, also uses MSDN, and takes requests on UserVoice https://visualstudio.uservoice.com/forums/330519-team-services PowerBI has forums and uses UserVoice

Automating adding servers to Sentry One

Overview: Sentry One is a great tool for monitoring many servers. For new installations, it can be a bit of a bind to add your existing servers into the tool to be monitored. I have written a PowerShell module to make this much easier and to validate that servers that you thought were being monitored are, in fact, monitored. There is full documentation for the module in the Sentry One user guide, which explains how to use the functions within it, but a brief explanation is shown below. It is worth mentioning that all the PowerShell cmdlets are doing is

PowerQuery – The power of M

I love PowerBI; actually, I love PowerQuery. It's a great way to combine data from around your business, from across the web, from anywhere. What I really like is that very little is done automatically, i.e. it doesn't do the nice data type detection you get with Excel that screws up your imports if the data type in a column differs between the first few rows and the rest of the file. Does that make it difficult to use? No, it's not. The nice thing is that it's very easy to add additional columns, change data types, use expressions, combine datasets, and do

Improving Azure Functions throughput

I recently ran into an Azure Functions throughput problem which I logged on Stack Overflow as regular-throughput-troughs-in-azure-functions-requests-per-second. The product group were pretty quick to respond and pointed me to their Azure App Service Team Blog post processing-100000-events-per-second-on-azure-functions/. The post lists five notable configuration choices: functions process [event hubs] messages in batches; the WebJobs dashboard is disabled in favor of using Application Insights for monitoring and telemetry; each event hub is configured with 100 partitions; data is sent to the event hubs without partition keys; and events are serialized using protocol buffers. Of these, the second and third are most interesting.
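
The dashboard change, at least, is pure configuration: remove the AzureWebJobsDashboard app setting and point the app at Application Insights instead, roughly as below. The app name, resource group and instrumentation key are placeholders.

    # Hedged sketch: disable the WebJobs dashboard by removing its app setting and
    # wire the function app up to Application Insights instead.
    az functionapp config appsettings delete `
        --name my-func-app --resource-group my-rg `
        --setting-names AzureWebJobsDashboard
    az functionapp config appsettings set `
        --name my-func-app --resource-group my-rg `
        --settings "APPINSIGHTS_INSTRUMENTATIONKEY=<instrumentation-key>"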

Hardening Sentry One for Security

If you have an environment where you need to lock down Sentry One as much as possible, then this article should help. It is well known that the Sentry One service account needs to be a member of the Local Administrators group on each server it monitors, and also a member of the sysadmin role on each SQL Server instance. At the moment this is still a requirement, but if you can live without the Windows metrics, then you could run Sentry One in Limited Mode, which will only gather SQL Server-specific metrics. Some of our clients run in

SSIS Package Execution In Azure Is Now Available

Well, it’s been some time coming, but SSIS packages are the latest product to make the move from on-premises to Azure. You can now take your SSIS projects and deploy them to the new Platform as a Service (PaaS) offering in Azure. The aim of the team at Microsoft was for users to take their current SSIS packages and just “lift and shift” them to Azure. So in development terms that means there are minimal to no changes to be made to the solution, at least. But before we get into the deployment and running of SSIS packages