T-SQL Tuesday #130 – Recap: Automate Your Stress Away

T-SQL Tuesday Logo - A blue database with a calendar showing days of the week wrapped artfully around the database.

This month, I hosted #TSql2sDay for September 2020. If you’re unfamiliar with T-SQL Tuesday, it’s a blogging party. The idea is that a host comes up with a topic and encourages other members of #SqlFamily to join in and write a blog post on the topic. T-SQL Tuesday was started by Adam Machanic, and Steve Jones has carried on this fun tradition!

I’m really grateful to all the bloggers who took part this month, especially since automation has been a T-SQL Tuesday topic before. I find it hard to get much of my day-to-day work automated, so I was really looking forward to these posts to learn new tasks I could automate myself. With that said, let’s see all the wonderful ideas people contributed this month. If you’re like me, you’re going to want to put some of this automation in place as soon as possible.

I’m right there with Greg Dodd regarding automation. Greg blogs about how automation can be a great way to minimize making mistakes. I also like how Greg points out that automation can mean we can all take a vacation without worrying that someone will call us with an emergency. Greg’s done some pretty interesting stuff including automating the creation of test servers using Chef. Greg keeps things realistic by stating that often automation is not a time saver at first, but over time you can really see the benefit.

Eitan Blumin is very familiar with automation, and Eitan has blogged about the topic extensively. As such, Eitan approached this month’s topic with a different perspective. Instead of discussing a project using automation, Eitan walks us through what steps we can use to start automating tasks around us.

This was Magda’s first blog post ever (yay, Magda!), and I’m so glad that Magda joined in the fun to write about how we can all use automation. Magda might describe themselves as a non-technical worker, but I think Magda’s approach to automation is the same as the rest of ours. Magda found that some manual tasks in Excel could be improved with automation. I’m also very glad that Magda reminds us all to celebrate the victories… and even enjoy a cup of tea with the time you’ve saved automating your work.

Kevin Chant lets us know what benefits he’s found with automation, which include making our lives easier. Kevin shares information about the first steps you can take in creating your first Azure DevOps pipelines. I really like how Kevin reassures us all not to get stuck in perfectionism paralysis. Kevin lets us all know that sometimes it’s OK to have an idea and implement a proof of concept quickly.

We each have a preference when using SQL Server Management Studio (SSMS). Some of us prefer to write the code outright and others prefer the GUI. Rob Farley reminds us of one of the options available in SSMS: we can have the GUI generate a T-SQL script for us. You also have this option in Azure Data Studio (ADS) for many (but not all) actions.

When things go wrong, one of the more tedious tasks we must undertake is reviewing error logs. Aaron Bertrand shares his solution on how to aggregate various error logs using PowerShell. I think the idea of combining various error logs can help us all. There is even a copy of the current version of code available on the blog post as well as a link to the source control so you can keep up with any changes as Aaron improves the code over time.
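
Aaron’s post has the real script, but as a rough idea of the approach, here is a minimal sketch (mine, not Aaron’s) that uses dbatools to pull the last day of error log entries from several instances into one CSV. The instance names and output path are placeholders.

    # Minimal sketch of aggregating error logs with dbatools (not Aaron's script).
    Import-Module dbatools

    $instances = 'SQL01', 'SQL02', 'SQL03'      # placeholder instance names
    $since     = (Get-Date).AddDays(-1)         # only the last 24 hours

    $instances |
        ForEach-Object { Get-DbaErrorLog -SqlInstance $_ -After $since } |
        Select-Object SqlInstance, LogDate, Source, Text |
        Export-Csv -Path 'C:\Temp\ErrorLogs.csv' -NoTypeInformation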

Another common issue in our day-to-day life is keeping things as they are because they’ve always been that way. I know I am all too guilty of this at times. In this month’s post, Mikey Bronowski gives examples of manual tasks that need some rework. Mikey covers how to get rid of paperwork and standardize SQL Server installations. Mikey found an outdated process due to old technology and after some internal conversations, Mikey updated the process to use newer technology and saved time.

One of the common tasks we often must deal with is patching our SQL Servers. It is one of those tasks that is monotonous but also requires additional analysis. Volker Bachman shares how to use PowerShell to control when and what updates are being deployed to your servers. I love that Volker has also included the functionality to automatically send an email if the server is restarted. That can really save time troubleshooting a server restart the next business day.
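
Volker’s post has the full patching workflow; the sketch below only covers the restart-notification idea he mentions, using plain CIM and Send-MailMessage. The server names and SMTP details are placeholders, not anything from Volker’s code.

    # Sketch of the restart-notification idea only (not Volker's script).
    $servers = 'SQL01', 'SQL02'                 # placeholder server names
    $cutoff  = (Get-Date).AddHours(-12)         # "recently restarted" window

    foreach ($server in $servers) {
        $os = Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName $server
        if ($os.LastBootUpTime -gt $cutoff) {
            Send-MailMessage -To 'dba@example.com' -From 'alerts@example.com' `
                -Subject "$server restarted at $($os.LastBootUpTime)" `
                -Body "Likely a patching reboot; check the update history on $server." `
                -SmtpServer 'smtp.example.com'
        }
    }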

Richard Swinbank found that sometimes you may want to generate database objects based on tables already in the database. Using metadata and dynamic SQL, Richard shows us how we can generate scripts that will create multiple database objects. I like how Richard took a problem spanning multiple different systems and found a way to automate the creation of consistent, reliable code.

One of the areas I haven’t spent much time with in SQL Server is trace flags. However, I completely understand wanting to ensure that trace flags are applied consistently across SQL Server instances. Taiob Ali shows us how we can keep trace flags consistently applied. With startup procedures and configuration tables, we can decrease our stress and know that our SQL Servers are configured as expected.
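
Taiob’s post covers the startup-procedure and configuration-table approach; as a hedged companion sketch (my own, not Taiob’s), dbatools can also compare a desired list of trace flags against what is currently enabled.

    # Rough sketch: make sure a desired set of global trace flags is enabled everywhere.
    # Flags enabled this way do not survive a restart, which is exactly why Taiob
    # pairs them with a startup procedure and a configuration table.
    Import-Module dbatools

    $instances = 'SQL01', 'SQL02'       # placeholder instance names
    $desired   = 3226, 7412             # example trace flags only

    foreach ($instance in $instances) {
        $current = (Get-DbaTraceFlag -SqlInstance $instance).TraceFlag
        $missing = $desired | Where-Object { $_ -notin $current }
        if ($missing) {
            Enable-DbaTraceFlag -SqlInstance $instance -TraceFlag $missing
        }
    }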

Every environment is different, and each has its own challenges. For Deepthi Goguri those challenges include managing hundreds of servers with multiple versions of SQL Server. Deepthi has found that PowerShell is the right tool for the job. With PowerShell, Deepthi has saved hours making job owners, SQL Server editions, and recovery models consistent. Thank you, Deepthi, for writing your first blog post for #tsql2sday.
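
As a hedged illustration of the kind of clean-up Deepthi describes (this is my sketch, not Deepthi’s script), dbatools turns the job-owner and recovery-model fixes into a few lines across a whole server list.

    # Sketch of consistency fixes across many instances with dbatools.
    Import-Module dbatools

    $instances = Get-Content 'C:\Temp\servers.txt'   # placeholder list of instances

    # Point every Agent job at a standard owner
    Set-DbaAgentJobOwner -SqlInstance $instances -Login 'sa'

    # Switch any user database that is not in the intended recovery model
    Get-DbaDatabase -SqlInstance $instances -ExcludeSystem |
        Where-Object { $_.RecoveryModel -ne 'Full' } |
        ForEach-Object {
            Set-DbaDbRecoveryModel -SqlInstance $_.SqlInstance -Database $_.Name `
                -RecoveryModel Full -Confirm:$false
        }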

Another first time T-SQL Tuesday blogger is Travis Page. Travis is also using PowerShell to help automate stress away. I’m not new to database source control, and I want all the things in source control. Travis is tackling one aspect of source control that can be quite tricky. Using dbatools, Travis has found a way to put SQL Server Agent jobs into source control.
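
As a rough sketch of the idea (not Travis’s exact approach), dbatools can script every Agent job out to files that then get committed like any other code.

    # Sketch: script SQL Agent jobs to .sql files and commit them to a git repo.
    # The repo path and instance name are placeholders, and job names are assumed
    # to be safe to use as file names.
    Import-Module dbatools

    $repoPath = 'C:\Source\sql-agent-jobs'

    Get-DbaAgentJob -SqlInstance 'SQL01' | ForEach-Object {
        $file = Join-Path $repoPath "$($_.Name).sql"
        $_ | Export-DbaScript -FilePath $file
    }

    Set-Location $repoPath
    git add .
    git commit -m 'Snapshot of SQL Agent jobs'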

I’ve recently started using Azure this year, and Drew Skwiers-Koballa’s blog post addresses how to save money in Azure. One common pitfall is creating resources and forgetting to clean them up. Drew shows us how to use PowerShell Workflow runbooks to automatically delete resources. I also like that he not only showed us how to deallocate resources but also introduced me to Azure Automation.
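
Drew’s post walks through the real runbook; below is only a hedged sketch of what a PowerShell Workflow runbook for this could look like, assuming the Az modules are available in the Automation account and that it authenticates with a managed identity. The resource group name is a placeholder.

    # Hedged sketch of an Azure Automation PowerShell Workflow runbook (not Drew's code):
    # deallocate every VM in a lab resource group so it stops accruing compute charges.
    workflow Stop-LabVms
    {
        InlineScript {
            Connect-AzAccount -Identity | Out-Null

            Get-AzVM -ResourceGroupName 'rg-lab' | ForEach-Object {
                # Stop-AzVM deallocates the VM unless -StayProvisioned is specified
                Stop-AzVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force
            }
        }
    }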

One of the things Database Administrators are tasked with is ensuring that the data for our company is protected. In this month’s T-SQL Tuesday, Gregg Barsley shows us how to use dbatools and PowerShell to generate an audit file of user permissions in SQL Server. This is one of those topics that I never knew I needed; now that I’ve seen it, I know I have to have it.
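
For a taste of what that might look like (a hedged sketch, not Gregg’s script), dbatools can dump user permissions to a dated file in a couple of lines.

    # Sketch: export user permissions for every database on an instance to a CSV.
    Import-Module dbatools

    $outFile = "C:\Audit\SQL01-permissions-$(Get-Date -Format yyyyMMdd).csv"

    Get-DbaUserPermission -SqlInstance 'SQL01' |
        Export-Csv -Path $outFile -NoTypeInformation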

Among many other things, knowing that our SQL Server instances are properly configured and running at peak efficiency can really help reduce stress. Glenn Berry shows various options for how we can automate our health checks. You have several ways to run Glenn’s diagnostic queries, including PowerShell and Jupyter Notebooks. As a bonus, Glenn also shared how you can set up SQL Server Agent alerts for various errors.
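
If you go the PowerShell route, dbatools has a cmdlet built around Glenn’s query set; here is a minimal, hedged sketch (the instance name and output folder are placeholders).

    # Sketch: run Glenn Berry's diagnostic queries via dbatools and export the results.
    Import-Module dbatools

    Invoke-DbaDiagnosticQuery -SqlInstance 'SQL01' |
        Export-DbaDiagnosticQuery -Path 'C:\Temp\HealthCheck'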

Steve Jones mentions that he uses automation to do tedious work so that he can focus on problem solving, and Steve is busy solving various problems. He has automated projects ranging from collecting data for SQL Saturday events to deploying development databases. This kind of variety shows the power available with automation.

We database people love our backups. And aside from testing our backups, many of us have found that developers often ask for fresh data from Production. Hugo Kornelis found himself in that exact situation. Using T-SQL, Hugo was able to create a script to back up a database from one server and restore it to a location and name of his choosing.
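
Hugo does this in T-SQL; purely as a hedged point of comparison, the same refresh pattern in dbatools looks roughly like this (server, database, and path names are placeholders).

    # Sketch: copy-only backup on one server restored under a new name on another.
    Import-Module dbatools

    $backup = Backup-DbaDatabase -SqlInstance 'PRODSQL' -Database 'Sales' `
                -Path '\\fileserver\SqlBackups' -Type Full -CopyOnly

    $backup | Restore-DbaDatabase -SqlInstance 'DEVSQL' -DatabaseName 'Sales_Dev' `
                -DestinationDataDirectory 'D:\Data' -DestinationLogDirectory 'L:\Logs' `
                -WithReplace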

I’ve been looking forward to this month’s T-SQL Tuesday for quite a while because I was so eager to see what you all were automating. One thing I did not expect to see was Rob Sewell’s post. Rob has a creative solution for how to easily get the details about SSIS failures. I’m not surprised that Rob is using PowerShell, but what is really cool is that Rob is sending the detailed messages to Teams!
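
Rob’s post has the real implementation; as a hedged sketch of the general shape, you can pull recent error messages out of SSISDB and post them to a Teams incoming webhook with Invoke-RestMethod. The webhook URL and instance name are placeholders.

    # Sketch (not Rob's script): recent SSIS error messages from SSISDB posted to Teams.
    Import-Module dbatools

    $query = 'SELECT TOP (10) message_time, message
              FROM catalog.event_messages
              WHERE message_type = 120  /* errors only */
                AND message_time > DATEADD(HOUR, -1, SYSDATETIMEOFFSET())
              ORDER BY message_time DESC;'

    $errors = Invoke-DbaQuery -SqlInstance 'SSIS01' -Database 'SSISDB' -Query $query

    if ($errors) {
        $text = ($errors | ForEach-Object { "$($_.message_time): $($_.message)" }) -join "`n`n"
        $body = @{ text = $text } | ConvertTo-Json
        Invoke-RestMethod -Method Post -Uri 'https://outlook.office.com/webhook/your-webhook-id' `
            -ContentType 'application/json' -Body $body
    }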

I’m glad that Josh Handler could also join us as a first-time #tsql2sday blogger. As data professionals, there are times when we can take our time performing a task and other times when we need to get things done quickly. Josh writes about one such task: installing SQL Server quickly. By using the SQL Server configuration ini file and some additional PowerShell scripts, Josh has been able to streamline SQL Server installations.
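
Josh’s post has the specifics; as a hedged sketch of the general pattern, an unattended install boils down to pointing setup.exe at a configuration file that the setup wizard generated once. The paths below are placeholders, and secrets such as the sa password are normally supplied on the command line rather than stored in the ini file.

    # Sketch of an unattended SQL Server install driven by a configuration file.
    $setup  = 'D:\SQLMedia\setup.exe'                 # extracted media or mounted ISO
    $config = 'C:\Install\ConfigurationFile.ini'      # saved from a previous wizard run

    & $setup /ConfigurationFile=$config /IAcceptSQLServerLicenseTerms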

Mike Donnelly shares his experiences creating a deployment pipeline for Redshift. Mike discusses some common pitfalls that we can all fall into. Sometimes we start out with what seems to be a simple process but as time goes on, we find that a little manual work can become overwhelming. Mike identified the challenges and came up with the solution of using Python to improve the deployment pipeline.

One of the more dreaded things that can happen as a data professional is getting reports that a system is slow, especially hours after the slowness occurred. John McCormack has a solution that can help you find what was running. John shows us how he automated recording query activity using sp_WhoIsActive and SQL Server Agent.
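
As a hedged sketch of the moving parts (John’s post has the actual Agent job), dbatools can install sp_WhoIsActive for you, and the capture itself is a single call with @destination_table pointed at a logging table. The table name here is made up, and the table needs to exist before the first run.

    # Sketch: install sp_WhoIsActive and capture current activity into a logging table.
    Import-Module dbatools

    Install-DbaWhoIsActive -SqlInstance 'SQL01' -Database 'master'

    # This is the statement an Agent job step would run on a schedule
    $capture = "EXEC dbo.sp_WhoIsActive @get_plans = 1, @destination_table = 'DBA.dbo.WhoIsActiveLog';"
    Invoke-DbaQuery -SqlInstance 'SQL01' -Database 'master' -Query $capture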

Many of us may use T-SQL as our most frequent programming language. However, Todd Kleinhans has found a new interest in Python as it pertains to Todd’s passion for UE4, Epic Games’ Unreal Engine. Knowing the power of Python, Todd looks into the usefulness of Python with SSIS for ETL and other projects.

When working with PCI data, one of the most frequent and important tasks involves finding where credit card data may be stored in the database. In this month’s blog, Jason Brimhall shows how to automate detecting credit card data in your database. With just some T-SQL and a SQL Server Agent job, you will be well on your way.

Jess Pomfret discusses a frequent task that many database professionals perform to ensure the health of their database environments. Jess shows how she has automated daily health checks using dbachecks. With dbachecks 2.0, you are now able to easily save your results to a database. It looks like I’m going to need to get the newest version of dbachecks!
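
As a very rough sketch of what that daily run might look like (check Jess’s post and the dbachecks documentation for the real syntax; the result-saving cmdlets below are the ones I understand dbachecks 2.0 added, and the check name, instances, and database are placeholders):

    # Sketch of a daily dbachecks run with results written to a central database.
    Import-Module dbachecks

    Invoke-DbcCheck -SqlInstance 'SQL01', 'SQL02' -Check LastBackup -PassThru |
        Convert-DbcResult -Label 'MorningChecks' |
        Write-DbcTable -SqlInstance 'DBA01' -Database 'dbachecks'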

This was my first time hosting T-SQL Tuesday, and I want to thank you all for joining me. I hope you get a chance to check out all the blogs on this recap post and put some of these ideas into practice.

T-SQL Tuesday #130 – Automate Your Stress Away

T-SQL Tuesday Logo - A blue database with a calendar showing days of the week wrapped artfully around the database.

Life as a data professional is stressful. This year is even more stressful. We have so many responsibilities and so many demands coming at us every day. I’ve found over the years that I love the stress, but I also want to make my life and the lives of those around me easier, calmer, more peaceful. I can’t change everything about my job or what is expected of me. After a particularly stressful summer many years ago, I wanted to figure out what I could change in my day-to-day tasks. How could I make my life easier?

At the time, the largest hurdle I had was around deployments. Our deployments consisted of a collection of SQL scripts gathered from individual user stories. Each script was executed separately, as we didn’t have a good process for what to do if any particular script failed. In addition, we created rollbacks for each script that was deployed. The process worked for us for quite some time, until one day we deployed our first new application in years.

Eventually, we moved a handful of our databases to source control. That part was easy, as you can use Visual Studio and SQL Server Data Tools (SSDT) to import the schema of an existing database. The next steps took quite a bit of communication between the teams. A few missteps later, we had 2 of our 10 or so databases deploying through a Continuous Integration/Continuous Delivery (CI/CD) pipeline. We still deploy once every two weeks, but our deployments are generally quicker and less tedious.
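
For a flavor of what one of those pipeline steps runs (a hedged sketch, not our actual pipeline definition): SSDT builds the project into a dacpac, and SqlPackage publishes it. Every path and name below is a placeholder.

    # Sketch of a dacpac deployment step as a pipeline might run it.
    $sqlPackage = 'C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe'

    & $sqlPackage /Action:Publish `
        /SourceFile:'C:\Build\Output\SalesDb.dacpac' `
        /TargetServerName:'SQL01' `
        /TargetDatabaseName:'Sales' `
        /p:BlockOnPossibleDataLoss=True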

The time savings are nice. We have about 24-26 deployments a year, and we easily save at least an hour on average per deployment. That’s a full day a year! But the best part for me is the next day. I still check my email as soon as I wake up to ensure there aren’t any issues reported, but even if there are issues, they are usually quickly resolved and we go on about our day.

My invitation to you is this: what have you automated to make your life easier? This can be anything from creating a SQL Server Agent job to automate a daily report to using dbatools to manage your servers. I’m curious what challenges you’ve found at your job and what you’ve done to make things better. If you haven’t had a chance to automate some part of your job, what would you like to automate and what are your hurdles? If you’re interested in some help or advice, let us know. I love #SqlFamily, and I’d love to see what we can do to help out.

The Rules

  1. Your post should be published any time on Tuesday, September 8th, 2020
  2. Include the T-SQL Tuesday logo in your blog post
  3. Link back to this blog post or pingback if you’re that cool
  4. Tweet about your post using the #tsql2sday hashtag
  5. Share what you have automated or would like to automate

Missing Reports Folder in SSRS Project

This past week, I made the goal of automating the deployment of our first SSRS report at work. I created the report, and after adding the report to source control, my Solution Explorer looked like the image below.

A screenshot of an SSRS Report in Solution Explorer. The Solution Explorer view is set to Solution.

I added my solution to source control and synced the project up to GitHub. However, when my colleague tried to clone the repo and open the report project, they saw a view like the image below.

A screenshot of an SSRS Report in Solution Explorer. The Solution Explorer view is set to Folders.

However, this method of interacting with SSRS reports is not what we are used to seeing. I tried several internet searches, but I was unable to find why the Reports folder was missing from the solution. I finally started clicking around and happened to find an icon that toggles between Solutions and Folders.

A screenshot of an SSRS Report in Solution Explorer. The icons for Solutions and Folders are outlined in a red box.

Clicking on the Solutions and Folders icon returned the Reports folder to the Solution Explorer window.

A screenshot of an SSRS Report in Solution Explorer. The Solution Explorer view is set to Solution.

If you find yourself looking at an SSRS report project in Visual Studio but you don’t see the Shared Data Sources, Shared Data Sets, or Reports folders, try selecting Solutions and Folders. You might be looking at the project in the Folders view instead of the Solution view.

Stuck Solving a Problem

I’ve always loved solving problems. As a kid, I would get books of Logic Puzzles to solve. My favorite video game was Minesweeper. In the beginning of my career, I would get stuck on a problem one night and wake up in the morning with a possible solution. Things seemed to be easy. I had been warned by friends that this wouldn’t always be the case, but I still wasn’t ready for when everything changed.

A couple of years ago, I started presenting on tSQLt (https://tsqlt.org/) for database unit testing. It came from a place of need. My team needed a way to test their code, and I wanted to help them. I was also starting to embrace automation for all things database. I decided that the best way to get buy-in for unit testing was to automate the process. The next step was to figure out how.

That started me on a multi-year journey to solve this riddle. What I ignored at the time was that not only was I trying to solve this riddle, but I also had all sorts of things changing in my life at the same time. I had some health issues pop up, and some of them caused a deep level of self-doubt in my ability to think through problems. I also spent the better part of a year writing a book. Unfortunately, that led me to focus only on how much time had passed without a solution.

As I started 2020, I was growing increasingly frustrated that I could not solve this problem. I had even presented on tSQLt at PASS Summit and most likely heard the answer from Sebastian Meine (w|t) during the Q&A portion of my session. But it still wasn’t clicking for me. As the frustration grew, the imposter syndrome started to spike. Reaching out on Twitter, I got some advice from SQLGrrl (t|b) reminding me to work on solving one small step at a time.

Within two weeks of implementing this advice, I had the solution I needed. I still remained frustrated at how long it took me to solve the problem. I ended up sharing that frustration with another IT professional and was reassured that this is fairly common. Being on the other side of this situation, I wanted to share with others that there’s hope if you’re in the same place. And I hope I find this article myself if I end up in the same situation again.

T-SQL Tuesday #126 – Do What You Can


It’s that time again for T-SQL Tuesday! This month’s host is Glenn Berry (b|t). He’s organized a Folding@Home (FAH)  team for SQL Family to help with biomedical research. We, as a world, are in the midst of a SARS-CoV-2 (also referred to as COVID-19) pandemic. For many of us, this pandemic has changed many aspects of our day to day life. Glenn’s invite asks us what we are doing as a response to COVID-19.

These are interesting times indeed, and many of us are trying to do what we can to help. I’m really thankful for those that are able to help during this time. My sister-in-law has been busy sewing masks for her family and community. A friend of mine printed a face shield for a nurse I know. In addition to Glenn Berry using FAH, Tim Radney (b|t) has also been 3D printing ear savers for healthcare workers. This is all amazing work!

I wish I could say I had been as altruistic or as helpful. I’ve been focusing on ensuring that the loved ones in my house are relatively unaffected by quarantine life. There’s been a number of changes in our house. The largest of changes is that I have been working from home for the past five weeks. While we did start on a project to add on to our house, that project is still underway.

Working from home full-time is a change in and of itself, but doing so while also having construction underway in the house has been an additional challenge. There are also two high school students at home going to school. Our house is not large, and we’re doing our best to make do. I’ll be honest, I wish I could say I was doing more to help those outside of my immediate house, but that wouldn’t be accurate.

My focus has been on keeping us comfortable while not taking others for granted. We have our face masks from my sister-in-law. And a friend of ours 3D printed a face shield for a nurse we know. The nurse is a good friend and has been performing COVID-19 testing at one of the drive-thru locations in our area. For those at home who are working to keep your family and loved ones fed and cared for, you are doing enough. I know I am trying to do enough. These are not usual times, and if we can accept that, then we can accept our own best efforts.

T-SQL Tuesday #123 – Improve Focus through Speech


It’s the second Tuesday of the month, and that means it’s time for T-SQL Tuesday. This month Jess Pomfret (b|t) asked us what we use as life hacks to make our lives easier. There are many different ways that I look to streamline my day-to-day tasks. Some of the methods I use include automating repetitive tasks or learning how to break up complex tasks into smaller components. However, over the past year I undertook a significant project outside of my day job. This project was something I decided to do after hours.

The objective of this opportunity was to write a book. When I initially started writing the book I began by typing everything I wanted to say. I quickly discovered that I was thinking of what I wanted to say faster than I could get the words written on the computer. I also remembered that over the past couple of years I have spent quite a bit of time speaking at SQL Saturdays and PASS Summit. When I’m speaking I often let the flow of what I’m trying to say come naturally. I decided to try that method while writing the book.

Once I purchased the dictation software and a wireless headset, I was able to more easily express the information that I wanted to share. I will say I have found that dictating still works differently than speaking. I often think more thoroughly through what I want to say and pause more frequently than I would if I were presenting in front of a group. I have found dictating what I want to write does reduce my frustration with trying to get the words out of my head and into text.

I’m still developing a method for how I would like to dictate text for blog posts. One of the reasons I consider this a life hack is that dictating my blog posts helps me get into the right mindset, where I am focused on the task at hand. If you’ve tried to get into technical writing or writing a blog post, I would recommend looking at alternative methods to accomplish that goal.

T-SQL Tuesday #122 – Am I Doing this Right?


This month’s T-SQL Tuesday is hosted by Jon Shaulis (b|t). The topic for the month asks us to consider whether we have experienced imposter syndrome and, if so, how we worked through those feelings. Imposter syndrome can be problematic; I’ve found this level of self-doubt to be miserable. In the past I have struggled with imposter syndrome to the point that it affected my confidence in my ability to do a good job.

Thankfully, #SqlFamily has been there for me. I’m not entirely certain if I would have given up without that support, but I do know I am thankful every day for those individuals that have helped me along the way. Rob Volk (b|t) first referred me to a presentation by a woman in IT. Unfortunately, I am unable to remember her name, but the name of the presentation was something like “Why I suck”.

While I’d love to say that presentation solved all of my problems, it didn’t right away. What it did teach me is that it was impossible for me to know everything, to have all the answers. The lesson was that there is too much information for any of us to know all of it. Because we are surrounded by the knowledge of what we know and don’t know, we are incapable of accurately determining how much knowledge we actually have as compared to the whole.

Either way, that was a beginning. The lesson didn’t fully sink in until I worked for Rie Irish (b|t) a while ago. I was on a team that believed in me and encouraged me, even when I doubted myself. With each new challenge I may have doubted myself, but each time I completed the task, I chipped away at the fear and self-doubt.

Over time I got to the point where I learned I could believe in myself, and the rest of my self-doubt mattered less and less. I hope you’ve found this blog post helpful. If you have questions or you are struggling with imposter syndrome, reach out to me or to your data professional peers. You may find we have a very different opinion of your skills than you have of yourself.

T-SQL Tuesday #110 – Automate All the Things

Garry Bargsley (b|t) is hosting this month’s T-SQL Tuesday, a monthly blog party for the SQL Server community. Garry kicks the year off right by asking the following questions:

Kicking off the T-SQL Tuesday season for 2019, I would like to ask, what does “Automate All the Things” mean to you?


So technically there are two tasks for this month:

  • What do you want to automate, or what automation are you proud of completing?
  • What is your go-to technology for automation?


“Automate All the Things” is a methodology, a change in perspective. I recently had a friend ask me which came first: consistency or automation. It’s a valid question. I believe automation is what is accomplished as a result of consistency. That consistency is built by defining processes and having the discipline to stick to those processes. Consistency not only makes it easier to standardize processes (and train new employees), but it also means that I’m less likely to make mistakes.

One of my first automation projects was to implement continuous integration for database deployments. However, I did not realize how automation was going to fundamentally change how I see software development. When picking my tools I considered what tools were already used for our application deployment. I ended up using Visual Studio, SSDT, TeamCity, Octopus Deploy, and PowerShell to create, build, and deploy database projects.

Once I realized the power of automation, I couldn’t get enough. My next goal is to get more familiar with PowerShell and dbatools to not only automate creating a distributed Availability Group but also automate building and configuring the VMs necessary to create my home lab.

T-SQL Tuesday #108 – Learning Tech Beyond SQL Server

Malathi Mahadevan (b|t) is hosting this month’s T-SQL Tuesday, a monthly blog party for the SQL Server community.

Malathi has asked us to:

Pick one thing you want to learn that is not SQL Server. Write down ways and means to learn it and add it as another skill to your resume. If you are already learning it or know it – explain how you got there and how it has helped you. Your experience may help many others looking for guidance on this.

Earlier this year, I decided I wanted to create a home lab. I originally planned to build a domain controller and other virtual machines for the home lab. Then I realized that SQL Server on containers might be able to help me solve several issues regarding licensing and resources. At that time, I realized I could also use containers to create a full build pipeline, including TeamCity and Octopus Deploy.

At first I tried to start in sequential order and create a TeamCity container. I had previously used TeamCity on my desktop, so I figured it would be easy. I quickly realized I had no idea what to do to even get the TeamCity image running.

I was reminded that years ago I decided I wanted to be a Database Administrator. At the time I learned that I had no idea how to learn. I had to figure out how I learned. I ultimately came to realize that my primary learning method is auditory.

I also began to realize that I see all knowledge as interconnected building blocks. Some building blocks already have a foundation or points where I can join them with knowledge I already have. In other cases, I have no or little previous knowledge. In these cases, it takes me significantly longer to learn that topic.

Using this information, I started watching videos on pass.org. My next step was to install Docker for Windows. Once that was done, I started spinning up some images for SQL Server. Now that I’ve gotten comfortable with containers, I’m starting to look into using Kubernetes and potentially building an Availability Group on containers.
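
For anyone curious what that first container looked like, here is a minimal sketch using the official SQL Server image (the tag and password are placeholders; use a real secret in practice).

    # Sketch: run SQL Server in a Linux container via Docker.
    docker pull mcr.microsoft.com/mssql/server:2017-latest

    docker run --name sql2017 `
        -e 'ACCEPT_EULA=Y' `
        -e 'SA_PASSWORD=YourStrong!Passw0rd' `
        -p 1433:1433 `
        -d mcr.microsoft.com/mssql/server:2017-latest

    # Then connect from the host, for example with dbatools:
    # Connect-DbaInstance -SqlInstance 'localhost,1433' -SqlCredential (Get-Credential sa)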