My First Data Warehouse: Intro

DEAR READER: I want to make you aware that this blog series is intended to share my journey, mistakes and all, to creating my first data warehouse.

I have been wanting to create a data warehouse at my company for several years. I have made progress here and there: some table design, some SSIS packages. But I was never able to get a data warehouse set up in Production, because I had failed to complete the most important step: I didn’t have an internal advocate. This time was different! We had hired someone new at the company who was used to performing data analytics and creating visualizations.

We needed to analyze both data that did and did not exist in our systems. When I first started this project, the goal was to get data imported in the shortest amount of time possible. Due to the time constraint, I was asked to import the data without any transformations. For the readers that have been on a journey like this before, I think you’ll know where this story is headed.

Even though the data was in the database, it wasn’t easy to analyze. It was at this point that I realized I had made a couple of beginner mistakes. There were some key design elements that needed to be completely redone. Over the next couple of weeks, we’ll go over what I learned and how I fixed things.

Multiple Database Projects in Azure Data Studio

In many companies, you’re often working with more than one database on the same server. When I use Visual Studio, I have all the databases that exist on the same server in the same solution. You can do the same thing when you’re working with Azure Data Studio.

In this week’s YouTube series, I show you one trick that will let you have multiple database projects in the same workspace. This becomes especially important if you have database code that references objects in a separate database.

Deploy Database Projects Manually

Earlier in this series, I showed you how to create a Database Project using Azure Data Studio (ADS) and how to add a Database Project to GitHub. Getting a database project set up and into source control is often just the beginning. I’ve found one of the hardest things to do is to keep your database in source control. This is often because it’s hard to keep using source control when you don’t have a way to deploy your changes.

This week, we’ll talk about one of the easier ways to deploy your database changes. One of the benefits of database projects is that they can generate data-tier applications (DAC). A data-tier application can be bundled into what is called a DACPAC, a package containing everything needed to deploy your database.

DACPACs are definitely not the only way to deploy your database. However, they are one of the main ways to deploy database projects. If you’re used to running a series of script files for your deployments, this is going to be a bit different. When deploying database projects, the database project becomes the source of truth. Generally speaking, whatever is in the database project will be deployed to the SQL Server instance.
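If you want a feel for what a manual deployment looks like from the command line, here is a minimal sketch using SqlPackage, the utility that publishes DACPACs. The paths, server name, and database name are placeholders, and the SqlPackage location varies by version and installation.

```powershell
# Minimal sketch: publish a DACPAC to a target server with SqlPackage.
# Paths, server, and database names are placeholders.
& 'C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe' `
    /Action:Publish `
    /SourceFile:"C:\Projects\MyDatabase\bin\Release\MyDatabase.dacpac" `
    /TargetServerName:"localhost" `
    /TargetDatabaseName:"MyDatabase"
```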

If you’d like to figure out how you can manually deploy your database projects, you can check out the video below. If you’re looking for a more automated method to deploy database code, check back over the next couple of weeks, where I’ll start showing you how you can use Azure DevOps to build and deploy your database code.

T-SQL Tuesday #133 – Filling in the Gaps

[Image: T-SQL Tuesday logo. A blue database with a calendar showing days of the week wrapped artfully around the database.]

I’m so glad Lisa Bohm (b|t) is hosting this month’s T-SQL Tuesday. I’m also thankful to Steve Jones for hosting T-SQL Tuesday and helping hosts share their ideas with the SQL community. I also want to thank Adam Machanic for starting T-SQL Tuesday as a blog party. I know I often have a difficult time coming up with topics, and it helps to be provided a topic.

Let’s get back to this month. Lisa says, “I’d like those of you who have presented, or written a presentation, to share something technical THAT DID NOT RELATE to the topic of the presentation.” I’m glad to respond to this month’s blog topic. The main reason is that, like Lisa, I want to encourage people to present and share their experiences. One of the best things that ever happened to me came as a result of presenting, and it was something I needed.

A couple of years ago, the company I was at was planning a data center migration using distributed Availability Groups. My boss at the time, Rie Irish (b|t), encouraged me to put a SQL Saturday presentation together. At the time, one of my largest concerns as a DBA was that I was not familiar with Server Manager. I often felt like I didn’t belong as a DBA because I came from what I perceived as a non-conventional route.

For those of you data professionals that came to this profession from outside of IT, I understand you. I have a B.B.A. in Accounting, and when I started my career, I was in business operations. I was introduced to MS Access, and I was mesmerized by databases. I eventually found that if I wanted real database action, I needed to work with Microsoft SQL Server.

Fast forward to this presentation on distributed Availability Groups. Prior to this presentation, I had barely used PowerShell, and I had no idea how to create a VM. But thanks to the encouragement I received, I figured out how to build multiple VMs using PowerShell. I made Domain Controllers and all that jazz. Honestly, I think all the things I had to do to create a distributed Availability Group really helped me build confidence I didn’t realize I was missing.

Now I accept that I don’t need to know everything about infrastructure and networking. I still want to learn. But I am comfortable with Server Manager, building VMs, and I have a passing understanding of Domain Controllers. But really, the best thing I learned is that I don’t have to know all the answers. What I love is automating database deployments and understanding how applications interact with databases. And that’s ok!

T-SQL Tuesday #130 – Recap: Automate Your Stress Away

[Image: T-SQL Tuesday logo. A blue database with a calendar showing days of the week wrapped artfully around the database.]

This month, I hosted #TSql2sDay for September 2020. If you’re unfamiliar with T-SQL Tuesday, it’s a blogging party. The idea is that a host comes up with a topic and encourages other members of #SqlFamily to join in and write a blog post on the topic. T-SQL Tuesday was started by Adam Machanic, and Steve Jones has carried on this fun tradition!

I’m really grateful for all the bloggers that took part this month, especially since automation has been a topic discussed before. I find it hard to get much of my day-to-day work automated, so I was really looking forward to these posts so that I could learn new tasks I could automate myself. With that said, let’s see all the wonderful ideas people contributed this month. And if you’re like me, you’re going to want to put some of this automation in place as soon as possible.

I’m right there with Greg Dodd regarding automation. Greg blogs about how automation can be a great way to minimize mistakes. I also like how Greg points out that automation can mean we can all take a vacation without worrying that someone will call us with an emergency. Greg’s done some pretty interesting stuff, including automating the creation of test servers using Chef. Greg keeps things realistic by stating that automation is often not a time saver at first, but over time you can really see the benefit.

Eitan Blumin is very familiar with automation, and Eitan has blogged about the topic extensively. As such, Eitan approached this month’s topic with a different perspective. Instead of discussing a project using automation, Eitan walks us through what steps we can use to start automating tasks around us.

This was Magda’s first blog post ever (yay, Magda!), and I’m so glad that Magda joined in the fun to write about how we can all use automation. Magda might describe themselves as a non-technical worker, but I think Magda’s approach to automation is the same as the rest of us. Magda found that some manual tasks in Excel could be improved with automation. I’m also very glad that Magda reminds us all to celebrate the victories… maybe even with a cup of tea enjoyed in the time you’ve saved automating your work.

Kevin Chant lets us know what benefits he’s found with automation, which include making our lives easier. Kevin shares information about the first steps you can take in creating your first Azure DevOps pipelines. I really like how Kevin reassures us all not to get stuck in perfectionism paralysis. Kevin lets us all know that sometimes it’s OK to have an idea and implement a proof of concept quickly.

We each have a preference when using SQL Server Management Studio (SSMS). Some of us prefer to write the code outright, and others prefer the GUI. Rob Farley reminds us of one of the options available in SSMS: you can have the GUI generate a T-SQL script for you. You also have this option in Azure Data Studio (ADS) for many (but not all) actions.

When things go wrong, one of the more tedious tasks we must undertake is reviewing error logs. Aaron Bertrand shares his solution for aggregating various error logs using PowerShell. I think the idea of combining various error logs can help us all. There is even a copy of the current version of the code available in the blog post, as well as a link to the source control so you can keep up with any changes as Aaron improves the code over time.
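Aaron’s post has his actual solution; purely to illustrate the shape of the idea, here is a minimal dbatools-flavored sketch that pulls recent error log entries from several instances into one place. The instance names and output path are placeholders, and this is not Aaron’s code.

```powershell
# Illustration only: gather the last day of error log entries from
# multiple instances into a single file. Names are placeholders.
Import-Module dbatools
Get-DbaErrorLog -SqlInstance 'Server1', 'Server2' |
    Where-Object { $_.LogDate -gt (Get-Date).AddDays(-1) } |
    Export-Csv -Path 'C:\Temp\ErrorLogs.csv' -NoTypeInformation
```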

Another common issue in our day-to-day life is keeping things as they are because they’ve always been that way. I know I am all too guilty of this at times. In this month’s post, Mikey Bronowski gives examples of manual tasks that need some rework. Mikey covers how to get rid of paperwork and standardize SQL Server installations. Mikey found an outdated process due to old technology and after some internal conversations, Mikey updated the process to use newer technology and saved time.

One of the common tasks we must deal with is patching our SQL Servers. It is one of those tasks that is monotonous but also requires additional analysis. Volker Bachman shares how to use PowerShell to control when and what updates are deployed to your servers. I love that Volker has also included the functionality to automatically send an email if the server is restarted. That can really save time troubleshooting a server restart the next business day.
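Volker’s post has the details of his approach; as a loose sketch of the general pattern, here is what update-plus-notification might look like with the community PSWindowsUpdate module. The module choice, addresses, and SMTP server are my assumptions for illustration, not necessarily Volker’s tooling.

```powershell
# Loose sketch: install pending updates, then email a heads-up if a
# reboot is required. Module, addresses, and SMTP server are placeholders.
Import-Module PSWindowsUpdate
$results = Install-WindowsUpdate -AcceptAll -IgnoreReboot
if ($results | Where-Object { $_.RebootRequired }) {
    Send-MailMessage -To 'dba@example.com' -From 'patching@example.com' `
        -Subject 'Restart pending after Windows updates' `
        -SmtpServer 'smtp.example.com'
}
```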

Richard Swinbank found that sometimes you may want to generate database objects based on tables already in the database. Using metadata and dynamic SQL, Richard shows us how to generate scripts that will create multiple database objects. I like how Richard took a problem spanning multiple different systems and found a way to automate the creation of consistent and reliable code.

One of the areas I haven’t spent much time with in SQL Server is trace flags. However, I completely understand wanting to ensure that trace flags are applied consistently across SQL Server instances. Taiob Ali shows us how we can keep trace flags consistently applied. With startup procedures and configuration tables, we can decrease our stress and know that our SQL Servers are configured as expected.
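Taiob’s method uses startup procedures and configuration tables; as a simpler sketch of the same goal, dbatools can compare and enable trace flags across instances. The instance names and trace flag are placeholders, and note that this enables the flag only until the next restart.

```powershell
# Sketch: ensure a trace flag is enabled on every instance in a list.
# Instance names and the trace flag number are placeholders.
Import-Module dbatools
foreach ($instance in 'Server1', 'Server2') {
    if (-not (Get-DbaTraceFlag -SqlInstance $instance -TraceFlag 3226)) {
        Enable-DbaTraceFlag -SqlInstance $instance -TraceFlag 3226
    }
}
```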

Every environment is different, and each has its own challenges. For Deepthi Goguri, those challenges include managing hundreds of servers with multiple versions of SQL Server. Deepthi has found that PowerShell is the right tool for the job. With PowerShell, Deepthi has saved hours making job owners, SQL Server editions, and recovery models consistent. Thank you, Deepthi, for writing your first blog post for #tsql2sday!
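As a taste of what that kind of standardization can look like with dbatools, here is a minimal sketch; the server names and owner login are placeholders, and this is not Deepthi’s script.

```powershell
# Sketch: standardize Agent job owners and recovery models across servers.
# Server names and the target login are placeholders.
Import-Module dbatools
$servers = 'Server1', 'Server2', 'Server3'
Set-DbaAgentJobOwner -SqlInstance $servers -Login 'sa'
Set-DbaDbRecoveryModel -SqlInstance $servers -RecoveryModel Full -AllDatabases -Confirm:$false
```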

Another first-time T-SQL Tuesday blogger is Travis Page. Travis is also using PowerShell to help automate stress away. I’m no stranger to database source control, and I want all the things in source control. Travis is tackling one aspect of source control that can be quite tricky. Using dbatools, Travis has found a way to put SQL Server Agent jobs into source control.
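Travis’s post covers the full workflow; the heart of the idea can be sketched in a couple of lines of dbatools. The instance name and output path are placeholders.

```powershell
# Sketch: script out all SQL Server Agent jobs to files that can be
# committed to a Git repository. Names and paths are placeholders.
Import-Module dbatools
Get-DbaAgentJob -SqlInstance 'Server1' |
    Export-DbaScript -Path 'C:\Repos\AgentJobs'
```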

I’ve recently started using Azure this year, and Drew Skwiers-Koballa’s blog post addresses how to save money in Azure. One common pitfall is creating resources and forgetting to clean them up. Drew shows us how to use PowerShell Workflow runbooks to automatically delete resources. I also like that he not only showed us how to deallocate resources but also introduced me to Azure Automation.
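Drew’s post walks through the real runbook; as a bare-bones sketch of the idea, a PowerShell Workflow runbook can find and remove resource groups by tag. The tag convention is my assumption for illustration, and Azure authentication is omitted.

```powershell
# Bare-bones sketch: delete resource groups tagged as temporary.
# The tag name/value is an assumed convention; authentication is omitted.
workflow Remove-TempResourceGroups {
    $groups = Get-AzResourceGroup -Tag @{ Environment = 'Temporary' }
    foreach ($group in $groups) {
        Remove-AzResourceGroup -Name $group.ResourceGroupName -Force
    }
}
```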

One of the things Database Administrators are tasked with is ensuring that the data for our company is protected. In this month’s T-SQL Tuesday, Gregg Barsley shows us how to use dbatools and PowerShell to generate an audit file of user permissions in SQL Server. This is one of those topics that I never knew I needed; now that I’ve seen it, I know I have to have it.
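For a flavor of what that audit can look like, here is a minimal dbatools sketch that dumps user permissions to a file; the instance name and path are placeholders, and Gregg’s post has the full treatment.

```powershell
# Sketch: export effective user permissions for later auditing.
# Instance name and output path are placeholders.
Import-Module dbatools
Get-DbaUserPermission -SqlInstance 'Server1' |
    Export-Csv -Path 'C:\Audits\Server1-Permissions.csv' -NoTypeInformation
```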

Among many other things, knowing that our SQL Server instances are properly configured and running at peak efficiency can really help reduce stress. Glenn Berry shows various options for how we can automate our health checks. You have several ways to run Glenn’s diagnostic queries, including PowerShell and Jupyter Notebooks. As a bonus, Glenn also shared how you can set up SQL Server Agent Alerts for various errors.
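If PowerShell is your preferred route, dbatools can run and export Glenn’s diagnostic queries in one pipeline; this is a minimal sketch with placeholder names.

```powershell
# Sketch: run Glenn Berry's diagnostic queries and save the results.
# Instance name and output path are placeholders.
Import-Module dbatools
Invoke-DbaDiagnosticQuery -SqlInstance 'Server1' |
    Export-DbaDiagnosticQuery -Path 'C:\HealthChecks'
```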

Steve Jones mentions that he uses automation to do tedious work so that he can focus on problem solving, and Steve is busy solving various problems. He has automated projects ranging from collecting data for SQL Saturday events to deploying development databases. This kind of variety shows the power available with automation.

We database people love our backups. And aside from testing our backups, many of us have found that developers often ask for fresh data from Production. Hugo Kornelis found himself in that exact situation. Using T-SQL, Hugo was able to create a script to back up a database from one server and restore it to a location and name of his choosing.
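Hugo built his solution in T-SQL; if you lean toward PowerShell, the same back-up-here, restore-there idea can be sketched with dbatools. The server, database, and share names are placeholders, and this is not Hugo’s script.

```powershell
# Sketch: back up a production database and restore it to a development
# server under a new name. All names and paths are placeholders.
Import-Module dbatools
Backup-DbaDatabase -SqlInstance 'ProdServer' -Database 'Sales' -Path '\\FileShare\Backups' |
    Restore-DbaDatabase -SqlInstance 'DevServer' -DatabaseName 'Sales_Dev' `
        -ReplaceDbNameInFile -TrustDbBackupHistory
```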

I’ve been looking forward to this month’s T-SQL Tuesday for quite a while because I was so eager to see what you all were automating. One thing I did not expect to see was Rob Sewell’s post. Rob has a creative solution for how to easily get the details about SSIS failures. I’m not surprised that Rob is using PowerShell, but what is really cool is that Rob is sending the detailed messages to Teams!
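Rob’s implementation is worth reading in full; the Teams half of the idea boils down to posting JSON to an incoming webhook. The webhook URL and message below are placeholders.

```powershell
# Sketch: post a failure notice to a Teams channel via an incoming webhook.
# The webhook URL and message text are placeholders.
$body = @{ text = 'SSIS package LoadSales failed; details are in the logs.' } | ConvertTo-Json
Invoke-RestMethod -Uri 'https://example.webhook.office.com/...' `
    -Method Post -ContentType 'application/json' -Body $body
```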

I’m glad that Josh Handler could also join us as a first-time #tsql2sday blogger. As a data professional, there are times when we can take our time performing a task and other times when we need to get things done quickly. Josh writes about one such task: installing SQL Server fast. By using the SQL Server configuration ini file and some additional PowerShell scripts, Josh has been able to streamline SQL Server installations.
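The core of that approach is SQL Server’s unattended setup, which takes a configuration file saved from a previous run. Below is a minimal sketch; the paths are placeholders.

```powershell
# Sketch: unattended SQL Server install driven by a configuration file.
# Paths are placeholders; the ini can be saved from an interactive setup run.
& 'D:\setup.exe' /Q /IAcceptSQLServerLicenseTerms `
    /ConfigurationFile=C:\Install\ConfigurationFile.ini
```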

Mike Donnelly shares his experiences creating a deployment pipeline for Redshift. Mike discusses some common pitfalls that we can all fall into. Sometimes we start out with what seems to be a simple process, but as time goes on, we find that a little manual work can become overwhelming. Mike identified the challenges and came up with the solution of using Python to improve the deployment pipeline.

One of the more dreaded things that can happen as a data professional is getting reports that a system is slow, especially hours after the slowness occurred. John McCormack has a solution that can help you find what was running. John shows us how he automated recording query activity using sp_WhoIsActive and SQL Server Agent.
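John’s post covers the Agent job setup; the core trick is sp_WhoIsActive’s ability to log to a table. Here is a minimal sketch using dbatools to bootstrap it. The names are placeholders, and the destination table must already exist (sp_WhoIsActive’s @return_schema option can generate it).

```powershell
# Sketch: install sp_WhoIsActive, then capture current activity to a table.
# An Agent job would run the capture step on a schedule. Names are placeholders.
Import-Module dbatools
Install-DbaWhoIsActive -SqlInstance 'Server1' -Database 'master'
Invoke-DbaQuery -SqlInstance 'Server1' -Database 'DBA' -Query "
    EXEC master.dbo.sp_WhoIsActive @destination_table = 'dbo.WhoIsActiveLog';"
```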

Many of us use T-SQL as our most frequent programming language. However, Todd Kleinhans has found a new interest in Python as it pertains to Todd’s passion for UE4, Epic Games’ Unreal Engine. Knowing the power of Python, Todd looks into the usefulness of Python with SSIS for ETL and other projects.

When working with PCI data, one of the most frequent and important tasks involves finding where credit card data may be stored in the database. In this month’s blog, Jason Brimhall shows how to automate detecting credit card data in your database. With just some T-SQL and a SQL Server Agent job, you will be well on your way.

Jess Pomfret discusses a frequent task that many database professionals perform to ensure the health of their database environments. Jess shows how she has automated daily health checks using dbachecks. With dbachecks 2.0, you are now able to easily save your results to a database. It looks like I’m going to need to get the newest version of dbachecks!
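If you’re curious what that workflow can look like, here is a minimal sketch based on the dbachecks 2.0 commands; the instance names, checks, and results database are placeholders.

```powershell
# Sketch: run a couple of checks and write the results to a database.
# Instances, check names, and the results database are placeholders.
Import-Module dbachecks
Invoke-DbcCheck -SqlInstance 'Server1' -Check LastFullBackup, DatabaseStatus -PassThru |
    Convert-DbcResult -Label 'DailyChecks' |
    Write-DbcTable -SqlInstance 'Server2' -Database 'ValidationResults'
```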

This was my first time hosting T-SQL Tuesday, and I want to thank you all for joining me. I hope you get a chance to check out all the blogs on this recap post and put some of these ideas into practice.

T-SQL Tuesday #130 – Automate Your Stress Away

[Image: T-SQL Tuesday logo. A blue database with a calendar showing days of the week wrapped artfully around the database.]

Life as a data professional is stressful. This year is even more stressful. We have so many responsibilities and so many demands coming at us every day. I’ve found over the years that I love the stress, but I also want to make my life and the lives of those around me easier, calmer, more peaceful. I can’t change everything about my job or what is expected of me. After a particularly stressful summer many years ago, I wanted to figure out what I could change in my day-to-day tasks. How could I make my life easier?

At the time, the largest hurdle I had was around deployments. Our deployments consisted of a collection of SQL scripts collected from individual user stories. Each script was executed separately as we didn’t have a good process for what to do if any particular script did fail. In addition, we created rollbacks for each script that was deployed. The process worked for us for quite some time until one day we deployed our first new application in years.

Eventually, we moved a handful of our databases to source control. That part was easy, as you can use Visual Studio and SQL Server Data Tools (SSDT) to import the schema of an existing database. The next steps took quite a bit of communication between the teams. A few missteps later, we had 2 of our 10 or so databases deploying through a Continuous Integration/Continuous Delivery (CI/CD) pipeline. We still deploy once every two weeks, but our deployments are generally quicker and less tedious.

The time savings are nice. We have about 24-26 deployments a year, and we easily save at least an hour on average per deployment. That’s a full day a year! But the best part for me is the next day. I still check my email as soon as I wake up to ensure there aren’t any issues reported, but even if there are issues, they are usually quickly resolved and we go on about our day.

My invitation to you: I want to know what you have automated to make your life easier. This can be anything from creating a SQL Server Agent job to automate running a daily report, to using dbatools to manage your servers. I’m curious what challenges you’ve found at your job and what you’ve done to make things better. If you haven’t had a chance to automate some part of your job, what would you like to automate, and what are your hurdles? If you’re interested in some help or advice, let us know. I love #SqlFamily, and I’d love to see what we can do to help out.

The Rules

  1. Your post should be published on Tuesday, September 8th, 2020
  2. Include the T-SQL Tuesday logo in your blog post
  3. Link back to this blog post or pingback if you’re that cool
  4. Tweet about your post using the #tsql2sday hashtag
  5. Share what you have automated or would like to automate

Missing Reports Folder in SSRS Project

This past week, I set a goal of automating the deployment of our first SSRS report at work. I created the report, and after adding the report to source control, my Solution Explorer looked like the image below.

[Screenshot: an SSRS report in Solution Explorer, with the view set to Solution.]

I added my solution to source control and synced the project up to GitHub. However, when my colleague tried to clone the repo and open the Report Project, they saw an image like the one below.

[Screenshot: an SSRS report in Solution Explorer, with the view set to Folders.]

However, this method of interacting with SSRS reports is not what we are used to seeing. I tried several internet searches, but I was unable to find out why the Reports folder was missing from the solution. I finally started clicking around and happened to find an icon that displays Solutions and Folders.

[Screenshot: an SSRS report in Solution Explorer, with the Solutions and Folders icon outlined in a red box.]

Clicking on the Solutions and Folders icon brought the Reports folder back into the Solution Explorer window.

[Screenshot: an SSRS report in Solution Explorer, with the view set to Solution.]

If you find yourself looking at an SSRS report in Visual Studio but you don’t see the Shared Data Sources, Shared Data Sets, or Reports folders, try selecting Solutions and Folders. You might be looking at the project in the Folders view instead of the Solution view.

Stuck Solving a Problem

I’ve always loved solving problems. As a kid, I would get books of Logic Puzzles to solve. My favorite video game was Minesweeper. In the beginning of my career, I would get stuck on a problem one night and wake up in the morning with a possible solution. Things seemed to be easy. I had been warned by friends that this wouldn’t always be the case, but I still wasn’t ready for when everything changed.

A couple of years ago, I started presenting on the concept of tSQLt (https://tsqlt.org/) for database unit testing. It came from a place of need. My team needed a way to test their code, and I wanted to help them. I was also starting to embrace automation for all things database. I decided that the best way to get buy-in for unit testing was to automate the process. The next step was to figure out how.

That started me on a multi-year journey to solve this riddle. What I ignored at the time was that not only was I trying to solve this riddle, but I also had all sorts of things changing in my life at the same time. I had some health issues pop up, and some of them caused a deep level of self-doubt in my ability to think through problems. I also spent the better part of a year writing a book. Unfortunately, that led me to focus only on how much time had passed without a solution.

As I started 2020, I was growing increasingly frustrated that I could not solve this problem. I had even presented on tSQLt at PASS Summit and most likely heard the answer from Sebastian Meine (w|t) during the Q&A portion of my session. But it still wasn’t clicking for me. As the frustration grew, the imposter syndrome started to spike. Reaching out on Twitter, I got some advice from SQLGrrl (t|b) reminding me to work on solving one small step at a time.

Within two weeks of implementing this advice, I had the solution I needed. I still remained frustrated at how long it had taken me to solve the problem. I ended up sharing that frustration with another IT professional and was reassured that this is fairly common. Being on the other side of this situation, I wanted to share with others that there’s hope if you’re in the same situation. And I hope I find this article myself if I end up in the same situation again.

T-SQL Tuesday #126 – Do What You Can


It’s that time again for T-SQL Tuesday! This month’s host is Glenn Berry (b|t). He’s organized a Folding@Home (FAH) team for SQL Family to help with biomedical research. We, as a world, are in the midst of the SARS-CoV-2 (also referred to as COVID-19) pandemic. For many of us, this pandemic has changed many aspects of our day-to-day life. Glenn’s invite asks us what we are doing in response to COVID-19.

These are interesting times indeed, and many of us are trying to do what we can to help. I’m really thankful for those that are able to help during this time. My sister-in-law has been busy sewing masks for her family and community. A friend of mine printed a face shield for a nurse I know. In addition to Glenn Berry using FAH, Tim Radney (b|t) has also been 3D printing ear savers for healthcare workers. This is all amazing work!

I wish I could say I had been as altruistic or as helpful. I’ve been focusing on ensuring that the loved ones in my house are relatively unaffected by quarantine life. There have been a number of changes in our house. The largest of these is that I have been working from home for the past five weeks. While we did start on a project to add on to our house, that project is still underway.

Working from home full-time is a change in and of itself, but doing so while also having construction underway in the house has been an additional challenge. There are also two high school students at home going to school. Our house is not large, and we’re doing our best to make do. I’ll be honest, I wish I could say I was doing more to help those outside of my immediate house, but that wouldn’t be accurate.

My focus has been on keeping us comfortable while not taking others for granted. We have our face masks from my sister-in-law. And a friend of ours 3D printed a face shield for a nurse we know. The nurse is a good friend and has been performing COVID-19 testing at one of the drive-thru locations in our area. For those at home that are working to keep your family and loved ones fed and cared for, you are doing enough. I know I am trying to do enough. These are not usual times, and if we can accept that, then we can accept our own best efforts.

T-SQL Tuesday #123 – Improve Focus through Speech


It’s the second Tuesday of the month, and that means it’s time for T-SQL Tuesday. This month, Jess Pomfret (b|t) asked us what life hacks we use to make our lives easier. There are many different ways that I look to streamline my day-to-day tasks. Some of the methods I use include automating repetitive tasks and learning how to break up complex tasks into smaller components. However, over the past year, I undertook a significant project outside of my day job. This project was something I decided to do after hours.

The objective of this opportunity was to write a book. When I initially started writing the book, I began by typing everything I wanted to say. I quickly discovered that I was thinking of what I wanted to say faster than I could get the words written on the computer. I also remembered that over the past couple of years, I have spent quite a bit of time speaking at SQL Saturdays and PASS Summit. When I’m speaking, I often let the flow of what I’m trying to say come naturally. I decided to try that method while writing the book.

Once I purchased the dictation software and a wireless headset, I was able to more easily express the information that I wanted to share. I will say I have found that dictating still works differently than speaking. I often think more thoroughly through what I want to say and pause more frequently than I would if I were presenting in front of a group. I have found dictating what I want to write does reduce my frustration with trying to get the words out of my head and into text.

I’m still developing a method for how I would like to dictate text for blog posts. One of the reasons I consider this a life hack is that dictating my blog posts helps me get into the right mindset, where I am focused on the task at hand. If you’ve tried to get into technical writing or writing a blog post, I would recommend looking at alternative methods to accomplish that goal.