Cloud SQL and PowerShell working together on Linux
Bryan Hamilton
Database Engineer, Cloud SQL
PowerShell is a powerful scripting tool often used by database administrators to manage Microsoft SQL Server. This blog post focuses on using PowerShell for common database tasks and management on a Cloud SQL for SQL Server instance. We will also look at dbatools.io and how it can be used on instances with cross-region replicas, external replication, and other key features enabled.
Google Cloud Tools for PowerShell also lets you run various cmdlets from the gcloud CLI - you can learn more in our documentation - but the focus of this post is on running PowerShell from a standalone virtual machine. PowerShell now supports both Windows and Linux, which means you can install it on a Compute Engine Linux Virtual Machine (VM).
Initial setup and getting started
You can install PowerShell on a Compute Engine VM, just as you can install SQL Server Management Studio on a VM for managing a Cloud SQL instance. PowerShell is installed by default and requires no setup on any Windows Compute Engine VM that you create, and you can also install it on a Compute Engine Linux VM. The seven steps below are needed to get the PowerShell environment set up on a Compute Engine Linux VM with dbatools.io:
Create a VM
Connect to the VM
Install PowerShell
Launch PowerShell
Verify PowerShell setup
Install dbatools.io
Verify dbatools.io setup
Step 1. Create a VM
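If you are starting from scratch, a minimal sketch of creating a Linux VM with the gcloud CLI looks like the command below; the VM name, zone, machine type, and image are illustrative, so pick values that fit your environment.
# Create an Ubuntu VM to host PowerShell and dbatools
gcloud compute instances create powershell-vm \
    --zone=us-central1-a \
    --machine-type=e2-medium \
    --image-family=ubuntu-2204-lts \
    --image-project=ubuntu-os-cloud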
Step 2. Connect to the VM
Connect to your Linux VM following these instructions.
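For example, you can open an SSH session with the gcloud CLI (the VM name and zone assume the values used in Step 1).
# Open an SSH session to the VM
gcloud compute ssh powershell-vm --zone=us-central1-a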
Step 3. Install PowerShell
Follow the steps from here to install PowerShell.
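As a sketch, the Microsoft-documented package installation on an Ubuntu 22.04 VM looks roughly like this; adjust the repository configuration for your distribution and version.
# Register the Microsoft package repository, then install PowerShell
sudo apt-get update
wget -q https://packages.microsoft.com/config/ubuntu/22.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install -y powershell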
Step 4. Launch PowerShell
Now start PowerShell using the command below.
# Start PowerShell
pwsh
You should get a PowerShell command prompt (for example, PS /home/your-user>).
Step 5. Verify PowerShell setup
You can verify PowerShell is working by running the command below.
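One simple check is to print the version table, which confirms the interpreter is running (the exact command from the original screenshot may differ).
# Show the PowerShell version and edition
$PSVersionTable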
Step 6. Install dbatools.io
Next, install dbatools.io using the command below; this is also documented here.
# run this command
Install-Module dbatools
Step 7. Verify dbatools.io setup
In these examples I will be using SQL Server authentication to connect to each database. To do this, we need to create a PowerShell credential so that we can authenticate to the database server.
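A minimal sketch, assuming the built-in sqlserver user for Cloud SQL for SQL Server (substitute your own login):
# Prompt for the SQL Server login password and store it as a credential object
$cred = Get-Credential -UserName sqlserver -Message "Cloud SQL for SQL Server login"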
Now, let's run a test query to verify that our setup is working as expected. We can use the Get-DbaDatabase cmdlet to connect to our SQL Server instance and list all the user databases as below. This also helps verify connectivity between the VM and the Cloud SQL instance.
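For example (the instance IP address here is a placeholder for your Cloud SQL instance's address):
# List user databases on the Cloud SQL instance
Get-DbaDatabase -SqlInstance 10.1.2.3 -SqlCredential $cred -ExcludeSystem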
dbatools.io provides many cmdlets out of the box that can be used to manage your Cloud SQL instance. You may even use them to complete a few of the DBA tasks recommended in our best practices. The next section will cover the scenarios listed below for TempDB.
Viewing the number of TempDB files.
Adding/removing more files to TempDB after instance resize.
Updating TempDB
There are certain best practices for TempDB to achieve optimal performance. One of the main recommendations is to have the number of TempDB data files (up to 8) match the number of cores available. You can easily review and manage TempDB configuration using PowerShell.
Viewing the number of TempDB files
To review your TempDB files for your Cloud SQL instance, use the Get-DbaDbFile cmdlet like the example below.
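A sketch, using the same placeholder instance address and credential as above:
# Show the files that make up TempDB
Get-DbaDbFile -SqlInstance 10.1.2.3 -SqlCredential $cred -Database tempdb |
    Select-Object LogicalName, PhysicalName, Size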
Adding/removing more files to TempDB after instance resize
If additional files are needed (for example, after resizing your Cloud SQL instance), you can add more files using the Set-DbaTempDbConfig command shown below. You may also need to add TempDB files based on contention observed in TempDB.
In this example, we have resized the instance to have 6 vCPUs, so we need to add four more TempDB data files to have 6 data files in total. This step can also be done outside of PowerShell, as documented here.
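A sketch of the command; the data file size is illustrative and is treated by Set-DbaTempDbConfig as the total size across all data files, in MB.
# Configure TempDB with 6 data files to match the 6 vCPUs
Set-DbaTempDbConfig -SqlInstance 10.1.2.3 -SqlCredential $cred -DataFileCount 6 -DataFileSize 1024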
You may get a warning that the logical filename is already in use. This happens because the PowerShell script tries to use a filename that already exists. To address this warning, you can remove all the TempDB files except the primary files (tempdev and templog).
In our case we will use the script below to complete this action.
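A sketch of that cleanup: because logical file names vary by instance, the script looks them up rather than hard-coding them, then issues an ALTER DATABASE ... REMOVE FILE for every file other than tempdev and templog.
# Drop all TempDB files except the primary data and log files
Get-DbaDbFile -SqlInstance 10.1.2.3 -SqlCredential $cred -Database tempdb |
    Where-Object { $_.LogicalName -notin 'tempdev', 'templog' } |
    ForEach-Object {
        $query = "ALTER DATABASE tempdb REMOVE FILE [$($_.LogicalName)];"
        Invoke-DbaQuery -SqlInstance 10.1.2.3 -SqlCredential $cred -Database master -Query $query
    }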
After the files have been cleared you will have two files remaining.
Then you can try adding the appropriate amount of TempDB files again. Once that is done you will need to restart your Cloud SQL instance for the changes to take effect.
Review DB wait statistics
If you are experiencing performance issues or want to see what your Cloud SQL instance is waiting on, you can use the Get-DbaWaitStatistic cmdlet and check wait stats with a single command.
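For example (the instance address is a placeholder):
# Return the most significant wait types for the instance
Get-DbaWaitStatistic -SqlInstance 10.1.2.3 -SqlCredential $cred | Select-Object -First 10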
Sync objects between replicas
If you are using a Cloud SQL for SQL Server read replica or Cloud SQL as a publisher for transactional replication, there are a few tasks that you should continue to perform, like keeping the SQL agent jobs in sync between instances. In this example, use the steps in Cloud SQL documentation to create a read replica. At the initial creation, objects are in sync on both the primary and secondary. We need to make sure to sync objects created after the replica is set up.
SQL Agent Jobs
Let's create a sample job on the primary instance that we will later sync to the replica instance.
You can use the New-DbaAgentJob cmdlet as below.
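A sketch of creating the job (the description is illustrative):
# Create a SQL Agent job named test-job on the primary instance
New-DbaAgentJob -SqlInstance 10.1.2.3 -SqlCredential $cred -Job test-job -Description "Sample job to sync to the replica"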
Now create a job step called test-step using New-DbaAgentJobStep.
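For example (the T-SQL command in the step is just a placeholder):
# Add a T-SQL step named test-step to the test-job job
New-DbaAgentJobStep -SqlInstance 10.1.2.3 -SqlCredential $cred -Job test-job -StepName test-step -Subsystem TransactSql -Command "SELECT 1;"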
Now let's sync the replica with the new job created in the previous step using Copy-DbaAgentJob.
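A sketch, using placeholder addresses for the primary (source) and the replica (destination):
# Copy the new job from the primary to the read replica
Copy-DbaAgentJob -Source 10.1.2.3 -SourceSqlCredential $cred -Destination 10.1.2.4 -DestinationSqlCredential $cred -Job test-job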
We should see the job that was created on the primary get copied. You can also use Get-DbaAgentJob to list jobs on the replica if necessary.
If you made any changes on the primary and want to sync the secondary, you can use the -Force option to sync the changes. To demonstrate this, we will make the two changes listed below on the primary instance.
Create a second SQL Agent job called second-job
Add a second job step called second-step to the job named test-job
We will review and then sync these changes to the secondary server in the next steps.
Create a new job
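For example (the description is illustrative):
# Create a second SQL Agent job named second-job on the primary
New-DbaAgentJob -SqlInstance 10.1.2.3 -SqlCredential $cred -Job second-job -Description "Second sample job"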
On the primary, add another job step to the first job
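A sketch of adding that step (the T-SQL command is a placeholder):
# Add a second step named second-step to the existing test-job
New-DbaAgentJobStep -SqlInstance 10.1.2.3 -SqlCredential $cred -Job test-job -StepName second-step -Subsystem TransactSql -Command "SELECT 2;"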
Now let's review the job steps on the primary.
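For example:
# List the steps defined for test-job on the primary
Get-DbaAgentJobStep -SqlInstance 10.1.2.3 -SqlCredential $cred -Job test-job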
Now let's sync the secondary server with the updates we made using the -Force option. You should see second-job added and test-job successfully updated as below.
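A sketch of the forced sync:
# Re-run the copy with -Force so existing jobs on the replica are overwritten with the changes
Copy-DbaAgentJob -Source 10.1.2.3 -SourceSqlCredential $cred -Destination 10.1.2.4 -DestinationSqlCredential $cred -Force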
Importing data
You can also use PowerShell to import data (for example, a CSV file). You can choose your own CSV file or create a sample one using docs.google.com/spreadsheets/. Here is one with a small sample that I created.
Using cat we can see the contents below as well.
cat ./import/States.csv
id,name,capitol
1,washington,olympia
2,oregon,Salem
3,california,Sacramento
4,montana,Helena
5,idaho,Boise
Use Import-DbaCsv to import this file to your Cloud SQL instance as shown below. This can also be used as an alternative to BULK INSERT.
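A sketch, assuming a database named test on the instance:
# Import the CSV and let dbatools create the States table automatically
Import-DbaCsv -Path ./import/States.csv -SqlInstance 10.1.2.3 -SqlCredential $cred -Database test -AutoCreateTable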
Now we can also list the table that was imported using Get-DbaDbTable
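For example, using the same assumed database name:
# Confirm the imported table and check its row count
Get-DbaDbTable -SqlInstance 10.1.2.3 -SqlCredential $cred -Database test -Table States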
You can see a table with 5 rows was created.
This can also be used to transfer tables between instances. For example, if you have two databases that are replicating data, you can transfer objects between the primary and the replica of a publisher setup. This could be used as a way to do an initial sync of objects that do not support replication, such as tables without a primary key.
This can be done using Copy-DbaDbTableData
We will copy the states table that we imported above from the source to a destination database called newtest.
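A sketch, copying from the source instance and database used above to a destination database called newtest (the destination address is a placeholder):
# Copy the States table, creating it on the destination if it does not exist
Copy-DbaDbTableData -SqlInstance 10.1.2.3 -SqlCredential $cred -Database test -Table States -Destination 10.1.2.4 -DestinationSqlCredential $cred -DestinationDatabase newtest -AutoCreateTable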
You can see the table was copied to the destination and 5 rows were copied in 141.02 ms
Performing common DBA tasks
There are certain tasks that a DBA/DBE may need to perform to keep their Cloud SQL for SQL Server instance healthy, many of which can be done using PowerShell.
Unused indexes and duplicate indexes
In many cases, having indexes improves the performance of selects, but they also add some overhead to inserts and updates. It is normally recommended to review unused and duplicated indexes. The two cmdlets listed below can be used to do this.
Find-DbaDbDuplicateIndex
Find-DbaDbUnusedIndex
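For example, running both against a specific database (the database name is illustrative):
# Report duplicate indexes and indexes with no recorded usage
Find-DbaDbDuplicateIndex -SqlInstance 10.1.2.3 -SqlCredential $cred -Database test
Find-DbaDbUnusedIndex -SqlInstance 10.1.2.3 -SqlCredential $cred -Database test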
Diagnostic queries on Cloud SQL
There is a common set of diagnostic queries provided by SQL Server MVP Glenn Berry here.
We can use Invoke-DbaDiagnosticQuery to automatically execute and return the results for a specific set of queries or all of them. These queries return a lot of information, so running the full set could take a while. It might be a good idea to limit this to specific queries or target certain databases.
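A sketch of limiting the run to a single named query (the query name must match one of the names in Glenn Berry's set):
# Run only one diagnostic query instead of the whole collection
Invoke-DbaDiagnosticQuery -SqlInstance 10.1.2.3 -SqlCredential $cred -QueryName 'Database Properties'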
Here is an example of what a partial output looks like.
Here are some examples of queries you can execute on Cloud SQL to get cross-region replica Availability Group status and database backup status. The output can also be formatted as a table, as below, for better readability.
Example: Executing Query: AG Status
Example: Executing Query: Last Backup by Database
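A sketch of running those two queries and formatting each result set as a table:
# Run the AG Status and Last Backup by Database queries and print the result sets as tables
Invoke-DbaDiagnosticQuery -SqlInstance 10.1.2.3 -SqlCredential $cred -QueryName 'AG Status', 'Last Backup by Database' |
    ForEach-Object { $_.Result | Format-Table -AutoSize }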
In this blog you learned how to use PowerShell on a Compute Engine Linux VM to manage your Cloud SQL instances. We covered only some of the more common scenarios, but there is much more that can be done using PowerShell and dbatools.io. To learn more and see the full list of commands available, visit https://dbatools.io/commands/.