Schedule transaction log file exports
Transaction log backup files contain the changes that occur in your source database after you take the full backup. Transaction log backups are required for the continuous load phase of your migration job.

This page describes how to schedule regular transaction log exports and uploads for your source SQL Server databases.
Schedule transaction log file uploads for Amazon RDS
You can schedule regular transfers of your transaction log files from the Amazon RDS source instance to the Cloud Storage bucket where you store the transaction log files.
Perform the following steps:
1. Ensure automated backups are enabled on your Amazon RDS instance. See Enable automated backups in the Amazon RDS documentation.
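   As a minimal sketch, assuming the AWS CLI is configured for your account, you can enable automated backups by setting a nonzero backup retention period (the instance identifier `mydbinstance` is a placeholder):

   ```bash
   # Enable automated backups by setting a retention period of 3 days
   aws rds modify-db-instance \
       --db-instance-identifier mydbinstance \
       --backup-retention-period 3 \
       --apply-immediately
   ```

2. Enable access to transaction log files in your Amazon RDS instance. See Access to transaction log backups with RDS for SQL Server in the Amazon RDS documentation. You can also check the following sample stored procedure call, which enables transaction log access for one database (replace `YOUR_S3_BUCKET` and `YOUR_DATABASE_NAME` with your own values):

   ```sql
   exec msdb.dbo.rds_tlog_copy_setup
       @target_s3_arn='arn:aws:s3:::YOUR_S3_BUCKET/YOUR_DATABASE_NAME/log/';
   ```

3. Create a SQL Agent job that runs on a regular schedule. The job should execute the following script, which copies the transaction log backups taken during the last hour to your S3 bucket:

   ```sql
   declare @DATABASE varchar(100);
   SELECT @DATABASE = 'YOUR_DATABASE_NAME';

   -- Compute the time window covering the last hour
   declare @startTime varchar(100);
   declare @endTime varchar(100);
   SELECT
     @startTime = CONVERT(VARCHAR(100), DATEADD(hour, -1, GETUTCDATE()), 120),
     @endTime = CONVERT(VARCHAR(100), GETUTCDATE(), 120);

   -- Copy the transaction log backups from that window to S3
   exec msdb.dbo.rds_tlog_backup_copy_to_S3
     @db_name=@DATABASE,
     @backup_file_start_time=@startTime,
     @backup_file_end_time=@endTime;
   ```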
4. Configure continuous file transfers from your S3 bucket to the Cloud Storage bucket. You can use any solution to move your files, for example event-driven transfer jobs in Storage Transfer Service. See Set up event-driven transfers from AWS S3.
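As a minimal sketch, assuming you have already created the Amazon SQS queue and granted Storage Transfer Service access to both buckets as described in that guide, an event-driven transfer job can be created with the gcloud CLI (the bucket names, credentials file, and queue ARN below are placeholders):

```bash
# Create an event-driven transfer job that copies new objects
# from the S3 bucket to the Cloud Storage bucket as they arrive
gcloud transfer jobs create \
    s3://my-s3-bucket gs://my-gcs-bucket \
    --source-creds-file=aws-creds.json \
    --event-stream-name=arn:aws:sqs:us-east-1:123456789012:my-queue
```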
Schedule transaction log file uploads for on-premise SQL Server instances
You can schedule regular transfers of your transaction log files from your self-managed source instance to the Cloud Storage bucket where you store the transaction log files. One of the recommended solutions is to run a script that takes the transaction log backup and uploads it to your bucket, as described in the following steps.

Perform the following steps:
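1. On the system where you want to run the script, ensure that you have initialized the Google Cloud CLI with authentication and a project by running either gcloud init, or gcloud auth login and gcloud config set project.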
2. Save the following script to a file on your source SQL Server instance. This script automates creating a transaction log file and uploading it to your Cloud Storage bucket by using the `gcloud storage cp` command.
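   Both versions of the script assume that the `DATABASE`, `SA_PASSWORD`, and `GCS_BACKUPS` variables are set to your database name, the SA user password, and the destination Cloud Storage path, respectively.

   Bash

   ```bash
   #!/bin/bash

   # Timestamp used to make the backup file name unique
   NOW="$(date +%s)"

   # Use the .trn.final extension when the script is invoked
   # with the "final" positional argument
   EXT=".trn"
   if [[ "$1" == "final" ]]
   then
     EXT=".trn.final"
   fi

   NAME="${DATABASE}.${NOW}${EXT}"
   FULL_NAME="/SQLServerBackups/log/${NAME}"

   # Take the transaction log backup
   QUERY="BACKUP LOG ${DATABASE} TO DISK = '${FULL_NAME}'"
   /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "${SA_PASSWORD}" -d master -Q "${QUERY}"

   # Upload the backup file to the Cloud Storage bucket
   gcloud storage cp "${FULL_NAME}" "${GCS_BACKUPS}/log/"
   ```

   PowerShell

   ```powershell
   # Get the current timestamp
   $NOW = [int](Get-Date -UFormat '%s')

   # Set the file extension based on the command-line argument
   $EXT = '.trn'
   if ($args[0] -eq 'final') {
     $EXT = '.trn.final'
   }

   # Construct the backup file name
   $NAME = "{0}.{1}{2}" -f $DATABASE, $NOW, $EXT
   $FULL_NAME = "X:\SQLServerBackups\log\$NAME"

   # Construct the SQL backup query
   $QUERY = "BACKUP LOG $DATABASE TO DISK = '$FULL_NAME'"

   # Execute the SQL backup command
   Invoke-Sqlcmd -ServerInstance 'localhost' -Username 'SA' -Password $env:SA_PASSWORD -Database 'master' -Query $QUERY

   # Upload the backup file to Google Cloud Storage
   gcloud storage cp "$FULL_NAME" "$GCS_BACKUPS/log/"
   ```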
3. Configure a scheduling tool of your choice to regularly run the script. You can also run the script manually and pass the `final` positional argument to create a transaction log file whose name ends in the `.trn.final` suffix. This is useful when you want to finish your migration and promote the migration job.
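   For example, as a minimal sketch assuming the Bash version of the script is saved as `/SQLServerBackups/backup_log.sh` (a hypothetical path), a crontab entry can run it every 10 minutes, and you can invoke it once manually when you are ready to finalize the migration:

   ```bash
   # crontab entry: run the transaction log export every 10 minutes
   */10 * * * * /SQLServerBackups/backup_log.sh

   # When you are ready to promote the migration job, run once manually:
   # /SQLServerBackups/backup_log.sh final
   ```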
[[["Facile da capire","easyToUnderstand","thumb-up"],["Il problema è stato risolto","solvedMyProblem","thumb-up"],["Altra","otherUp","thumb-up"]],[["Difficile da capire","hardToUnderstand","thumb-down"],["Informazioni o codice di esempio errati","incorrectInformationOrSampleCode","thumb-down"],["Mancano le informazioni o gli esempi di cui ho bisogno","missingTheInformationSamplesINeed","thumb-down"],["Problema di traduzione","translationIssue","thumb-down"],["Altra","otherDown","thumb-down"]],["Ultimo aggiornamento 2025-09-05 UTC."],[[["\u003cp\u003eTransaction log backups capture changes in your source database post-full backup and are essential for continuous data loading during migration.\u003c/p\u003e\n"],["\u003cp\u003eYou can schedule regular transaction log file uploads from Amazon RDS instances to a Cloud Storage bucket, by ensuring automated backups are enabled and transaction log access is active.\u003c/p\u003e\n"],["\u003cp\u003eFor Amazon RDS, a SQL Agent job should be created to regularly execute a script that utilizes \u003ccode\u003emsdb.dbo.rds_tlog_backup_copy_to_S3\u003c/code\u003e to send log files to S3.\u003c/p\u003e\n"],["\u003cp\u003eFor on-premise SQL Server instances, you can schedule transaction log file uploads to Cloud Storage by using scripts that leverage the \u003ccode\u003egcloud storage cp\u003c/code\u003e command.\u003c/p\u003e\n"],["\u003cp\u003eA provided bash and Powershell script automates the process of creating a transaction log file and then uploading it to Cloud Storage for on-premise SQL Server.\u003c/p\u003e\n"]]],[],null,["# Schedule transaction log file exports\n\nTransaction log backup files contain the changes changes that occur in your source\ndatabase after you take the full backup. Transaction log backups are required\nfor the continuous load phase of your migration job.\n\nThis page describes how to schedule regular transaction log exports and uploads\nfor your source SQL Server databases.\n\nSchedule transaction log file uploads for Amazon RDS\n----------------------------------------------------\n\nYou can schedule regular transfers of your transaction log files\nfrom the Amazon RDS source instance to the Cloud Storage bucket where you\nstore the transaction log files.\n\nPerform the following steps:\n\n1. Ensure automated backups are enabled on your Amazon RDS instance. See [Enable automated backups](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ManagingAutomatedBackups.html#USER_WorkingWithAutomatedBackups.Enabling) in the Amazon RDS documentation.\n2. Enable access to transaction log files in your Amazon RDS instance. See [Access to transaction log backups with RDS for SQL Server](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER.SQLServer.AddlFeat.TransactionLogAccess.html). You can also check the following sample command: \n\n #### Example stored procedure call to enable transaction log access in Amazon RDS\n\n ```sql\n export DATABASE=YOUR_DATABASE_NAME;\n export S3_Bucket=YOUR_S3_BUCKET;\n exec msdb.dbo.rds_tlog_copy_setup\n @target_s3_arn='arn:aws:s3:::${S3_Bucket}/${DATABASE}/log/';\n ```\n3. Create a SQL Agent job that runs on a regular schedule. 
The job should execute the following script: \n\n ```sql\n declare @DATABASE varchar(100);\n SELECT @DATABASE=YOUR_DATABASE_NAME;\n\n USE @DATABASE;\n declare @startTime varchar(100);\n declare @endTime varchar(100);\n SELECT\n @startTime = CONVERT(VARCHAR(100), DATEADD(hour, -1, GETUTCDATE()), 120),\n @endTime = CONVERT(VARCHAR(100), GETUTCDATE(), 120);\n\n exec msdb.dbo.rds_tlog_backup_copy_to_S3 \n @db_name=@DATABASE,\n @backup_file_start_time=@startTime,\n @backup_file_end_time=@endTime;\n ```\n4. Configure continuous file transfers from your S3 bucket to the Cloud Storage bucket. You can use any solution to move your files, for example event-driven transfer jobs in Storage Transfer Service. See [Set up event-driven transfers from AWS S3](/storage-transfer/docs/event-driven-transfers#set_up_event-driven_transfers_from_aws_s3).\n\nSchedule transaction log file uploads for on-premise SQL Server instances\n-------------------------------------------------------------------------\n\nYou can schedule regular transfers of your transaction log files\nfrom your self-managed source instance to the Cloud Storage bucket where you\nstore the transaction log files. One of the recommended solutions is\n\nPerform the following steps:\n\n1. On the system where you want to run the script, ensure you have initialized the Google Cloud CLI with authentication and a project by running either [gcloud init](/sdk/gcloud/reference/init); or [gcloud auth login](/sdk/gcloud/reference/auth/login) and [gcloud config set project](/sdk/gcloud/reference/config/set).\n2. Save the following script to a file on your source SQL Server instance. This script automates creating a transaction log file and uploading it to\n your Cloud Storage bucket by using the [`gcloud storage cp`](/sdk/gcloud/reference/storage/cp) command.\n\n ### Bash\n\n ```bash\n #!/bin/bash\n\n NOW=\"$(date +%s)\"\n\n EXT=\".trn\"\n if [[ \"$1\" == \"final\" ]]\n then\n EXT='.trn.final'\n fi\n\n NAME=\"{DATABASE}.${NOW}.${EXT}\"\n FULL_NAME=\"/SQLServerBackups/log/${NAME}\"\n\n\n QUERY=\"BACKUP LOG ${DATABASE} TO DISK = '${FULL_NAME}'\"\n /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P \"${SA_PASSWORD}\" -d master -Q \"${QUERY}\"\n\n gcloud storage cp \"${FULL_NAME}\" \"${GCS_BACKUPS}/log/\"\n ```\n\n ### PowerShell\n\n ```bash\n # Get the current timestamp\n $NOW = [int](Get-Date -UFormat '%s')\n\n # Set the file extension based on the command-line argument\n $EXT = '.trn'\n if ($args[0] -eq 'final') {\n $EXT = '.trn.final'\n }\n\n # Construct the backup file name\n $NAME = \"{0}.{1}{2}\" -f $DATABASE, $NOW, $EXT\n $FULL_NAME = \"X:\\SQLServerBackups\\log\\$NAME\"\n\n # Construct the SQL backup query\n $QUERY = \"BACKUP LOG $DATABASE TO DISK = '$FULL_NAME'\"\n\n # Execute the SQL backup command\n Invoke-Sqlcmd -ServerInstance 'localhost' -Username 'SA' -Password $env:SA_PASSWORD -Database 'master' -Query $QUERY\n\n # Upload the backup file to Google Cloud Storage\n gcloud storage cp \"$FULL_NAME\" \"$GCS_BACKUPS/log/\"\n ```\n3. Configure a scheduling tool of your choice to regularly run the script. You can also use this script manually and pass the `\"final\"`\n positional argument to create the transaction log file whose name ends in\n the `.trn.final` suffix. This is useful when you want to\n finish your migration and\n [promote the migration job](/database-migration/docs/sqlserver/finalize-migration)."]]