How to migrate backups, files and scripts to/from the cloud using the command line

May 26, 2015 by Daniel Calbimonte

Introduction

Sometimes we need to move our local files, SQL scripts and backups from our local machine to Azure or vice versa. This can be done manually by accessing Azure through a browser, but there are other methods to do this automatically. This article describes the Microsoft Azure Storage Tool (AzCopy), a command line tool used to upload data from a local machine to Azure or to download data from Azure to a local machine. The article describes step by step how to work with this tool.

Requirements

An Azure subscription. A local machine with Windows installed. The Microsoft Azure Storage Tools (AzCopy), which come with the PowerShell installer and can be downloaded here. You will also need a storage account in Azure. For more information about creating a storage account in Azure, refer to our article about creating storage.
Req 1. The Azure storage
Within the storage account, a container is required. For more information, refer to our article related to storage and containers.
Req 2. The storage container
The storage contains an option to manage access keys. This option lets you handle the keys.
Req 3. The manage access keys
The access keys will be used to connect to Azure from the AzCopy command line.
Req 4. The different access keys.
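To see how these pieces fit together before the walkthrough, the container URL (Req 2) and one of the access keys (Req 4) are passed to AzCopy roughly as follows. This is only a sketch with placeholder values, not a command you can run as-is:

rem placeholders: substitute your own folder, storage account, container and access key
AzCopy /Source:C:\LocalFolder\ /Dest:https://<storage account>.blob.core.windows.net/<container>/ /DestKey:<access key>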

Getting started

Once PowerShell is installed as specified in the requirements, open the Microsoft Azure Storage command line.
Figure 1. The Microsoft Azure Storage command line
Let's start by copying files from the local machine to Azure. Imagine that we have the following folder with SQL notes and SQL Server scripts on our local machine:
Figure 2. The local folder
To copy the information from the local folder to the container created in the requirements section, use the following command:
AzCopy /Source:c:\scripts3\
/Dest:https://mysqlshackstorage.blob.core.windows.net/cont1/
/DestKey:pGFSbfuUMjDBQZqIbfJuwksVp5
AzCopy is the command to copy file(s) from/to Azure. It is similar to the copy command in cmd and requires a source and a destination. In this example, the source is the c:\scripts3 folder and the destination is the URL of the container (see the Req 2 picture in the requirements section). The container will store the files of the scripts folder. Finally, DestKey is the key used to access Azure; you can find it in the Req 4 picture of the requirements section. Once you run the command, you will have a transfer summary result similar to this one:
Figure 3. The AzCopy results
To verify, go to storage and click on the storage account created in the requirements section.
Figure 4. The storage
In the storage, go to the container created in the requirements section.
Figure 5. The storage container
You will find the files of the local folder shown in Figure 2 copied to Azure.
Figure 6. The file copied to the container
To obtain more information about the AzCopy command, type the following at the command line:
AzCopy /?
The command will show information about the parameters and some useful examples.
Figure 7. The AzCopy help
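The first command copied every file in the c:\scripts3 folder. Because this article is about migrating backups, here is a hedged sketch that uploads a single backup file instead, using the /Pattern parameter introduced in the next example; the folder, file name and key below are hypothetical placeholders:

rem hypothetical backup folder and file name; replace the key placeholder with your own
AzCopy /Source:C:\SQLBackups\ /Dest:https://mysqlshackstorage.blob.core.windows.net/cont1/ /DestKey:<your access key> /Pattern:"MyDatabase_Full.bak"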
Let's try another example. This time we will copy the file myfile.txt from Azure to a local folder:
AzCopy /Source:https://mysqlshackstorage.blob.core.windows.net/cont1/
/Dest:c:\test\
/SourceKey:pGFSbfuUMjDBQZqIbfJuwksV
/Pattern:"myfile.txt"
The source is the URL of the Azure storage container and the destination is the c:\test folder, which is empty. The Azure container contains multiple files, but in this example we only want to copy the file myfile.txt, so we use the Pattern parameter for this purpose. You will receive a message similar to this one after executing the command:
Figure 8. Downloading results
You will be able to see the file copied from Azure to your local machine:
Figure 9. The local folder with the file copied from Azure
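As a variation, the whole container can be downloaded instead of a single file by removing /Pattern and adding the /S parameter described in the list below; this is a sketch with a placeholder key:

rem downloads every blob in cont1 to c:\test, including virtual subfolders
AzCopy /Source:https://mysqlshackstorage.blob.core.windows.net/cont1/ /Dest:c:\test\ /SourceKey:<your access key> /S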
There are other parameters that the command line includes (a combined sketch follows this list):
/S means recursive mode and includes the subfolders
/Y is used to suppress the confirmation prompts
/L is used to list the operations without copying
/A is used to upload only files with the Archive attribute set
/MT is used to keep the source modified date and time
/XN excludes source files if they are newer than the destination
/XO excludes source files if they are older than the destination
/V:[verbose log-file] is used to save the output in a log file; by default the log file is stored in %LocalAppData%\Microsoft\Azure\AzCopy
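As a sketch of how several of these switches combine (the folders, log path and key are placeholders), the following command would upload a backup folder including its subfolders (/S), skip source files that are older than the copy already in the container (/XO), suppress prompts (/Y) and write a verbose log to an explicit file (/V):

rem hypothetical folders and log path; replace the key placeholder with your own
AzCopy /Source:C:\Backups\ /Dest:https://mysqlshackstorage.blob.core.windows.net/cont1/ /DestKey:<your access key> /S /XO /Y /V:C:\Logs\AzCopyBackups.log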
In this new example, we have a folder with multiple files and we want to copy only the PowerShell scripts:
Figure 10. Several files in a local folder
The following command will copy all the files with the ps (PowerShell) extension and store the results in a log file:
AzCopy /Source:c:\scripts\
/Dest:https://mysqlshackstorage.blob.core.windows.net/cont1/
/DestKey:pGFSbfuUMjDBQZqIbfJuwksVp5==
/Pattern:"*.Ps" /V
If everything is OK, you will receive a message similar to this one:
Figure 11. The copy results
You will also be able to see the results in a log file. The /V parameter stores this information in the %LocalAppData%\Microsoft\Azure\AzCopy folder.
Figure 12. The Azure log file
You will also be able to see the ps files copied in Azure.
Figure 13. The PowerShell script copied to Azure
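If you want to preview a copy like the one above before running it, the /L parameter from the list performs a listing-only run. Here is a sketch with a placeholder key that reports which .Ps files would be transferred without copying anything:

rem listing-only dry run; no data is copied
AzCopy /Source:c:\scripts\ /Dest:https://mysqlshackstorage.blob.core.windows.net/cont1/ /DestKey:<your access key> /Pattern:"*.Ps" /L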
Finally, you can execute the commands from a file. Let's create a file with AzCopy parameters named AzCommand.txt:
/Source:c:\scripts\
/Dest:https://mysqlshackstorage.blob.core.windows.net/cont1/
/DestKey:pGFSbfuUMjDBQZqIbfJuwksVp5BkD7CFv/GxvdrOwiWvAYGLc5D5J5ZKjtIpipb2djiaEmOX3QhExWVOHSC0sQ== /Pattern:"*.Ps" /V
Now, execute the txt file with the following command line:
AzCopy /@:"C:\azcommand.txt"
The command will execute the content of the txt file. If everything is OK, you will have the following result:
Figure 14. The transfer summary
You can also verify in Azure that the files were copied.
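To automate a recurring transfer, the response file approach can be wrapped in a small batch file and scheduled with Windows Task Scheduler or a SQL Server Agent CmdExec job step. A minimal sketch, assuming AzCopy was installed to its default location and that the AzCommand.txt file shown above exists at C:\azcommand.txt:

@echo off
rem hypothetical wrapper: runs AzCopy with the parameters stored in the response file
rem adjust the path if AzCopy.exe was installed somewhere else
"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" /@:"C:\azcommand.txt"
if errorlevel 1 (
    echo AzCopy reported an error
    exit /b 1
)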

Conclusion

As you can see, the Azure Storage Tools can be used to automate some tasks using the command line. Once you have a storage account with a container in Azure, the process to upload or download files is very simple. We also learned different parameters and how to specify patterns to choose specific files. Note: at the time of writing, this AzCopy command line is at version 4.1. New versions may include new commands in the future, so make sure to have the latest version.

