Issue this command, then follow the on-screen instructions:
az login
If you are working on an Azure VM that supports identity-based login, use:
az login --identity
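Optionally, you can confirm the login succeeded and that the expected subscription is active before continuing (a sketch; assumes the `az` CLI is on your PATH):

```shell
# Show the currently active account/subscription after login.
az account show --output table
```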
Download the JSON credentials file for a service account that has permission to read from the source bucket. Note that a service account is required; azcopy doesn't recognise any other form of GCP credentials.
Once the file is downloaded, set this environment variable:
export GOOGLE_APPLICATION_CREDENTIALS="path-to-service-account-secrets-file.json"
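A quick optional sanity check (not required by azcopy, just a sketch): confirm the variable actually points at a real file before starting the transfer.

```shell
# Check that GOOGLE_APPLICATION_CREDENTIALS is set and names an existing file.
if [ -f "${GOOGLE_APPLICATION_CREDENTIALS:-}" ]; then
  CRED_STATUS="found"
else
  CRED_STATUS="missing"
fi
echo "credentials file: $CRED_STATUS"
```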
Also set the source bucket name in the environment:
export BUCKET_NAME=source-bucket-in-gcp-storage
Set the target storage account name in the environment:
export STORAGE_ACCOUNT_NAME=target-storage-account-name
Decide an expiry date for the SAS token, then generate the token. It will remain valid until this date:
export SAS_EXPIRY=2026-03-31
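If you prefer not to hard-code a calendar date, a sketch of an alternative (assumes GNU `date`, as found on most Linux systems; macOS would need `date -v+30d` instead):

```shell
# Compute an expiry 30 days from today in the YYYY-MM-DD format az expects.
export SAS_EXPIRY=$(date -d "+30 days" +%Y-%m-%d)
echo "SAS token will expire on $SAS_EXPIRY"
```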
export SAS_TOKEN=$(
  az storage account generate-sas \
    --account-name "$STORAGE_ACCOUNT_NAME" \
    --expiry "$SAS_EXPIRY" \
    --services b \
    --resource-types co \
    --permissions acuwrl \
    --https-only \
    --output tsv)
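Since the command substitution silently yields an empty string on some failures, a quick sketch of a guard before moving on:

```shell
# Verify the SAS token variable is non-empty before attempting the copy.
if [ -z "${SAS_TOKEN:-}" ]; then
  echo "SAS_TOKEN is empty; re-run az storage account generate-sas" >&2
else
  echo "SAS token generated"
fi
```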
Issue this command to perform the copy:
azcopy copy --recursive=true \
  "https://storage.cloud.google.com/$BUCKET_NAME" \
  "https://$STORAGE_ACCOUNT_NAME.blob.core.windows.net?$SAS_TOKEN"
This may take a while to complete depending on the amount of data being copied. Keep an eye on the logs azcopy generates to confirm everything worked as expected.
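For monitoring after the fact, azcopy v10 keeps a record of each transfer as a job; a sketch of the relevant subcommands (`<job-id>` is a placeholder for the ID azcopy printed when the copy started):

```shell
# List recent azcopy jobs and their completion status.
azcopy jobs list

# Show a detailed summary (transferred/failed/skipped counts) for one job.
azcopy jobs show <job-id>

# Retry a job that finished with failures.
azcopy jobs resume <job-id>
```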