Upload files to Azure Blob Storage from Azure DevOps

There are a number of ways to upload files to blob storage. This article discusses two options for uploading files from Azure DevOps pipelines to blob storage.

The first option below uses the Azure CLI task (AzureCLI@2) with an inline bash script. The script performs several steps: it gets the storage account key, creates the blob container, generates SAS tokens, uploads the file to the container, and echoes the download URI with a SAS token appended. Also note that it uses `az storage azcopy`, which was still EXPERIMENTAL at the time of writing.

    - task: AzureCLI@2
      inputs:
        azureSubscription: 'Name of the Azure Resource Manager service connection'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        addSpnToEnvironment: true
        inlineScript: |
          # Allow the experimental azcopy extension to install without prompting
          az config set extension.use_dynamic_install=yes_without_prompt
          # Fetch the storage account key and create the container
          accountkey=`az storage account keys list -n $(storageaccount_name) --query="[0].value" -o tsv`
          az storage container create -n $(container_name) --account-name $(storageaccount_name)
          # Short-lived SAS for uploading, longer-lived SAS for downloading
          uploadtokenexpiry=`date -u -d "30 minutes" '+%Y-%m-%dT%H:%MZ'`
          downloadtokenexpiry=`date -u -d "$(token_expiry)" '+%Y-%m-%dT%H:%MZ'`
          uploadsas=`az storage container generate-sas -n $(container_name) --https-only --permissions rwl --expiry ${uploadtokenexpiry} -o tsv --account-name $(storageaccount_name) --account-key ${accountkey}`
          downloadsas=`az storage container generate-sas -n $(container_name) --https-only --permissions rl --expiry ${downloadtokenexpiry} -o tsv --account-name $(storageaccount_name) --account-key ${accountkey}`
          # Upload the file and print the download URL with the SAS token appended
          az storage azcopy blob upload -c $(container_name) --account-name $(storageaccount_name) -s myfile.txt --recursive --sas-token ${uploadsas}
          echo "https://$(storageaccount_name).blob.core.windows.net/$(container_name)/myfile.txt?${downloadsas}"

Note that the above task requires an Azure Resource Manager service connection for the azureSubscription parameter, and it uses the storageaccount_name, container_name and token_expiry (e.g. "30 day") pipeline variables so those values can be provided when running the pipeline.
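As a minimal sketch, these pipeline variables could be declared at the top of the pipeline YAML (the values below are placeholders; they can also be defined in the pipeline UI and marked as settable at queue time):

```yaml
variables:
  storageaccount_name: 'mystorageaccount'  # placeholder storage account name
  container_name: 'artifacts'              # placeholder container name
  token_expiry: '30 day'                   # relative expiry passed to `date -d`
```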

The second option below shows how to use the Azure File Copy task (AzureFileCopy@4) to upload a blob and use the task's output variables to get the download URL with a SAS token.

    - task: AzureFileCopy@4
      name: storagecopy
      displayName: "Copy to Storage"
      inputs:
        SourcePath: '$(Build.ArtifactStagingDirectory)/myfile.txt'
        azureSubscription: 'Name of the Azure Resource Manager service connection'
        Destination: 'AzureBlob'
        storage: '$(storageaccount_name)'
        ContainerName: '$(container_name)'
        sasTokenTimeOutInMinutes: '240'
      
    - task: Bash@3
      displayName: "Display URL"
      inputs:
        targetType: 'inline'
        script: |
            echo $(storagecopy.StorageContainerUri)
            echo $(storagecopy.StorageContainerSasToken)

Note that we can change the SAS token expiry time using the sasTokenTimeOutInMinutes parameter.
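Building on the task above, the two output variables can be joined into a single download link; the sketch below assumes the uploaded blob is named myfile.txt:

```yaml
    - task: Bash@3
      displayName: "Display download URL"
      inputs:
        targetType: 'inline'
        script: |
          # Note: depending on the AzureFileCopy version, the SAS token output may
          # already include the leading '?'; drop it from the echo below if so.
          echo "$(storagecopy.StorageContainerUri)/myfile.txt?$(storagecopy.StorageContainerSasToken)"
```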

Though the second option is simpler and more straightforward, it requires the service principal used for the service connection to have the "Storage Blob Data Contributor" role assigned.
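As a sketch, that role could be assigned with the Azure CLI; the values in angle brackets below are placeholders for the service principal and the storage account scope:

```shell
# Placeholders: <sp-object-id>, <subscription-id>, <resource-group>, <storage-account>
az role assignment create \
  --assignee "<sp-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```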