Copy Files/Directories Using Robocopy with PowerShell

I often work on repetitive tasks that involve copying files between servers. I always try to automate them, as automation reduces manual errors and, honestly, I am too lazy to do boring tasks by hand :).

Using PowerShell's Copy-Item is simple and straightforward, but Robocopy provides more features. It offers options like retries, jobs, filtering, and mirroring, which I use most often in my work.

We can get all the supported parameters from the Robocopy documentation, or by simply running robocopy /? in the command line.

$source = '<SOURCE_PATH>'
$destination = '<DESTINATION_PATH>'
# /E - Copies subdirectories. This option automatically includes empty directories. Refer the doc for supported parameters
$robocopyOptions = @('/E')

Write-Host "Copying from $source to $destination"

$CmdLine = @($source, $destination) + $robocopyOptions
& 'robocopy.exe' $CmdLine

Write-Host 'Copy Completed'

The preceding command invokes the Robocopy Windows utility and copies the files. This is the simplest usage; the same pattern extends to more complex scenarios.
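For instance, here is a minimal sketch of a more involved copy using the retry, mirroring, and filtering options mentioned above. The paths are placeholders and the option values are examples, not recommendations:

```powershell
$source = '<SOURCE_PATH>'
$destination = '<DESTINATION_PATH>'

# /MIR - mirror the source tree (careful: deletes extra files at the destination)
# /R:3 - retry a failed copy 3 times; /W:5 - wait 5 seconds between retries
# /XF  - exclude files matching the given patterns
$robocopyOptions = @('/MIR', '/R:3', '/W:5', '/XF', '*.tmp', '*.log')

& 'robocopy.exe' (@($source, $destination) + $robocopyOptions)

# Robocopy exit codes below 8 indicate success (e.g. 0 = nothing copied, 1 = files copied)
if ($LASTEXITCODE -ge 8) {
    throw "Robocopy failed with exit code $LASTEXITCODE"
}
```

Checking `$LASTEXITCODE` matters because Robocopy uses non-zero exit codes for success cases too, so a plain "non-zero means failure" check would misfire.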

Cannot Build/Publish a Visual Studio Database Project Targeting Azure SQL Database V12

Recently, I was working on a Visual Studio database project and trying to publish it to Azure SQL, but there was no option for Azure SQL Database V12 in the Target platform dropdown; Microsoft Azure SQL Database was the only option.

To fix that issue, open the database project file (.sqlproj) in an XML editor and manually update the DSP tag with the following.

<DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider</DSP>
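For context, the DSP element sits inside the project's main PropertyGroup. An abridged sketch of a .sqlproj follows; the project name is illustrative and other properties are omitted:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Name>MyDatabaseProject</Name>
    <DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider</DSP>
  </PropertyGroup>
</Project>
```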
That’s it. Now we should be able to publish to Azure SQL Database.
Happy Coding 🙂

Read and Write/Update an XML File in PowerShell

In many cases, we find ourselves in a situation where we need to read and update XML files.

It’s quite straightforward in PowerShell.

Sample XML File: 


<App>
<Secret></Secret>
</App>

PowerShell script to read and write the XML file:

$xmlFileName = "Path to XML File"
[xml]$xmlDoc = Get-Content $xmlFileName
$xmlDoc.App.Secret = "Some Value"
$xmlDoc.Save($xmlFileName)
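Dot notation works well for simple documents, but when you need to target a specific node explicitly, XPath via SelectSingleNode is a handy alternative. A small sketch, with the file path again a placeholder:

```powershell
$xmlFileName = "Path to XML File"
[xml]$xmlDoc = Get-Content $xmlFileName

# Select the node explicitly with an XPath expression instead of dot notation
$node = $xmlDoc.SelectSingleNode('/App/Secret')
$node.InnerText = 'Some Value'

$xmlDoc.Save($xmlFileName)
```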

Install the Azure Blob Storage Module in Sitecore 9.3 On-Prem

By default, Sitecore stores Blobs in the SQL database. Earlier we had the option to store Blobs in the file system. From Sitecore 9.3, we have the Blob Storage concept, which gives us the freedom to configure storage providers as we like. Yes, we can configure Sitecore to store Blobs anywhere we like :).

We can install the Sitecore Azure Blob Storage module to configure Sitecore to store Blobs in Azure Storage. Do you need to store Blobs somewhere else, maybe in Google Cloud Storage or AWS? In that case, we can implement our own storage provider by implementing Sitecore.Framework.Data.Blobs.Abstraction.

Let’s take a look at how to configure the Sitecore Azure Blob Storage module in an on-prem Sitecore 9.3 instance. Please refer to the Sitecore docs for installing Sitecore 9.3.

1. Download the Sitecore Azure Blob Storage module from here

2. Create an Azure Storage account. Refer to the Microsoft docs for more information

3. Create a container

4. Copy the storage connection string. Refer to the Microsoft docs

5. Use MsDeploy to install the Sitecore Azure Blob Storage WDP.

"<FolderPathOfMsDeploy>\msdeploy.exe" -verb:sync -source:package="<FilePathOfWDP>" -dest:auto="<RootUrlOfSitecoreInstance>" -setParam:"IIS Web Application Name"="<IISWebAppName>" -setParam:"AzureStorageConnectionString"="<AzureStorageConnectionString>" -setParam:"AzureStorageContainerName"="<AzureStorageBlobContainerName>" -setParam:"DefaultProvider"="azure" -enableRule:DoNotDeleteRule -verbose

Parameters:

  • FilePathOfWDP: File path to the Azure Blob Storage WDP
  • RootUrlOfSitecoreInstance: URL of the Sitecore instance. In my case, "https://sc93xpcm/".
  • IISWebAppName: IIS web app name, e.g. "sc93xpCM"
  • AzureStorageConnectionString: The Azure Storage connection string copied in step 4
  • AzureStorageContainerName: The Azure Storage container name

The above msdeploy command installs the module in the Sitecore instance.

We have to do an extra step for an on-premise installation: update ConnectionStrings.config. We can do it manually or with an XDT transformation.

  1. Manual step

Add the below node, with your Azure Storage connection string, into ConnectionStrings.config.

<add name="azureblob" connectionString="<Azure Storage Connection String>" />

      2. XDT transform

Download the Microsoft.Web.Xdt package from NuGet (it contains Microsoft.Web.XmlTransform.dll).

Execute the following PowerShell script:

function XmlDocTransform($xml, $xdt)
{
    if (!$xml -or !(Test-Path -Path $xml -PathType Leaf)) {
        throw "File not found. $xml"
    }
    if (!$xdt -or !(Test-Path -Path $xdt -PathType Leaf)) {
        throw "File not found. $xdt"
    }
    $scriptPath = (Get-Variable MyInvocation -Scope 1).Value.InvocationName | Split-Path -Parent
    Add-Type -LiteralPath "$scriptPath\Microsoft.Web.XmlTransform.dll"
    $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
    $xmldoc.PreserveWhitespace = $true
    $xmldoc.Load($xml)
    $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt)
    if ($transf.Apply($xmldoc) -eq $false)
    {
        throw "Transformation failed."
    }
    $xmldoc.Save($xml)
}
XmlDocTransform -xml "<PhysicalFolderOfSitecoreWebApp>\App_Config\ConnectionStrings.config" -xdt "<PhysicalFolderOfSitecoreWebApp>\App_Data\Transforms\AzureBlobStorageProvider\Xdts\App_Config\ConnectionStrings.config.xdt"

Update PhysicalFolderOfSitecoreWebApp in the script before running it. Make sure Microsoft.Web.XmlTransform.dll is in the same location as the script.

This adds a connection string node to ConnectionStrings.config.

Now Sitecore is configured to store Blobs in Azure Blob Storage, so when we create new media items, the Blobs are stored in Azure Storage. :)

Let’s see how we can implement custom providers to store blobs in any other storage in later posts. 🙂

Azure Web App – Request Timeout Issue – 500 Error

Recently I was facing an issue with request timeouts in a web app in Azure App Services. It was a synchronous file upload which took more than 4 seconds. (Yes, of course, the synchronous way is not the optimal solution.)

I investigated the issue and found that Azure App Service (web apps) has a default timeout of 230 seconds. If a request takes longer than that, it results in a 500 error, but the request is still allowed to continue in the background on the server.

So we should keep this in mind and design our applications in a reactive way.

If you are getting request timeouts in an Azure web app, this could be the cause.

More reading:

https://feedback.azure.com/forums/169385-web-apps/suggestions/36572656-make-web-app-timeout-of-230-seconds-configurable

https://social.msdn.microsoft.com/Forums/azure/en-US/560dc2a9-43e1-4c68-830c-6e1defe2f72d/azure-web-app-request-timeout-issue?forum=WAVirtualMachinesforWindows

https://docs.microsoft.com/en-us/azure/app-service/faq-availability-performance-application-issues#why-does-my-request-time-out-after-230-seconds

https://www.edureka.co/community/22010/azure-asp-net-webapp-the-request-timed-out

The “Using” Statement In Powershell

When we write code in C#, we have the using statement to dispose of our objects, so we don’t have to. :) What about PowerShell? Can we do the same?

Here is a PowerShell function which behaves like a using statement. 🙂


Function Using-Object(
    [System.IDisposable]
    $InputObject,
    [scriptblock]
    $ScriptBlock = { throw "ScriptBlock is mandatory, please provide a value." })
{
    try
    {
        . $ScriptBlock
    }
    finally
    {
        if ($null -ne $InputObject -and $InputObject -is [System.IDisposable])
        {
            $InputObject.Dispose()
        }
    }
}

So whenever we are dealing with an object that should be disposed, we can use this function as below.


# $Connection object will be disposed.
Using-Object ($Connection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)) {
    # code goes here.
}


Isn’t that cool? 🙂

Download a Blob as a File in JavaScript

While working with JavaScript, we may find ourselves in a situation where we need to let users download a Blob as a file.

So in this post, I’ll share a basic JavaScript function which allows us to download a Blob as a file in the browser.


const downloadBlobAsFile = function (data, filename) {
    const contentType = 'application/octet-stream';
    if (!data) {
        console.error('No data');
        return;
    }

    if (!filename) filename = 'filetodownload.txt';

    if (typeof data === 'object') {
        data = JSON.stringify(data, undefined, 4);
    }

    // Wrap the data in a Blob and trigger the download via a temporary anchor
    const blob = new Blob([data], { type: contentType });
    const a = document.createElement('a');

    a.download = filename;
    a.href = window.URL.createObjectURL(blob);
    a.dataset.downloadurl = [contentType, a.download, a.href].join(':');
    a.click();
    window.URL.revokeObjectURL(a.href);
};

// call the function
const data = 'some data';
const fileName = 'filetodownload.txt';
downloadBlobAsFile(data, fileName);

You can find a working sample here. 

🙂