Copy text content in Angular

Recently, I found myself in a situation where I needed a copy-text-to-clipboard feature in my Angular application. So I created a StackBlitz project to showcase the implementation. It is quite straightforward; have a look at the code on StackBlitz.
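The StackBlitz code itself isn't reproduced here, so as a rough sketch of the idea (the helper name and the injectable clipboard parameter are my own for testability, not taken from the StackBlitz project), copying text boils down to a call to the browser Clipboard API:

```javascript
// Minimal copy-to-clipboard sketch (hypothetical helper, not the StackBlitz code).
// The clipboard object is injectable so the function can be exercised outside a
// browser; in the browser it defaults to the standard Clipboard API.
function copyTextToClipboard(
  text,
  clipboard = typeof navigator !== 'undefined' ? navigator.clipboard : undefined
) {
  if (!clipboard || typeof clipboard.writeText !== 'function') {
    return Promise.reject(new Error('Clipboard API not available'));
  }
  return clipboard.writeText(text); // resolves once the text is on the clipboard
}
```

In an Angular app this would typically be wrapped in a service and called from a button's click handler.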

Thanks. Happy coding 🙂

Cannot Build/Publish Visual Studio Database Project Targeting Azure SQL Database V12

Recently, I was working on a Visual Studio database project and trying to publish it to Azure SQL, but there was no option for selecting Azure SQL Database V12 in the Target platform dropdown. Microsoft Azure SQL Database was the only option.

In order to fix that issue, just open the database project file (.sqlproj) in the XML editor and manually update the DSP tag with the following.

<DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider</DSP>
That’s it. Now we should be able to publish to Azure SQL Database.
Happy Coding 🙂

Read and Write/Update an XML File in PowerShell

In many cases, we find ourselves in a situation where we need to read and update XML files.

It's quite straightforward in PowerShell.

Sample XML File: 


<App>
  <Secret></Secret>
</App>

PowerShell script to read and write an XML file

$xmlFileName = "Path to XML File"
# Load the file as an XML document
[xml]$xmlDoc = Get-Content $xmlFileName
# Update the node value and save the file back
$xmlDoc.App.Secret = "Some Value"
$xmlDoc.Save($xmlFileName)

Install Azure Blob Storage module in Sitecore 9.3 on prem

By default, Sitecore stores blobs in the SQL database, and earlier we had the option to store blobs in the file system. From Sitecore 9.3, we have the Blob Storage concept, which gives us the freedom to configure storage providers as we like. Yes, we can configure Sitecore to store blobs anywhere we like. :)

We can install the Sitecore Azure Blob Storage module to configure Sitecore to store blobs in Azure Storage. Do you need to store blobs somewhere else, maybe in Google Cloud Storage or AWS storage? In that case, we can implement our own storage provider by implementing Sitecore.Framework.Data.Blobs.Abstraction.

Let's take a look at how we can configure the Sitecore Azure Blob Storage module on an on-prem Sitecore 9.3 instance. Please refer to the Sitecore docs for installing Sitecore 9.3.

1. Download the Sitecore Azure Blob Storage module from here

2. Create an Azure Storage account. Refer to the Microsoft docs for more information

3. Create a container

4. Copy the storage connection string. Refer to the Microsoft docs

5. Use MSDeploy to install the Sitecore Azure Blob Storage WDP.

"<FolderPathOfMsDeploy>\msdeploy.exe" -verb:sync -source:package="<FilePathOfWDP>" -dest:auto="<RootUrlOfSitecoreInstance>" -setParam:"IIS Web Application Name"="<IISWebAppName>" -setParam:"AzureStorageConnectionString"="<AzureStorageConnectionString>" -setParam:"AzureStorageContainerName"="<AzureStorageBlobContainerName>" -setParam:"DefaultProvider"="azure" -enableRule:DoNotDeleteRule -verbose

Parameters:

  • FilePathOfWDP: File path to the Azure Blob Storage WDP
  • RootUrlOfSitecoreInstance: URL of the Sitecore instance. In my case, https://sc93xpcm/.
  • IISWebAppName: IIS web app name, e.g. "sc93xpCM"
  • AzureStorageConnectionString: The Azure Storage connection string we copied in step 4 above
  • AzureStorageContainerName: Azure Storage container name

The above MSDeploy command installs the module in the Sitecore instance.

We have to do an extra step for an on-premise installation: we need to update ConnectionStrings.config. We can do it manually or by using an XDT transformation.

  1. Manual step

Add the node below, containing the Azure Storage connection string, into ConnectionStrings.config.

<add name="azureblob" connectionString="<Azure Storage Connection String>" />

  2. XDT transform

Download the Microsoft.Web.Xdt DLL from NuGet.

Execute the following PowerShell script:

function XmlDocTransform($xml, $xdt)
{
    if (!$xml -or !(Test-Path -Path $xml -PathType Leaf)) {
        throw "File not found. $xml"
    }
    if (!$xdt -or !(Test-Path -Path $xdt -PathType Leaf)) {
        throw "File not found. $xdt"
    }
    # Load the transform engine from the folder the script runs in
    $scriptPath = (Get-Variable MyInvocation -Scope 1).Value.InvocationName | Split-Path -Parent
    Add-Type -LiteralPath "$scriptPath\Microsoft.Web.XmlTransform.dll"
    $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
    $xmldoc.PreserveWhitespace = $true
    $xmldoc.Load($xml)
    $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt)
    if ($transf.Apply($xmldoc) -eq $false)
    {
        throw "Transformation failed."
    }
    $xmldoc.Save($xml)
}

XmlDocTransform -xml "<PhysicalFolderOfSitecoreWebApp>\App_Config\ConnectionStrings.config" -xdt "<PhysicalFolderOfSitecoreWebApp>\App_Data\Transforms\AzureBlobStorageProvider\Xdts\App_Config\ConnectionStrings.config.xdt"

Update PhysicalFolderOfSitecoreWebApp in the script before running it. Make sure Microsoft.Web.XmlTransform.dll is in the same location from which the script is executed.

This will add a connection string node to ConnectionStrings.config.

Now Sitecore is configured to store blobs in Azure Blob Storage, so when we create new media items, the blobs are stored in Azure Storage. :)

Let's see how we can implement custom providers to store blobs in any other storage in later posts. 🙂

Azure Web App – Request Timeout Issue – 500 Error

Recently I was facing an issue with request timeouts in a web app in Azure App Services. It was a synchronous file upload which took more than 4 seconds. (Yes, of course, the synchronous way is not the optimal solution.)

I investigated this issue and found that Azure App Service (web apps) has a default timeout of 230 seconds. If a request takes longer than this, it results in a 500 error, but the request is still allowed to continue in the background on the server.

So we should keep this in mind and design our applications in a reactive way.

So if you are getting request timeouts in an Azure web app, this could be the issue.
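As an illustration of that reactive design (a hypothetical sketch, not Azure-specific code: the endpoint paths, helper name, and injectable `http` client are my own assumptions), the client can start the upload, get an immediate acknowledgement, and poll a status endpoint instead of holding one request open past the 230-second limit:

```javascript
// Hypothetical sketch: kick off a long-running upload and poll for completion,
// rather than blocking one HTTP request until the server finishes.
// `http` is an injectable client (e.g. a thin wrapper over fetch) so the flow
// can be exercised without a real server.
async function uploadWithPolling(file, http, { pollIntervalMs = 1000, maxPolls = 60 } = {}) {
  // The server accepts the upload immediately and returns a status URL.
  const { statusUrl } = await http.post('/uploads', file);

  for (let i = 0; i < maxPolls; i++) {
    const { state } = await http.get(statusUrl);
    if (state === 'done') return 'done';
    if (state === 'failed') throw new Error('upload failed');
    await new Promise((resolve) => setTimeout(resolve, pollIntervalMs));
  }
  throw new Error('timed out waiting for the upload to complete');
}
```

Each individual request here returns well under the 230-second limit, so the platform timeout never fires even if the upload itself takes minutes.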

More readings:

https://feedback.azure.com/forums/169385-web-apps/suggestions/36572656-make-web-app-timeout-of-230-seconds-configurable

https://social.msdn.microsoft.com/Forums/azure/en-US/560dc2a9-43e1-4c68-830c-6e1defe2f72d/azure-web-app-request-timeout-issue?forum=WAVirtualMachinesforWindows

https://docs.microsoft.com/en-us/azure/app-service/faq-availability-performance-application-issues#why-does-my-request-time-out-after-230-seconds

https://www.edureka.co/community/22010/azure-asp-net-webapp-the-request-timed-out

The “Using” Statement in PowerShell

When we write code in C#, we have the using statement to dispose our objects, so we don't have to. :) What about PowerShell? Can we do that?

Here is a PowerShell function which behaves like the using statement. 🙂

Function Using-Object(
    [System.IDisposable]
    $InputObject,
    [scriptblock]
    $ScriptBlock = { throw "ScriptBlock is mandatory, please provide a value." })
{
    try
    {
        # Run the caller's script block in the current scope
        . $ScriptBlock
    }
    finally
    {
        # Always dispose the object, even if the script block throws
        if ($null -ne $InputObject -and $InputObject -is [System.IDisposable])
        {
            $InputObject.Dispose()
        }
    }
}


So whenever we are dealing with an object that should be disposed, we can use this function as below.

# $Connection object will be disposed.
Using-Object ($Connection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)) {
    # code goes here.
}


Isn’t that cool? 🙂

Download Blob as a File in JavaScript

While working with JavaScript, we may find ourselves in a situation where we need to let users download a blob as a file.

So in this post, I'll share a basic JavaScript function which allows us to download a blob as a file in the browser.


const downloadBlobAsFile = function (data, filename) {
    const contentType = 'application/octet-stream';

    if (!data) {
        console.error('No data');
        return;
    }

    if (!filename) filename = 'filetodownload.txt';

    // Serialize objects so they download as readable JSON
    if (typeof data === 'object') {
        data = JSON.stringify(data, undefined, 4);
    }

    const blob = new Blob([data], { type: contentType });
    const a = document.createElement('a');

    a.download = filename;
    a.href = window.URL.createObjectURL(blob);
    a.dataset.downloadurl = [contentType, a.download, a.href].join(':');
    // Trigger the download by clicking the anchor programmatically
    a.click();
};

// call the function
const data = 'some data';
const fileName = 'filetodownload.txt';
downloadBlobAsFile(data, fileName);

You can find a working sample here. 

🙂

Deploy storage account and output connection string with SAS token using ARM template

I found myself in a situation where I needed to deploy an Azure storage account with a blob container, generate a connection string with a SAS token, and update one of the web app's settings with the generated connection strings.

For this purpose, I used a linked ARM template which creates the storage account and blob container, generates the connection string with a SAS token, and outputs it from that template so that the master template can use this value.

Table of contents

  1. Craft ARM Template
  2. Output connection string with SAS token
  3. Output connection string with account key
  4. Deploy Template

Craft ARM template

We need to craft an ARM template as below for our requirement.

{
"$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"variables": {
"storageAccountApiVersion": "2018-07-01",
"storageAccountNameTidy": "[toLower(trim(parameters('storageAccountName')))]",
"blobEndPoint":"[concat('https://',variables('storageAccountNameTidy'),'.blob.core.windows.net/')]"
},
"parameters": {
"location": {
"type": "string",
"defaultValue": "southeastasia"
},
"storageAccountName": {
"type": "string",
"defaultValue": "awesomestorage"
},
"accountType": {
"type": "string",
"defaultValue": "Standard_LRS"
},
"accessTier": {
"type": "string",
"defaultValue": "Hot"
},
"supportsHttpsTrafficOnly": {
"type": "bool",
"defaultValue": true
},
"sasTokenExpiry": {
"type": "string",
"defaultValue": "2020-12-31T23:59:00Z"
},
"containerName": {
"type": "string",
"defaultValue": "test"
},
"accountSasProperties": {
"type": "object",
"defaultValue": {
"signedServices": "b",
"signedPermission": "rl",
"signedResourceTypes": "sco",
"keyToSign": "key2",
"signedExpiry": "[parameters('sasTokenExpiry')]"
}
}
},
"resources": [
{
"name": "[parameters('storageAccountName')]",
"type": "Microsoft.Storage/storageAccounts",
"apiVersion": "[variables('storageAccountApiVersion')]",
"location": "[parameters('location')]",
"properties": {
"accessTier": "[parameters('accessTier')]",
"supportsHttpsTrafficOnly": "[parameters('supportsHttpsTrafficOnly')]"
},
"dependsOn": [],
"sku": {
"name": "[parameters('accountType')]"
},
"kind": "BlobStorage",
"resources": [
{
"name": "[concat('default/', parameters('containerName'))]",
"type": "blobServices/containers",
"apiVersion": "[variables('storageAccountApiVersion')]",
"dependsOn": [
"[parameters('storageAccountName')]"
]
}
]
}
],
"outputs": {
"storageAccountConnectionString": {
"type": "string",
"value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountNameTidy'), ';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountNameTidy')), variables('storageAccountApiVersion')).keys[0].value)]"
},
"storageAccountConnectionStringWithSAS": {
"type": "string",
"value": "[concat('BlobEndpoint=',variables('blobEndPoint'),';SharedAccessSignature=', listAccountSas(variables('storageAccountNameTidy'), variables('storageAccountApiVersion'), parameters('accountSasProperties')).accountSasToken)]"
}
}
}

 

Generate connection string with SAS token

As per the full ARM template above, we can see one connection string is generated with full access and another is generated with a SAS token.

In order to generate a connection string with a SAS token, I have used the listAccountSas ARM function.

Find more details on this function here.

"storageAccountConnectionStringWithSAS": {
    "type": "string",
    "value": "[concat('BlobEndpoint=',variables('blobEndPoint'),';SharedAccessSignature=', listAccountSas(variables('storageAccountNameTidy'), variables('storageAccountApiVersion'), parameters('accountSasProperties')).accountSasToken)]"
}

We need to pass three parameters to this function:

  • resourceIdentifier

The name of the storage account within the specified resource group

  • apiVersion

The API version to use for this operation

  • requestParameters

We need to pass the parameters as specified here

"accountSasProperties": {
            "type": "object",
            "defaultValue": {
                "signedServices": "b",
                "signedPermission": "rl",
                "signedResourceTypes": "sco",
                "keyToSign": "key2",
                "signedExpiry": "[parameters('sasTokenExpiry')]"
            }
        }

We can find more details about the parameters specified above in the Microsoft documentation.

Generate connection string with storage account key

We can generate a connection string which has full access to the storage account using the storage account access keys. We can use the listKeys ARM function for this.

We can find more details on this function here.

We need to pass two parameters to this function:

  • Storage account resource id
  • API version

This function gives us all the keys of the storage account, and we can select one key to create the connection string as below.


listKeys(resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountNameTidy')), variables('storageAccountApiVersion')).keys[0].value

Deploy ARM template

We can use the following PowerShell script to deploy the ARM template.

Note: Fill out the required parameters (denoted in capitals).


$password = "SECRET"
$clientId = "CLIENTID"
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
$credentials = New-Object System.Management.Automation.PSCredential ($clientId, $securePassword)
Login-AzureRmAccount -ServicePrincipal -TenantId "TENANTID" -SubscriptionId "SUBSCRIPTIONID" -Credential $credentials 


$templateFilePath = "ARM TEMPLATE PATH"

$resourceGroupName = "RESOURCEGROUPNAME"
$resourceGroupLocation = "LOCATION"
$deploymentName = "DEPLOYMENTNAME"

#Create or check for existing resource group
$resourceGroup = Get-AzureRmResourceGroup -Name $resourceGroupName -ErrorAction SilentlyContinue
if(!$resourceGroup)
{
    Write-Host "Resource group '$resourceGroupName' does not exist. To create a new resource group, please enter a location.";
    if(!$resourceGroupLocation) {
        $resourceGroupLocation = Read-Host "resourceGroupLocation";
    }
    Write-Host "Creating resource group '$resourceGroupName' in location '$resourceGroupLocation'";
    New-AzureRmResourceGroup -Name $resourceGroupName -Location $resourceGroupLocation
}
else{
    Write-Host "Using existing resource group '$resourceGroupName'";
}


# Start the deployment
Write-Host "Starting deployment...";
New-AzureRmResourceGroupDeployment -ResourceGroupName $resourceGroupName -Name $deploymentName -TemplateFile $templateFilePath;

We can see the following output once the deployment is successful.

So this is how we deploy a storage account and generate connection strings. 🙂