Full Architecture for SQL Backup to Blob Storage in Cloud


One of the tasks every DBA should know is how to take a backup to URL or restore a backup from URL. That is the easy part, just a few simple steps and commands, but it is not our main objective: we should understand all of the components, how they work, what the different options are, and what changed before and after SQL Server 2016, so that we get the maximum benefit from this service or feature. In this article I will try to cover all of these things as much as I can; a minimal code sketch of the credential and backup commands follows the topic list below.

  • Introduction
  • Benefits of taking a backup to URL
  • What is Azure Blob Storage?
  • What are the prerequisites for SQL backup to URL?
  • What is the difference between a SAS token and an Azure Storage access key?
  • When should I use a SAS token or an Azure Storage access key?
  • When are backup files saved as block blobs or page blobs?
  • How to generate and configure an access policy, access key, and shared access signature
  • Create a SQL Server credential using an access key or shared access signature
  • Backup to Azure Storage and restore from Azure Storage
  • Validate the backup file
  • Notes and limitations for SQL backup to URL
  • References and resources
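As a quick preview of what the article walks through, here is a minimal sketch of the two core steps, creating a SAS-based SQL Server credential and backing up a database to URL, driven from Python with pyodbc. The container URL, SAS token, database name (AdventureWorks), and connection string are placeholders you would replace with your own values.

```python
# Minimal sketch: create a SAS-based credential and back up a database to URL.
# The container URL, SAS token, database name, and connection string are placeholders.
import pyodbc

container_url = "https://mystorageaccount.blob.core.windows.net/sqlbackups"  # placeholder
sas_token = "sv=2022-11-02&ss=b&srt=co&sp=rwl&sig=..."  # SAS token without the leading '?'

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,  # BACKUP cannot run inside a transaction
)
cursor = conn.cursor()

# With a SAS token, the credential name must be the container URL itself;
# SQL Server then writes the backup file as a block blob.
cursor.execute(f"""
CREATE CREDENTIAL [{container_url}]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '{sas_token}';
""")

# Back up straight to blob storage, then verify the file is readable.
cursor.execute(
    f"BACKUP DATABASE [AdventureWorks] "
    f"TO URL = '{container_url}/AdventureWorks.bak' WITH COMPRESSION, STATS = 10;"
)
while cursor.nextset():  # drain progress messages until the backup finishes
    pass

cursor.execute(f"RESTORE VERIFYONLY FROM URL = '{container_url}/AdventureWorks.bak';")
while cursor.nextset():
    pass

conn.close()
```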
SQL backup to Azure Storage
Continue reading “Full Architecture for SQL Backup to Blob Storage in Cloud”

Big Data Storage in Microsoft Azure


Azure Storage is a Microsoft solution designed to store your data using different models (Blobs, Tables, Queues, File Shares, and Disks). To learn the differences, usage, high availability, and architecture of Azure Storage, I highly recommend you check this path; you will find a lot of tips and articles about the Azure Storage subject.

Now let us take a look at storage technology when we talk about big data: what is the best option for our data, how can I choose the best option, and based on what? Let us think about it from an architecture and design point of view.

Microsoft provides a variety of options to store your big data, depending on your requirements and on the data type. In this article we will discuss four options that can be used as an Azure storage solution for big data.

  • Azure Storage Blobs
  • Azure Data Lake Store
  • Azure Cosmos DB
  • Azure HBase on HDInsight
Continue reading “Big Data Storage in Microsoft Azure”

Recover blob data from Azure Storage after delete


One of the tasks an Azure administrator should take care of when managing your organization's data and saving it on Azure Storage is knowing how to keep it up and running, and also how to recover it after a delete.

Today we will see the configuration required to be able to recover deleted data from Azure Storage. Microsoft Azure has a feature called soft delete; by using this feature you can configure Azure Storage to keep deleted data available for recovery within a specified retention period after the delete operation. Before starting to explain soft delete, I highly recommend you take a look at this list of articles, where you will find very important information and useful content on how to manage Azure Storage. Check the articles from ➡️ here. A minimal code sketch of enabling soft delete and recovering a blob follows the list below.

  • Azure Storage soft delete options and information
  • How to enable the soft delete configuration on an Azure Storage account
  • How to check the deleted data and recover it
  • Keep following
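As a quick preview before the detailed steps, here is a minimal sketch, using the azure-storage-blob SDK and DefaultAzureCredential, of enabling blob soft delete with a retention period and then recovering a deleted blob. The account URL, container name, and retention period are placeholders; the full article covers the configuration in more detail.

```python
# Minimal sketch: enable blob soft delete, then recover a deleted blob.
# Account URL, container name, and retention days are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, RetentionPolicy

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# 1. Enable soft delete: deleted blobs stay recoverable for the retention period.
service.set_service_properties(
    delete_retention_policy=RetentionPolicy(enabled=True, days=14)
)

# 2. List deleted blobs in a container and undelete the ones we need.
container = service.get_container_client("documents")  # placeholder container
for blob in container.list_blobs(include=["deleted"]):
    if blob.deleted:
        print(f"Recovering {blob.name}")
        container.get_blob_client(blob.name).undelete_blob()
```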
Recover Blob After Delete
Continue reading “Recover blob data from Azure Storage after delete”

Provisioning a Gen 2 Azure Data Lake


Gen2 Azure Data Lake is a Microsoft service designed for big data processing. It is a combination of Azure Storage features and Azure Gen1 Data Lake, and Microsoft recommends using Gen2 for several reasons (performance, management, and security). The key addition in Gen2 is the hierarchical namespace, so Gen2 Azure Data Lake is designed for managing and storing very large amounts of data, up to petabytes. For more information about Gen2 Azure Data Lake, check the Microsoft documentation.
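To give a rough idea of what provisioning looks like in code, here is a minimal sketch, using the azure-mgmt-storage SDK, of creating a StorageV2 account with the hierarchical namespace enabled, which is what turns it into a Gen2 Data Lake. The subscription ID, resource group, account name, SKU, and region are placeholders, not prescribed values.

```python
# Minimal sketch: provision a storage account with hierarchical namespace (Data Lake Gen2).
# Subscription ID, resource group, account name, SKU, and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.storage_accounts.begin_create(
    resource_group_name="rg-datalake-demo",  # placeholder
    account_name="mydatalakegen2acct",       # must be globally unique, 3-24 lowercase chars
    parameters=StorageAccountCreateParameters(
        location="westeurope",
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        is_hns_enabled=True,  # hierarchical namespace: this is what makes it Gen2
    ),
)
account = poller.result()
print(f"Provisioned {account.name}, HNS enabled: {account.is_hns_enabled}")
```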

Continue reading “Provisioning a Gen 2 Azure Data Lake”