Azure AZ-101 Checklist of Materials

Here is my checklist of handy Azure infrastructure materials that I use with AZ-101 students as homework or additional class learning. One of my students asked me for it, so I thought I’d share it here. It focuses on just some of the topics and is not exhaustive:

  • Azure Migrate
  • Azure Site Recovery
  • Disaster Recovery

Azure Migrate

Currently, you can only create an Azure Migrate project in the West Central US or East US regions. However, this does not affect your ability to plan your migration for a different target Azure location. The location of the migration project is used only to store the metadata discovered from the on-premises environment.

Create an Azure Migrate project 

Reference on how to Create an Azure Migrate Project

Current Limitations on creating a project in Azure Migrate

Geography          Storage location
Azure Government   US Gov Virginia
Asia               Southeast Asia
Europe             North Europe or West Europe
United States      East US or West Central US

Creating a Collector

Download the collector appliance

Create the Collector VM

Run the collector to discover VMs

Assessing Migration Readiness

Assessing VM Size

Troubleshooting Readiness Issues and Common Errors

Costing Considerations

Reviewing the VM Sizing

Other useful nuggets

Azure Migrate FAQ

Videos for Migrating Servers to Azure

  • Channel 9 video on how to migrate your applications and workloads to the cloud.
  • Microsoft Mechanics video on how to discover, assess and migrate Windows and Linux virtual machines from VMware to Azure.
  • Channel 9 video on Azure Migrate with Michael Leworthy and Lindsay Berg of the Product Team.

Azure Site Recovery

Overview of Azure Site Recovery and Site Recovery Pricing

Getting Ready for Azure Site Recovery

Prepare Azure Reference

Prepare on-premises Hyper-V servers for disaster recovery to Azure

Tutorial: Prepare Azure resources for replication of on-premises machines

Tutorial: Azure resources for replication of Hyper-V machines

Azure Storage Reference

Deployment Planner

Site Recovery Deployment Planner for Hyper-V to Azure

Analyze the Azure Site Recovery Deployment Planner Report

Recovery Services overview

Troubleshooting remote desktop connection after failover using ASR

Completing Migration

Enable replication

Excluding disks from the replication process

Add a group to a Recovery Plan

Add a script or manual action to a Recovery Plan

Tutorial: Setting up a replication policy

Recovery Plans

Failover in Site Recovery

Create and customize recovery plans

Test failover to Azure in Site Recovery

Run a failback for Hyper-V VMs

Add Azure Automation runbooks to recovery plans

Troubleshooting Hyper-V to Azure replication and failover

Set up disaster recovery to Azure for Hyper-V VMs using PowerShell and Azure Resource Manager

Videos for Azure Site Recovery

Azure Site Recovery Video Series

Video 1: Infrastructure Setup

Video 2: Enable protection for Hyper-V virtual machines

Video 3: Recovery Plan Test Failover, Planned Failover and Unplanned Failover to Azure

Video 4: Failover from Azure to on-premises (failback)

Video 5: Failback from Azure to on-premises (VMware)

Disaster Recovery

This section covers the five steps in the process to migrate VMware VMs to Azure: Configure the Infrastructure, Configure the vCenter Server, Protect the Virtual Machines, Configure Disaster Recovery (DR) and Failover, and Configure Failback.

Backing up Virtual Machines

Plan your VM backup infrastructure in Azure

Use the Azure portal to restore virtual machines

Tutorial: Back up and restore files for Windows Virtual Machines in Azure

Azure Virtual Machine PowerShell samples

Virtual Machine Architecture

Architectural Components

Azure Site Recovery pricing

Virtual Machine Replication

Replication Settings

Azure Database Migration Guide

Automatic update of the Mobility Service in Azure to Azure replication

Accelerated Networking with Azure virtual machine disaster recovery

Create a Windows virtual machine with Accelerated Networking

Set up disaster recovery for Azure VMs to a secondary Azure region

Troubleshooting

Troubleshoot Azure-to-Azure VM replication issues

Troubleshoot issues with the Azure Site Recovery agent

AWS RDS vs Microsoft Azure SQL Database: What does it mean for the business?

As a freelance industry analyst who has worked with GigaOm, I’m pleased to see the GigaOm Transactional Field Test, derived from the industry-standard TPC Benchmark™ E (TPC-E), which compares Amazon Web Services Relational Database Service (AWS RDS) and Microsoft Azure SQL Database. It’s written by William McKnight and Jake Dolezal of GigaOm.

From the business intelligence perspective, it is incredibly useful to compare AWS RDS and Microsoft Azure SQL Database. Before we dive in, what do they have in common? They are both fully managed cloud SQL Server offerings, and it can be difficult for business decision makers to choose between them. In this post, we’ll distill the technical language so that business decision makers find it easier to make the right decision for them.

Why SQL Server?

SQL Server is one of the most mature, well-known and widely used databases in the world, according to the DB-Engines ranking. In fact, there is some evidence to suggest that SQL Server is going to overtake Oracle, potentially due to its cheaper price tag.

SQL Server offers fantastic business intelligence features, and this is probably one of the key areas where it wins over Oracle. In my experience, this is how SQL Server ‘crept’ into IT estates: its business intelligence solutions were superior and easier to use. In particular, SQL Server Reporting Services solved problems for Finance departments, who could produce repeatable financial reports very easily. SQL Server solved a business problem, so it took hold.

Why SQL Server in the cloud?

Generally speaking, the cloud is attractive to businesses due to its lower costs, scalability and pay-as-you-go pricing model.

For IT Departments, cloud databases are mainly used for read-intensive, data-intensive applications such as data warehousing, data mining and business intelligence operations, which need elasticity and scalability. Cloud databases offer reliable computing, storage, backup and network facilities at low cost, which is particularly important when the IT Department is regarded as a cost center.

AWS RDS and Microsoft Azure SQL Database are both web services that make it easy to set up and scale a relational database in the cloud. They are designed for developers and businesses that want cloud databases.

AWS RDS vs Microsoft Azure SQL Database

GigaOm ran the Transactional Field Test, which aims to compare the two databases. In the test, the workload was scaled to meet the needs of 80,000 customers, and performance was tuned to each database engine’s optimal capability to process data. The tests consisted of five runs, each lasting two hours, on both platforms. You can read more detail about the test runs in the GigaOm Transactional Field Test report.

Cost Differentiation

The paper found that Microsoft Azure SQL Database was considerably less expensive to run over a month than AWS RDS. Of course, readers should try out different scenarios in order to make the decisions that are right for their environment. That said, the cost savings are compelling, and if you have not looked at Microsoft Azure SQL Database before, then it is worth reviewing the cost differences. This held true across different views of the solutions, such as cost, licensing and time reservations.

The paper concluded that the database, along with the cloud platform, matters to latency, which is the killer for important transactional applications. Microsoft Azure SQL Database presents a compelling proposition for the modern transactional workload, meeting the need for data warehousing, data mining and business intelligence operations, which are the engine of many of today’s businesses.

Conclusion

The reality is that we live in a multi-cloud world. Apart from competing on cost, how can organizations show that they are creating uniquely desirable products and services?

Since Microsoft are the creators and owners of Microsoft Azure SQL Database, the differentiation here is that Microsoft lead with research, development and innovation in SQL Server. This means that Microsoft lead in the ability to design, deliver and support SQL Server in its different guises, whether on-premises, in the cloud, or a hybrid mixture of both.

Since Microsoft own SQL Server, they are uniquely placed to understand customer needs and market dynamics, and to develop well-specified products that meet them. It therefore makes sense that Azure would be finely tuned to meet these needs, and that Microsoft Azure SQL Database costs are optimized to be lower, as reflected in the cost differences found in the study.

To read the GigaOm Transactional Field Test, derived from the industry-standard TPC Benchmark™ E (TPC-E), please refer to the article. Please feel free to leave your thoughts in the comments.

Immersive Reader is now LIVE! New addition to Azure Cognitive Services family

Today, Microsoft launched Immersive Reader, aimed at increasing the opportunities for developers to embed inclusivity in their apps. I love reading, and I love the idea that reading will be brought to people who might otherwise struggle with text reading and comprehension.

You can learn more over at the Azure Cognitive Services website. If you want to get started straight away, here is the Ibiza Portal Create Blade and here’s the Documentation.

Developers have such a unique place in the world, using their creativity to make things better for other people. I’m excited about this journey and let’s see the magic that people create!

Data Warehousing and Business Intelligence in the Cloud with #Azure: How do we get good, fast, cheap and easy?

Customers want their data good, fast, cheap and easy. A tall order, right?

One of the biggest challenges that I see with data warehousing in the cloud is that customers are concerned about cost. I was interested to see the GigaOm report on data warehousing in the cloud, which contained a number of benchmarks, including cost.

The study by GigaOm showed that Azure SQL Data Warehouse now outperforms the competition, namely Google BigQuery and AWS Redshift, by up to 14 times, at up to 17 times lower cost. This is an incredible achievement and the Azure team should be proud!

My work in Business Intelligence often involves a move to the cloud by default. Simply put, customers want quick Business Intelligence and they don’t want to spend time or effort looking after kit; they want to delegate the responsibility. This means that cost is a key differentiator. I’m glad to see that Azure SQL Data Warehouse is competing on cost and performance, since customers do want their data good, fast, cheap and easy.

Customers also want their data easy and this is where Power BI comes in. If a customer wants to use Power BI, I generally recommend that they put their data into Azure so that the data is traversing the Azure network. This means that the customer is not paying to extract or access their data from another cloud system and then put it into Power BI.

The GigaOm paper on cloud data warehousing is worth a read, and I am not just saying that because I’ve done work as a GigaOm analyst! You can access the paper here.

Error converting data type varchar to numeric where ISNUMERIC finds only numbers

I am writing some SQL to form the basis of views that pull data from a Microsoft SQL Server source into Azure SQL Database, from where the data would go to Power BI. The data was all presented in string format initially (not my data, not my monkeys, not my circus), and I wanted to correct the data types before the data got into Power BI.

I noted that one of the columns failed to convert VARCHAR to DECIMAL. The error message is below, and it’s usually fairly easy to sort:

Error converting data type varchar to numeric

Normally, I’d use ISNUMERIC to identify the rows whose value in that column could not be converted to a number. Then, I could find the offending value and replace or exclude it, as required.

However, on this occasion, ISNUMERIC failed to identify any rows as non-numeric. ISNUMERIC returned TRUE for every row, and that confused me. I knew that something was causing the CONVERT instruction to go wrong.

I ran a quick query, ordering the column in ascending order, alongside the original offending query. This showed that the failing query stopped at the value 9.45. I then ran another query that returned the rows where the value was greater than 9.45, and ordered the results.

In this result set, the value came through as follows:

9.450000001e-05

Aha! This explained why SQL Server could not convert the value to numeric: the value was in scientific notation, which is used when values are very large or very small.
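A minimal T-SQL sketch of the quirk, using the literal value from above: ISNUMERIC accepts scientific notation because the string is convertible to FLOAT, yet a direct conversion to DECIMAL still fails.

```sql
-- ISNUMERIC treats scientific notation as numeric, so this returns 1:
SELECT ISNUMERIC('9.450000001e-05') AS IsNumericResult;

-- ...but a direct conversion to DECIMAL still fails with
-- "Error converting data type varchar to numeric":
SELECT CONVERT(DECIMAL(18, 9), '9.450000001e-05');
```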

Now, I ran my query again, using a NOT LIKE (which I also do not like!):

WHERE [My Column Name] NOT LIKE '%e%'

Out of my record set of millions of rows, I got one offending row with the scientific notation. At least I had something to work with now; I could remove the data, run an update, or work with the scientific notation.
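On SQL Server 2012 and later, TRY_CONVERT offers a tidier way to find the offending rows than NOT LIKE, and converting via FLOAT handles the scientific notation itself. This is a sketch with placeholder table and column names:

```sql
-- Find the rows that CONVERT would fail on: TRY_CONVERT returns NULL
-- instead of raising an error (SQL Server 2012+).
SELECT [My Column Name]
FROM [My Table]
WHERE [My Column Name] IS NOT NULL
  AND TRY_CONVERT(DECIMAL(18, 9), [My Column Name]) IS NULL;

-- Scientific notation converts cleanly if you go via FLOAT first:
SELECT CONVERT(DECIMAL(18, 9), CONVERT(FLOAT, '9.450000001e-05'));
```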
I hope that helps someone!

Moving WordPress websites between Azure Subscriptions

I’m keen to learn practical aspects of Azure and cloud computing, so I can really understand their value for small businesses who rely on cloud computing. I don’t feel comfortable advocating for something I don’t really understand, or haven’t tried myself. So I set up my Data Relish website using Azure and WordPress, and integrated it with HubSpot so I could use Power BI and HubSpot together. I also set up other tools such as SendGrid by Twilio and CloudFlare. I learned a lot about technologies with which I’m not very familiar.

So now that my learning and confidence have increased, I decided to move my Azure website from a trial/test subscription to a different subscription. So how did I do that?

It turned out to be easy to move my website and all of the artefacts from one Azure Subscription to another. Note that my setup met the following conditions and limitations for moving a Web App, which I’ve copied here from the Azure website:

  • The destination resource group must not have any existing App Service resources. App Service resources include:
    • Web Apps
    • App Service plans
    • Uploaded or imported SSL certificates
    • App Service Environments
  • All App Service resources in the resource group must be moved together.
  • App Service resources can only be moved from the resource group in which they were originally created. If an App Service resource is no longer in its original resource group, it must be moved back to that original resource group first, and then it can be moved across subscriptions.

How did I move Subscriptions?

In the Azure Portal, I selected the Azure Database for MySQL server that underpins my WordPress site.

Then, I clicked on the Change Subscription link in the Overview blade.

The next page told me the associated Azure artefacts I’d need to move with it. This page was super helpful since it saved me a step in working out what else I needed to move.

From the drop-down list, I chose my new Subscription, and then clicked Apply.

I waited for two minutes while it deployed to the new subscription, and then the Azure notification popped up in the browser to say that the move had completed… and then I checked to see if my website was up and running.

Much to my relief, yes, my website was still up and running. As far as I can see, it all moved seamlessly across. I will be checking the functionality over the next few days just to confirm it is all running.

Not all Azure resources can be moved so easily, and it is worth checking before you move anything. Here’s a good Azure reference page to review before you start.

Connecting #Azure WordPress, #HubSpot data for analyzing data in #PowerBI for a small business #CRM

I got to the limits of the free WordPress account for my small business, and I wanted to analyse my CRM and sales data better. I wanted to dial up my sales and marketing and, of course, use data to understand my audience better. With the free WordPress edition, I could not do some of the things that I wanted, such as HubSpot integration and advanced analytics.

Why CRM?

As a small business, I rely on a lot of word-of-mouth business. When business leads come in, I need to track them properly. I have not always been very good at following up in the past, and I am learning to get better at actioning and following up.

I love the HubSpot CRM solution, and I decided I’d take it a step further by integrating HubSpot with my WordPress website, which is hosted in Azure; you can see the final result on my Data Relish company site here. HubSpot have great help files here, and I am referring you to them.

What technology did I use?

Microsoft Azure WordPress – Azure met my needs since it gave me opportunities for integration, plus additional space for storing resources such as downloads or videos.

Power BI – great way to create dashboards

HubSpot – CRM marketing and sales for small business

I found that using Microsoft Azure was a great way to make the jump from free WordPress to a hosted solution. Now, I am not a web developer and I do not intend to become one. However, I do want to use technology to meet my small business needs, and to do so in a way that is secure. I’m going to write up some posts on how to get started.

To get started with a website in Azure, you can follow the instructions here or watch the Channel 9 video for instructions.

Now, I needed a way of working with the HubSpot data in Power BI, and this is where the CData Power BI and HubSpot connector comes in.

In running a small business, you need to be super-precious with your time. I could spend ages trying to create my own connector, or I could use a robust, off-the-shelf connector that would do it for me.

In a small business, spending your time badly is still a cost.

In a business, you have to decide between spending money and spending time on an activity. If something takes too long to do yourself, and someone or something else could do it better for a price, then doing it yourself is a false economy and a bad decision. Experience will tell you when to do which, but wasted time is difficult to measure.

There aren’t many options for Power BI and HubSpot, but I was pleased to find the CData connector.

Disclaimer: I didn’t tell HubSpot or CData that I was writing this blog so it isn’t endorsed by either of them.

What does CData look like?

You can download the CData ODBC Driver, which connects Power BI to HubSpot. Here’s a snip of their site:

CData PowerBI ODBC Driver

I downloaded the trial, and then went through the install. It was easy: ‘next, next, next’. When it was installed, it launched a browser asking me to log into HubSpot, which I did. Then, quickly, I got the following screen – yay, I am in business!

CData Authorization Successful

Then, off to Power BI to download the latest edition of Power BI Desktop. It’s easy to install, and I could get cracking very quickly.

How do we get access to the HubSpot data?

In Power BI Desktop, click on the Get Data icon in the Home tab, and then choose the ODBC option.

Get Data ODBC

Click on the Connect button.

Look for the HubSpot ODBC connector in the drop-down list. It should appear something like this:

ODBC Hubspot Power BI

Then, enter your username and password, and click Connect:

ODBC HubSpot Username password

Once you have connected, you will be presented with a list of HubSpot tables:

Hubspot Tables

Click the tables that you want, and the data will be loaded into Power BI.

If you don’t know which table you want, load in the tables starting with Deals first, and then compare them with the HubSpot screen. This will help you to understand better how the columns relate to your HubSpot data on screen.

I’ll add more about HubSpot analysis in the future, but for now, happy PowerBI-ing!