31 Days of Servers in the Cloud – Move a local VM to the Cloud (Part 5 of 31)

VMs up, up, and away! My turn!

In today's installment of our “31 Days of Servers in the Cloud” series, we want to show you how easy it is to load a locally created, Hyper-V-based virtual machine into Windows Azure.

“But it’s not really that easy, is it?  I’ve had a heckuva time trying to make this work!”

Actually, once the preliminaries are in place, it is easy.  But uploading anything from your local machine into a Windows Azure storage account requires you to connect to your Azure account, which means having a management certificate in place to authenticate the connection, and that is a process that is hard to discover.  Searching for a quick solution can be confusing, because the tools are always changing, and what was required several months ago isn't necessarily the easiest way to do this today.

This leads me to a little disclaimer, which really could apply to every single article written for this series:

This documentation provided is based on current tools as they exist during the Windows Azure Virtual Machine PREVIEW period.  Capabilities and operations are subject to change without notice prior to the release and general availability of these new features. 

That said, I’m going to try to make this process as simple as possible, and leave you not only with the ability to launch a VM from your own uploaded .VHD (virtual hard disk) file, but also leave you in good shape for using some pretty useful tools (such as Windows PowerShell) for managing your Windows Azure-based resources. 

The rest of this article assumes that you already have a Windows Azure subscription.  If you don't have one, you can start a FREE 90-DAY TRIAL HERE.

Create a local VM using Hyper-V

I’m going to assume that you know how to use Hyper-V to create a virtual machine.  You can do this in Hyper-V running on Windows Server 2008 R2 or Windows Server 2012.  You could even use Hyper-V installed on Windows 8.  The end result should be that you have a virtual machine installed as you want it, sysprepped (important!), and ready to go.  It’s that machine’s .VHD (the virtual hard disk) file that you’re going to be uploading into Windows Azure storage.
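As a quick reference, the generalization step is run inside the guest just before you shut it down.  A minimal sketch, assuming a Windows guest with the default system paths:

```powershell
# Run inside the VM, as an administrator. This generalizes the installation
# (removing machine-specific identifiers), sets it to run OOBE on next boot,
# and shuts the VM down so the .VHD is ready to upload.
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
```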

If you want further help building and preparing a virtual machine, check out the first part of this article on how to build a VM: Creating and Uploading a Virtual Hard Disk that Contains the Windows Server Operating System

NOTE: If you’re going to use one of the storage exploring tools I will be mentioning later, you will want to create your disk as (or convert your disk to) a fixed-format VHD.  This is because those tools won’t convert the disk file on the fly, and the disk in Windows Azure storage is required to be a fixed disk (as opposed to a dynamic disk, which is the default). 
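If your disk is dynamically expanding, the Hyper-V PowerShell module on Windows Server 2012 (or Windows 8) can convert it for you.  A sketch, with hypothetical file paths:

```powershell
# Creates a new fixed-format copy of the disk; the source file is left as-is.
# Run this on the Hyper-V host while the VM is shut down.
Convert-VHD -Path "D:\VMs\SmallTestServer.vhd" `
            -DestinationPath "D:\VMs\SmallTestServer-fixed.vhd" `
            -VHDType Fixed
```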

Set Up Windows Azure Management

Before we can connect to our Windows Azure storage and start uploading, we need to have a management certificate in place, as well as the tools for doing the upload installed.

Although there are manual ways of creating and uploading a self-signed certificate, the easiest method is to use the Windows Azure PowerShell cmdlets.  Here is the download location for those:

Windows Azure PowerShell: https://www.windowsazure.com/en-us/manage/downloads/ 

Note that although the page says that it’s the November 2012 release, it actually gives you the December 2012 release.  That’s important, because the extremely beneficial Add-AzureVHD PowerShell cmdlet was only introduced in December.

Once those are installed, you can follow the instructions here:

Get Started with Windows Azure Cmdlets: http://msdn.microsoft.com/en-us/library/windowsazure/jj554332.aspx

Specifically, see THIS SECTION, which describes how to use Get-AzurePublishSettingsFile.  That cmdlet generates a certificate in Windows Azure and creates a local “.publishsettings” file, which you then import locally using the Import-AzurePublishSettingsFile cmdlet.  Once that's done, you'll have the management certificate in place locally as well as in your Azure account.  And the best part is, this relationship is persistent!  From this point on, opening the Windows Azure PowerShell window will give you a session that is properly associated with your account. 
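Put together, the certificate setup is just two cmdlets.  A sketch, with a hypothetical download path:

```powershell
# Opens a browser so you can sign in to Windows Azure; the portal then
# generates a management certificate and downloads a .publishsettings file.
Get-AzurePublishSettingsFile

# Imports the downloaded file, installing the certificate locally and
# associating your PowerShell sessions with the subscription from then on.
Import-AzurePublishSettingsFile "C:\Downloads\MySubscription.publishsettings"
```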

For a really great write-up on setting up and using PowerShell for Windows Azure, check out Michael Washam’s excellent article HERE.

Create an Azure Storage Account

If you have already created a virtual machine in Windows Azure, then you already have a storage account and container that you can use to hold your disks.  But if you haven’t already done this, you will want to go into your portal and create one.

At the bottom of the portal, click “+ New”, and then choose Data Services –> Storage –> Quick Create


You'll give your storage account a unique name, choose a geographic location, and then create it.

Once it’s created, select the new storage account and create a new “Blob Container” by selecting the CONTAINERS tab, and then clicking “CREATE A BLOB CONTAINER”.


Note the URL.  Copy it to the clipboard or otherwise keep it handy.  This URL will be used when we upload our VHD.
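Incidentally, since you already have the Windows Azure cmdlets installed, the storage account itself can also be created from PowerShell.  A sketch, using hypothetical names and assuming the cmdlets in the current (PREVIEW-era) release:

```powershell
# The storage account name must be globally unique and all lowercase.
New-AzureStorageAccount -StorageAccountName "kevremdiskstorage" -Location "East US"

# Verify that the account was created.
Get-AzureStorageAccount -StorageAccountName "kevremdiskstorage"
```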

Upload the Hard Disk into Windows Azure Storage Container

“Kevin… you also mentioned that we’ll need some tool to do the actual uploads.”

That’s right.  Until recently, the only tool provided by Microsoft for doing this was the “csupload” tool, a command-line utility that is installed with the Windows Azure SDK.  (Windows Azure Tools: http://www.windowsazure.com/en-us/develop/downloads/ – But don’t install it just yet… it installs much more than you need to complete this exercise.)

Once the SDK is installed, and you have the SubscriptionID and the certificate thumbprint for your connection, you open the Windows Azure Command Prompt and use the csupload command in two steps: first to set up the connection, and then to do the upload.  The article Creating and Uploading a Virtual Hard Disk that Contains the Windows Server Operating System describes how to use the csupload tool.

All that said… DON’T DO IT!  Unless you’re a developer, the Windows Azure SDK is much more than you need!

“So what’s the alternative, Kevin?”

PowerShell!  Yes… you already have the Windows Azure PowerShell cmdlets installed, so now you’re going to use two of them: Add-AzureVhd and Add-AzureDisk.

Add-AzureVhd does the upload.  This is the one that takes a LONG TIME to run (depending on the size of your .VHD and your upstream connection speed).  The result is a new page blob object up in your storage.

Add-AzureDisk essentially tells Windows Azure to treat that new blob as a .VHD file that has a bootable operating system in it.  Once that’s done, you can go into the Windows Azure Portal, create a new machine, and see your disk as one of the machine disks available.

So in my example, with a fresh, sysprepped, fixed-disk (10GB) .VHD installation of Windows Server 2012, I run these two commands:

Add-AzureVhd -Destination http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -LocalFilePath d:\SmallTestServer.vhd

Add-AzureDisk -DiskName SmallTestServer -MediaLocation http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -OS Windows

(Of course, the first one takes quite a while for me.  About 13 hours.  Ugh.)

“Hey Kevin… what if I want to use and re-use that image as the basis for multiple machines?”

Excellent question!  And the good news is that, instead of using Add-AzureDisk, you use the Add-AzureVMImage cmdlet to tell Windows Azure that the disk should be made available as a re-usable image.  Like this:

Add-AzureVMImage -ImageName Server2012Eval -MediaLocation http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -OS Windows

Once that’s done, instead of just having a disk to use once for a new machine, I have a starting-point for one or more machines.

Create the Machine

In the portal it’s really no more complex than creating a new machine from the gallery:


Your disk should show up towards the bottom of the list.  Select it, and build your machine.

Once created, you should be able to start it as if it were any other machine built from a previously installed disk.

If you chose to add your disk as an image in the repository, then you could also create the machine using QUICK CREATE, because the image is now available for you to use and re-use.
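In fact, once the image is registered, you don't even need the portal to build the machine.  A sketch using the New-AzureQuickVM cmdlet against the image registered earlier; the service name, VM name, and password here are made-up examples, and the parameter set is subject to change during the PREVIEW period:

```powershell
# Creates a cloud service and a new Windows VM from the custom image in one shot.
New-AzureQuickVM -Windows `
    -ServiceName "kevremtestsvc" `
    -Name "SmallTestServer01" `
    -ImageName "Server2012Eval" `
    -Password "SomeStr0ngPassw0rd!" `
    -Location "East US"
```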

---

Other Errata

As long as we’re discussing working with Windows Azure Storage, here are a couple of tools that make it easier to manage, navigate, and upload/download items in your storage cloud:

Both have free trials, and aren’t really all that expensive.  I’ve had mixed results with them, and you have to be careful that you’re creating “page blobs” and not “block blobs”.  With a slow upload connection, these tools can also be rather fragile.  The benefit: both of them let you configure a connection to your Windows Azure subscription and multiple storage accounts in order to upload and download your .VHD files.  For our purposes, they will do what the Add-AzureVhd cmdlet did for us, plus let you create and manage storage containers.  You’ll still need to run the Add-AzureDisk and Add-AzureVMImage commands to configure your disks for use.

(Major kudos to Joerg of ClumsyLeaf Software (makers of CloudXplorer), who answered my support questions in a matter of minutes!  And on a Saturday, no less!)

---

What do you think?  Are you going to try this out?  At the very least I hope that this article helps you get PowerShell configured for working with your Windows Azure objects.  Give us your questions or feedback in the comments.


More Stories By Kevin Remde

Kevin is an engaging and highly sought-after speaker and webcaster who has landed several times on Microsoft's top 10 webcast list, and has delivered many top-scoring TechNet events and webcasts. In his past outside of Microsoft, Kevin has held positions such as software engineer, information systems professional, and information systems manager. He loves sharing helpful new solutions and technologies with his IT professional peers.

A prolific blogger, Kevin shares his thoughts, ideas and tips on his “Full of I.T.” blog (http://aka.ms/FullOfIT). He also contributes to and moderates the TechNet Forum IT Manager discussion (http://aka.ms/ITManager), and presents live TechNet Events throughout the central U.S. (http://www.technetevents.com). When he's not busy learning or blogging about new technologies, Kevin enjoys digital photography and videography, and sings in a band. (Q: Midlife crisis? A: More cowbell!) He continues to challenge his TechNet Event audiences to sing Karaoke with him.
