Step-by-Step: Build Linux VMs in the Cloud with Windows Azure

Windows Azure Virtual Machines provides Linux and Windows VM support

The Windows Azure Infrastructure as a Service (IaaS) offering supports running Windows virtual machines and Linux virtual machines in the Cloud. In this article, I provide step-by-step guidance for running a new Linux virtual machine in the cloud using our Windows Azure platform.

Linux?

That’s right, Linux!

Windows Azure runs Linux VMs as a first-class citizen on our cloud platform, with support in the Preview offering for four common Linux distributions:

  • SUSE Linux Enterprise Server (SLES) 11 SP2
  • OpenSUSE 12.1
  • CentOS 6.3
  • Ubuntu 12.04.1 and 12.10

The best news is … you can try this out for FREE by signing up for a FREE Windows Azure 90-day Trial of our Virtual Machine Preview.

NOTE: When activating your FREE Trial for Windows Azure, you will be prompted for credit card information.  This information is used only to validate your identity and your credit card will not be charged, unless you explicitly convert your FREE Trial account to a paid subscription at a later point in time.

What about custom Linux images?

If the standard Linux platform images don’t meet your exact needs, there are several options for leveraging or customizing other images for Windows Azure VMs …

  • We’ve recently announced our own Open Source VM Depot for Linux images for Windows Azure VMs. 
  • SUSE Studio provides integrated Windows Azure support for deploying customized SLES and OpenSUSE images directly to Windows Azure.  You’ll need to enable support for Windows Azure deployment under your SUSE Studio Account Settings by activating “Experimental Features”  after registering on the SUSE Studio site.
  • We also provide step-by-step guidance for preparing your own Linux images for use with Windows Azure if you need further customization flexibility.
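
Once you’ve settled on an image, the provisioning that the walkthrough below performs in the portal can also be scripted with the classic Azure cross-platform command-line tool. This is only a sketch, assuming the classic `azure vm create` syntax; the DNS name, image name, and user name are hypothetical placeholders, and the commands are printed for review rather than executed here:

```shell
# Hypothetical placeholders -- substitute your own values.
DNSNAME="mysusevm"                    # becomes <DNSNAME>.cloudapp.net
IMAGENAME="example-sles11sp2-image"   # pick a real name from 'azure vm image list'
ADMINUSER="azureuser"

# Build and print the commands you would run against a live subscription.
CMD="azure vm create $DNSNAME $IMAGENAME $ADMINUSER --ssh"
echo "azure vm image list"
echo "$CMD"
```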

Step-by-Step: Build a Linux Apache Web Server VM in the Windows Azure Cloud
In this scenario, we’ll work through the process of provisioning a new Linux Apache web server running SUSE Linux Enterprise Server 11 SP2 in a virtual machine on the Windows Azure cloud platform.

  1. To prepare your Windows Azure environment, make sure you’ve first completed all steps in our Getting Started article.
  2. After you’ve provisioned a new Linux virtual machine, we’ll be configuring it via a remote Secure Shell (SSH) console session.  If you don’t already have an SSH client installed on your PC, I highly recommend PuTTY.  Download PuTTY and install it before proceeding.
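
If you prefer a command-line client to PuTTY, OpenSSH (built into most Linux and Mac systems) works just as well. A minimal sketch; the host name and port below are hypothetical placeholders for the SSH Details shown later on the virtual machine’s Dashboard page, and the connection command is printed for review rather than executed:

```shell
DNSNAME="mysusevm.cloudapp.net"   # hypothetical -- your VM's public DNS name
SSHPORT=22                        # the port shown in the Dashboard's SSH Details
CMD="ssh -p $SSHPORT azureuser@$DNSNAME"
echo "$CMD"
```
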
  3. Open Internet Explorer and browse to https://manage.windowsazure.com/ to enter the Windows Azure portal. Then, log in with your credentials.

  4. In the menu located at the bottom, select New | Compute | Virtual Machine | From Gallery to start creating a new virtual machine.

    image
    Creating a new Virtual Machine

  5. On the VM OS Selection page, click Platform Images on the left menu and select the SUSE Linux Enterprise Server 11 SP2 OS image from the list. Click the Next button to continue.

  6. On the Virtual Machine Configuration page, specify a unique Virtual Machine Name and Administrative Password to be provisioned for the new VM. 

    image
    Virtual Machine Configuration

    Note: Use strong passwords for administrative users and service accounts, as Windows Azure virtual machines can be reached from the Internet by anyone who knows their public DNS host names.  This article on the Microsoft Security website can help you choose a strong password: http://www.microsoft.com/security/online-privacy/passwords-create.aspx.

    Record the username and password information that you’ve entered above for use when remotely connecting to this new virtual machine later in this step-by-step guide.

    Click the Next button to continue.

  7. On the Virtual Machine Mode page, specify a unique public DNS host name that you’ll use to initially access this new VM remotely.  For the Storage Account and Region/Affinity Group/Virtual Network fields, select the storage account and affinity group that you previously created in the Getting Started article.

    Click the Next button to continue.

    image
    Virtual Machine Mode

    Record the public DNS hostname for testing this new virtual machine after the configuration steps are complete.

  8. On the Virtual Machine Options page, accept the default values and click the Checkmark button to begin provisioning your new virtual machine.

    image
    Virtual Machine Options

  9. You will be returned to the Virtual Machines page and your new virtual machine will be listed with a status of Starting (Provisioning) while it is being initially provisioned on the Windows Azure cloud platform.

    image
    Provisioning a New Virtual Machine

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.

  10. After the new virtual machine has finished provisioning, click on the Name of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal. This will navigate to a Dashboard page for this new virtual machine.

    image
    Virtual Machine Dashboard

  11. On the Virtual Machine Dashboard page, make note of the SSH Details.  You’ll use this information to connect to the new virtual machine remotely via an SSH client to configure the Apache web server daemon.

  12. On the Virtual Machine Dashboard page, click on the Endpoints link located in the top navigation area of this page.  This will navigate to the Endpoints page for this virtual machine, listing all firewall endpoint traffic that is currently permitted inbound to this virtual machine.

    image
    Virtual Machine Endpoints

    Note that SSH traffic has automatically been permitted inbound to this new Linux virtual machine, but no other network traffic is permitted by default.  In the next two steps, we’ll add a new endpoint to permit inbound HTTP web traffic to our virtual machine.

  13. On the Virtual Machine Endpoints page, click the +Add Endpoint button located on the bottom toolbar of this page.  This will launch the Add Endpoint wizard.

    image
    Add Endpoint Wizard

    Click the Next button to continue.

  14. On the Specify the details of the endpoint wizard page, enter a name ( web-http ), a public port ( 80 ) and a private port ( 80 ) in the respective fields. 

    image
    Specify the Details of the Endpoint

    Click the Checkmark button to provision the new endpoint.
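
The portal wizard is the simplest route, but the same endpoint can also be scripted. A sketch assuming the classic Azure command-line tool’s `azure vm endpoint create <vm-name> <public-port> <local-port>` syntax; the VM name is a hypothetical placeholder, and the command is printed for review rather than executed:

```shell
VMNAME="mysusevm"   # hypothetical -- the name of your virtual machine
CMD="azure vm endpoint create $VMNAME 80 80"
echo "$CMD"
```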

  15. On your PC, launch an SSH client, such as PuTTY, and establish a new remote SSH session to your virtual machine using the SSH details recorded in Step 11 above in the Host Name and Port fields of the SSH connection properties.

    image
    Establish a new Remote SSH Session with PuTTY

    Click the Open button to establish a remote SSH session to your virtual machine.

  16. Because this is the first time connecting to this virtual machine, you may be presented with a Security Alert dialog box prompting you to accept the host key for securing this SSH session.  

    image
    PuTTY Security Alert during First-Time SSH Connection

    Click the Yes button to accept this host key into your SSH client cache and continue with establishing the connection.

  17. In the new SSH session window, authenticate with the Username and Password information recorded in Step 6 above.

    image
    SSH Session Window

  18. In the SSH session window, enter the following command to elevate your session for performing root-level administrative commands:

    sudo su -

    When prompted, confirm your identity by re-entering the same Password used to authenticate in Step 17 above.

  19. In the SSH session window, install the packages needed for running the YaST2 setup and configuration tool by entering each of the following commands, pressing ENTER after each line:

    zypper install yast2
    zypper install yast2-ncurses
    zypper install yast2-ncurses-pkg
    zypper install yast2-qt
    zypper install yast2-packager
    zypper install yast2-runlevel
    zypper install yast2-network
    zypper install yast2-http-server

  20. In the SSH session window, launch the YaST2 setup and configuration tool by entering the following command:

    yast2

    This will launch the YaST2 Control Center menu-based setup and configuration tool.

    image
    YaST2 Control Center

  21. In the YaST2 Control Center, press ENTER on the Software menu choice and then select Software Management by pressing ENTER again. This will launch the YaST2 Software Management tool shown below.

    image
    YaST2 Software Management

  22. In the YaST2 Software Management tool, press ALT+F to change the Filter type, select Patterns using your arrow keys and press ENTER.

    image
    Selecting a Pattern Filter

  23. In the YaST2 Software Management tool, press TAB to switch your cursor focus to the Patterns List and scroll down using your arrow keys until Web and LAMP Server is highlighted.

    image
    Selecting the Web and Lamp Server Pattern Filter

  24. In the YaST2 Software Management tool, press ALT+T to open the Actions menu, then press ALT+A to select All Listed Packages, then press ALT+I to select the option to Install All.  After this is completed, a + ( plus sign ) should appear next to each of the 9 selected packages.

    image
    Selecting to Install All Packages in the Web and Lamp Server Pattern Filter

  25. In the YaST2 Software Management tool, press ALT+A to start the installation. Press ENTER when prompted for confirmation. 

    image
    Performing the Web and Lamp Server Package Installation

    Once the installation is complete, the YaST2 Control Center main menu will appear again.

  26. In the YaST2 Control Center main menu, use arrow keys to select Network Services –> HTTP Server to configure the newly installed Web server.

    image
    Selecting the HTTP Server Wizard Configuration Tool

    Press ENTER on the HTTP Server menu selection.

  27. In the HTTP Server Wizard, press F10 on each screen of the wizard to accept default configuration values and finish the wizard. 

    image
    HTTP Server Configuration Wizard

    Once the configuration process is complete, the YaST2 Control Center main menu will appear again.

  28. In the YaST2 Control Center main menu, use arrow keys to select System –> System Services (Runlevel) to start the Web server.

    image
    Selecting System Services (Runlevel) to Start Web Server

    Press ENTER on the System Services (Runlevel) menu selection.

  29. In the YaST2 System Services (Runlevel) tool, use arrow keys to select apache2 in the Service list.  Press ALT+E to enable and start the Apache HTTP daemon.

    image
    Enabling the Apache Web Daemon

  30. In the YaST2 System Services (Runlevel) tool, after the Apache Web Daemon is started, press ENTER and then ALT+O to accept the new Runlevel changes.

    image
    Accepting the Runlevel Changes

    Press ENTER to confirm that changes will be saved.
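
For readers who prefer to skip the YaST menus entirely, Steps 19 through 30 can be compressed into a few shell commands in the elevated SSH session. The pattern and service names below are my best guess for SLES 11 SP2 (the “Web and LAMP Server” pattern is typically named lamp_server, and SUSE ships the rcapache2 convenience script); the commands are printed here for review rather than executed:

```shell
# Equivalent non-interactive setup, as run in the elevated SSH session.
CMDS="zypper install -t pattern lamp_server
chkconfig apache2 on
rcapache2 start"
echo "$CMDS"
```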

We’re done! Let’s test it!

After the above steps are completed, the Apache Web Daemon will be running on SUSE Linux Enterprise Server 11 SP2 in a virtual machine on the Windows Azure cloud platform.

  • Test the new virtual machine by opening a new browser window and navigating to:

    http://<public_DNS_hostname>.cloudapp.net

    Use the public DNS hostname value recorded in Step 7 above.

    image
    Testing the Linux HTTP Web Server VM
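
From any machine with curl installed, the same test can be run from the command line. A minimal sketch; the DNS name is a hypothetical placeholder for the value you recorded in Step 7, and the request itself is left commented out so the snippet is safe to copy:

```shell
DNSNAME="mysusevm"   # hypothetical -- your public DNS host name from Step 7
URL="http://${DNSNAME}.cloudapp.net/"
echo "$URL"
# curl -sI "$URL"    # expect an HTTP 200 status line once Apache is serving
```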

What’s Next? Keep Building!

Learn more about Windows Azure Virtual Machines with FREE online training.

In addition to running Linux virtual machines on the Windows Azure cloud platform, did you know that Windows Server 2012 Hyper-V and our FREE Hyper-V Server 2012 products support running Linux virtual machines in your on-premises data center?

  • Do It: Learn more about these products at:
    • Build your Windows Server 2012 Lab
    • Join our FREE Early Experts Virtualizer Knowledge Quest
    • Build a FREE Hyper-V Server 2012 cluster
    • Download Linux Integration Services for Hyper-V

Which Linux workloads are you planning to virtualize?

Feel free to leave your comments below with your thoughts, questions and ideas for virtualizing Linux workloads on the Windows Azure cloud platform, Windows Server 2012 and our FREE Hyper-V Server 2012.

Keith

Build Your Lab! Download Windows Server 2012
Don’t Have a Lab? Build Your Lab in the Cloud with Windows Azure Virtual Machines
Want to Get Certified? Join our Windows Server 2012 "Early Experts" Study Group

More Stories By Keith Mayer

Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects, in diverse roles, such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!
