
Step-by-Step: Tired of Tapes? Backup SQL Databases to the Cloud

New feature in SQL Server 2012 SP1 CU2 provides native backup to Windows Azure cloud storage

I think every IT Pro I’ve ever met hates tape backups … but having an offsite component in your backup strategy is absolutely necessary for effective disaster recovery.  One of the new features provided in SQL Server 2012 Service Pack 1 Cumulative Update 2 is the ability to back up SQL databases and logs to Windows Azure cloud storage using native SQL Server Backup, via both Transact-SQL (T-SQL) and SQL Server Management Objects (SMO).

Backup to cloud storage is a natural fit for disaster recovery, as our backups are offsite the moment they complete.  And the pay-as-you-go economics of cloud storage make it very cost-effective – Windows Azure storage costs less than $100/TB per month for geo-redundant storage, based on published pricing as of this article’s date.  That’s less than the cost of a couple of SDLT tapes! You can check out the current pricing model for Windows Azure Storage on our Price Calculator page.

In this article, I’ll step through the process of using SQL Server 2012 SP1 CU2 native backup capabilities to create database backups on Windows Azure cloud storage.

How do I get started?
To get started, you’ll need a Windows Azure subscription.  Good news! You can get a FREE 90-Day Windows Azure subscription to follow along with this article, evaluate, and test … this subscription is 100% free for 90 days, and there’s absolutely no obligation to convert to a paid subscription.

  • DO IT: Sign up for a Free 90-Day Windows Azure Subscription

    NOTE: When activating your FREE 90-Day Subscription for Windows Azure, you will be prompted for credit card information.  This information is used only to validate your identity; your credit card will not be charged unless you explicitly convert your FREE Trial account to a paid subscription at a later date.

You’ll also need to download Cumulative Update 2 for SQL Server 2012 Service Pack 1 and apply that to the SQL Server instance with which you’ll be testing.

Don’t have a SQL Server 2012 instance in your data center that you can test with?  No problem! You can spin up a SQL Server 2012 VM in the Windows Azure Cloud using your Free 90-Day subscription.

Let’s grab some cloud storage
Once you’ve got your Windows Azure subscription activated and your SQL Server 2012 lab environment patched with SP1 CU2, you’re ready to provision some cloud storage that can be used as a backup location for SQL databases …

  1. Launch the Windows Azure Management Portal and log in with the credentials used when activating your FREE 90-Day Subscription above.
  2. Click Storage in the left navigation pane of the Windows Azure Management Portal.

    Windows Azure Management Portal – Storage Accounts
  3. On the Storage page of the Windows Azure Management Portal, click +NEW on the bottom toolbar to create a new storage account location.

    Creating a new Windows Azure Storage Account location
  4. Click Quick Create on the New > Storage popup menu and complete the fields as listed below:

    - URL: XXXbackup01 (where XXX represents your initials in lowercase)

    - Region / Affinity Group: Select an available Windows Azure datacenter region for your new Storage Account. 

    NOTE: Because you will be using this Storage Account location for backup / disaster recovery scenarios, be sure to select a datacenter region that is not near you, for additional protection against regional disasters that could affect your entire local area.

    Click the Create Storage Account button to create your new Storage Account location.
  5. Wait for your new Storage Account to be provisioned. 

    Provisioning new Windows Azure Storage Account

    Once the status of your new Storage Account shows as Online, you may continue with the next step.
  6. Select your newly created Storage Account and click the Manage Keys button on the bottom toolbar to display the Manage Access Keys dialog box.

    Manage Access Keys dialog box

    Click the copy button located next to the Secondary Access Key field to copy this access key to your clipboard for later use.
  7. Create a container within your Windows Azure Storage Account to store backups.  Click on the name of your Storage Account on the Storage page in the Windows Azure Management Portal to drill into the details of this account, then select the Containers tab located at the top of the page.

    Containers tab within a Windows Azure Storage Account

    On the bottom toolbar, click the Add Container button to create a new container named “backups”.

You’ve now completed the provisioning of your new Windows Azure storage account location.

We’re ready to back up to the cloud
When you’re ready to test a SQL database backup to the cloud, launch SQL Server Management Studio and connect to your SQL Server 2012 SP1 CU2 database engine instance.  After you’ve done this, proceed with the following steps to complete a backup …

  1. In SQL Server Management Studio, right-click the database you wish to back up in the Object Explorer pane and select New Query.

    SQL Server Management Studio
  2. In the new SQL Query Window, execute the following Transact-SQL code to create a credential that can be used to authenticate to your Windows Azure Storage Account with secure read/write access:

    CREATE CREDENTIAL myAzureCredential
    WITH IDENTITY='XXXbackup01',              -- your Storage Account name
    SECRET='<your_storage_access_key>';       -- the Access Key copied earlier

    Prior to running this code, be sure to replace XXXbackup01 with the name of your Windows Azure Storage Account created above, and paste the Access Key you previously copied to your clipboard in place of the SECRET placeholder (a quick way to confirm the credential exists is shown after these steps).

  3. In the SQL Query Window, execute the following Transact-SQL code to perform the database backup to your Windows Azure Storage Account:

    BACKUP DATABASE database_name
    TO URL = 'https://XXXbackup01.blob.core.windows.net/backups/database_name.bak'
    WITH CREDENTIAL='myAzureCredential', STATS = 5;

    Prior to running this code, be sure to replace XXXbackup01 with the name of your Windows Azure Storage Account, and replace database_name (in both the URL and the command) with the name of your database.

    Upon successful execution of the backup, you should see SQL Query result messages similar to the following:

    Successful Backup Results
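
Two optional follow-ups, sketched here with the same illustrative names used above (the myAzureCredential credential, the XXXbackup01 storage account, and the "backups" container). First, you can confirm that the credential created in Step 2 exists by querying the sys.credentials catalog view. Second, the same TO URL syntax also covers transaction log backups, provided the database uses the full or bulk-logged recovery model.

    -- Quick check: list credentials defined on this instance
    SELECT name, credential_identity, create_date
    FROM sys.credentials;

    -- Transaction log backup to the same container (illustrative blob name)
    BACKUP LOG database_name
    TO URL = 'https://XXXbackup01.blob.core.windows.net/backups/database_name_log.trn'
    WITH CREDENTIAL='myAzureCredential', STATS = 5;

As before, replace XXXbackup01 and database_name with your own values before running.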

What about restoring?
Restoring from the cloud is just as easy as backing up … to restore, we can use the following Transact-SQL syntax:
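
A minimal sketch, assuming the same illustrative storage account, container, credential name, and backup file name used in the backup example above:

    -- Restore the database from the backup blob created earlier
    RESTORE DATABASE database_name
    FROM URL = 'https://XXXbackup01.blob.core.windows.net/backups/database_name.bak'
    WITH CREDENTIAL='myAzureCredential', STATS = 5;

As with the backup, replace XXXbackup01, database_name, and the blob name with your own values; add REPLACE to the WITH clause if you are overwriting an existing database.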


Thoughts? Comments? Feedback?
What are your thoughts around leveraging the cloud for backup storage? Feel free to post your comments, questions and feedback below.


Build Your Lab! Download Windows Server 2012
Don’t Have a Lab? Build Your Lab in the Cloud with Windows Azure Virtual Machines
Want to Get Certified? Join our Windows Server 2012 "Early Experts" Study Group

More Stories By Keith Mayer

Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects, in diverse roles, such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!
