Data Security Using SQL Azure

One of the major concerns in using SQL Azure is the security of sensitive data such as credit card numbers, Social Security numbers, salaries, and bonuses. The degree to which data needs to be protected is for each business entity to determine, but on-site data is generally regarded as more secure than data stored in the cloud.
This is a simple example of using SQL Server Integration Services (SSIS) and SQL Server Reporting Services tools to keep sensitive data on-site while the rest is stored in the cloud.
We start off with this scenario: the fictitious company SecureAce wants to place one of its Employees tables on SQL Azure, but it does not want sensitive information such as employee salaries to leave the premises. From time to time, however, it needs to generate a report of employees and their salaries for management.
The solution to this scenario is divided into two parts.
In the first part, the on-site data in the Employees table is partitioned in such a way that the sensitive information stays on-site while the larger, non-sensitive portion is stored on SQL Azure.
In the second part, SSIS is used to bring the two pieces of data together and load an on-site Access database, which is used as a front end for reporting information to management, an entirely realistic way of managing data. Although a Microsoft Access database is used here, any other destination handled by SSIS, such as another SQL Server database, would serve equally well. MS Access was chosen because it is a very common product in many small businesses.
It may be noted, however, that Microsoft now supports connecting MS Access to SQL Azure directly: with the availability of Microsoft Office Professional Plus 2010, the author was able to connect to SQL Azure directly using an ODBC connection.
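For readers who want to try the direct route, a SQL Azure ODBC connection string generally takes the shape sketched below. The server, database, and credential values are placeholders, the driver name depends on what is installed on the client machine, and SQL Azure expects the login in the user@server form:

Driver={SQL Server Native Client 10.0};Server=tcp:yourserver.database.windows.net;
Database=Bluesky;Uid=yourlogin@yourserver;Pwd=yourpassword;Encrypt=yes;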
Splitting the data and uploading to SQL Azure
This is a preparation for the SSIS task that follows. We will be using the Northwind database's Employees table and splitting it into two parts, each containing different columns: a vertical partition. One part, containing the salary information of the employees, will remain on-site in a local database named VerticalPart; the other, which is loaded to SQL Azure, will contain most of the other information. In the Northwind database the Employees table does not have a salary column, so an extra column will be added for this simulation. The procedure is described in the following steps.
·         Create a table named Employees in the VerticalPart database using the following statement:
CREATE TABLE [dbo].[Employees](
[EmployeeID] [int] NOT NULL, -- key used later to merge the two halves
[LastName] [nvarchar](20) NOT NULL,
[FirstName] [nvarchar](10) NOT NULL,
[HomePhone] [nvarchar](24) NULL,
[Extension] [nvarchar](4) NULL,
[Salary] [money] NULL
)
·         Use the Import/Export Wizard to populate the columns (except Salary) of the above table from Northwind's Employees table
·         Modify the table by adding a salary for each employee
There are only a few employees, so this should not be a problem. When you save the table, you may not be able to do so unless you have cleared the 'Prevent saving changes that require table re-creation' option under Tools | Options | Designers in SSMS. You will get a confirmation after you save the Employees table, as shown.
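Alternatively, the salaries can be entered with plain T-SQL instead of the table designer. A minimal sketch, with made-up salary figures, since Northwind contains no salary data:

UPDATE [dbo].[Employees] SET [Salary] = 48000 WHERE [EmployeeID] = 1
UPDATE [dbo].[Employees] SET [Salary] = 65000 WHERE [EmployeeID] = 2
-- ...and so on for the remaining employees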

Now run a SELECT query to verify that the salary column has been populated as shown.

Copy the script for Northwind's Employees table and modify it by changing the table name and removing some columns, resulting in the following statement:

CREATE TABLE [dbo].[AzureEmployees](
[EmployeeID] [int] NOT NULL PRIMARY KEY CLUSTERED, -- SQL Azure requires a clustered index on every table
[LastName] [nvarchar](20) NOT NULL,
[FirstName] [nvarchar](10) NOT NULL,
[Title] [nvarchar](30) NULL,
[TitleOfCourtesy] [nvarchar](25) NULL,
[HireDate] [datetime] NULL,
[Address] [nvarchar](60) NULL,
[City] [nvarchar](15) NULL,
[Region] [nvarchar](15) NULL,
[PostalCode] [nvarchar](10) NULL,
[Country] [nvarchar](15) NULL
)
Note that the table name has been changed to AzureEmployees. This is the table that will be stored in the Bluesky database on SQL Azure.
Log in to SQL Azure and create the table in the Bluesky database by running the above CREATE TABLE statement.
The table will be created with the above schema, which you may verify in the Object Browser.

Use the Import and Export Wizard to populate the columns of AzureEmployees with data from Northwind. Use the query option to move data from source to destination with the following query.
SELECT EmployeeID, LastName, FirstName,
Title, TitleOfCourtesy, HireDate,
Address, City, Region, PostalCode,
Country
FROM [dbo].[Employees]
Save the query results to the AzureEmployees table you created earlier as shown. 

Follow the wizard's steps to review the data mapping, as shown

Complete the wizard steps as shown.

Verify data in AzureEmployees in Bluesky database on SQL Azure by running a SELECT statement.
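A query such as the following will do:

SELECT EmployeeID, LastName, FirstName, Country FROM [dbo].[AzureEmployees]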
By following the above steps we have created two tables, one on-site and the other on SQL Azure.
Although the data transformation of the string data types did not raise any errors here, strings longer than 8,000 characters can cause problems when the columns are of type varchar(max) or text. In those cases, simply change them to nvarchar(max) to overcome the problem.
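As a sketch, assuming a hypothetical Notes column of type text in a source table (both names are placeholders), the change would look like this:

ALTER TABLE [dbo].[SourceTable]
ALTER COLUMN [Notes] nvarchar(max)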
Merging data and loading an Access database
In this section we will reconstruct the Employees table on-site by retrieving data from SQL Azure as well as from SQL Server's VerticalPart database and merging the two streams. After merging them, we will place the rows in an MS Access database so that simple reports can be authored. The logic of the merge is sketched below.
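If both halves lived on the same server, the reconstruction would be the single T-SQL query sketched here; because they live on different servers, the package performs the equivalent left outer join inside the SSIS data flow instead:

SELECT a.EmployeeID, a.LastName, a.FirstName, a.Title,
a.TitleOfCourtesy, a.HireDate, a.Address, a.City,
a.Region, a.PostalCode, a.Country,
e.HomePhone, e.Extension, e.Salary
FROM [dbo].[AzureEmployees] a
LEFT OUTER JOIN [dbo].[Employees] e
ON a.EmployeeID = e.EmployeeID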
In order to do this we take the following steps.
  1. Open Business Intelligence Development Studio (BIDS) from its shortcut.
  2. Create an Integration Services project, providing a name for it, and change the default name of the package file.
The project folder should appear as shown in the next image, with the project name and package name as provided.

  3. Drag and drop a Data Flow task onto the Control Flow tabbed page of the package designer surface.
  4. In the Connection Managers pane at the bottom, configure one connection manager each for the SQL Azure database, the VerticalPart database on SQL Server 2008, and an MS Access database, as shown.

The next image shows the details of the connection manager Hodentek3\KUMO.VerticalPart. Note that the SqlClient Data Provider is used. The SQL Server instance Hodentek3\KUMO is configured for Windows Authentication.

This next image shows the connection for the Bluesky database on SQL Azure. The authentication information is the same as you have used so far; if it is correct, you should be able to see the available databases.

  5. Create an MS Access database (Access 2003 format) and use it for this connection.
Later we will also create a table in this database to receive the merged fields from SQL Azure and the on-site server.
For this connection manager we use the following settings, verified by clicking the Test Connection button:
Provider:                 Native OLE DB\Microsoft Jet 4.0 OLE DB Provider
Database file is at:  C:\Users\Jay\AccessSQLAzure.mdb
User name:              Admin
Password:               <empty>

It is assumed that the reader has some familiarity with using SSIS. For beginners, the author recommends his own book on SSIS.
Each of the above connections can be tested using its Test Connection button.
Merging columns from SQL Azure and SQL Server
You will use two ADO.NET Source data flow components, one each for SQL Azure and SQL Server; their outputs will be merged.
  1. Add two ADO.NET data flow sources to the Data Flow tabbed designer pane.
  2. Rename the source components From SQL Azure Database and From SQL Server 2008 Database.

  3. Configure the ADO.NET Source Editor connected to SQL Azure to display the following, as shown in the next image.
ADO.NET Connection manager:
Data access mode: Table or view
Name of the table or view: "dbo"."AzureEmployees"
You must use the server name appropriate for your SQL Azure instance.
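For reference, the underlying ADO.NET connection string for SQL Azure has roughly the following shape (the names are placeholders; SQL Azure expects the login in user@server form over an encrypted connection):

Server=tcp:yourserver.database.windows.net;Database=Bluesky;
User ID=yourlogin@yourserver;Password=yourpassword;Encrypt=True;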

Configure as shown, and you should be able to view the data in this table with the Preview… button.

  4. Configure the ADO.NET Source Editor connected to SQL Server to display the following, as shown in the next image.
The details for the From SQL Server 2008 Database source are as follows:
ADO.NET Connection manager: Hodentek3\KUMO.VerticalPart
Data access mode: Table or view
Name of the table or view: "dbo"."Employees"

Again you should be able to view the data in this table with the Preview… button.
Sorting the outputs of the sources
Since the data coming out of the sources is not sorted, both outputs must be sorted, and sorted identically, before they can be merged.
  1. Drag and drop two Sort data flow transformations from the Toolbox onto the design surface, just below the ADO.NET data sources.
  2. Start with the one that is going to receive its input from the From SQL Azure Database source control.
  3. Click From SQL Azure Database and drag its dangling green line onto the Sort control below it, as shown.

  4. Double-click the Sort control to display the Sort Transformation Editor and place a check mark for EmployeeID, as shown.

  5. Repeat the same procedure for the From SQL Server 2008 Database source. Now we have two Sort controls receiving their inputs from the two source controls, with their outputs sorted.
  6. Drag and drop a Merge Join data flow transformation from the Toolbox onto the design surface.
  7. Click the Sort transformation on the left (connected to From SQL Azure Database) and drag its dangling green line onto the Merge Join transformation.
The Input Output Selection window will be displayed as shown.

  8. Select Merge Join Left Input and click OK.
  9. Repeat the same for the other Sort on the right (this time select Merge Join Right Input).
The Merge Join now takes the outputs of the two Sort controls and produces a single merged output.
You still need to configure the Merge Join.
  10. Double-click Merge Join to open the Merge Join Transformation Editor page, as shown.
Read the instructions on this window.

  11. Place a check mark for EmployeeID in both of the Sort lists shown in the top pane. The bottom pane gets populated with input columns and output aliases. Make sure the join type is Left outer join, as in the above image (use the drop-down handle if needed).
We could add a Data Viewer to each flow path to monitor the data at run time by momentarily pausing the flow downstream; we skip this diagnostic step here.
Porting output data from Merge Join to an MS Access Database
We will be using the merged data from the two sources to fill up a table in an MS Access 2003 database. 
  1. In the MS Access database you created while setting up the connection managers, create a table named Salary Report with the design parameters shown in the next image.

  2. Drag and drop an OLE DB Destination component from the Toolbox onto the package designer pane, just underneath the Merge Join component.
  3. Drag the dangling green line from Merge Join onto the OLE DB Destination component.
  4. Double-click the OLE DB Destination component to open its editor and fill in the details as follows:
OLEDB connection manager:   AccessSQLAzure
Data access mode:                     Table or View
Name of the table or view:        Salary Report

  5. Click Mappings to verify that all the columns are present.
  6. Build the project and execute the package.
The package elements turn yellow and then green, indicating a successful run.
You can verify the transferred values in the table in the Access database; it should have all the merged columns from the two databases. Note that in the image the columns have been rearranged to bring the Salary column into view.
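As a quick check, a query along these lines can be run inside Access (Jet SQL; the table name is bracketed because it contains a space, and the columns assume the mappings shown above):

SELECT EmployeeID, LastName, FirstName, Salary
FROM [Salary Report]
ORDER BY Salary DESC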

This is an excerpt from Chapter 6 of my book Microsoft SQL Azure Enterprise Application Development, published by Packt Publishing.



