
Windows 8 Notifications: Using Azure for Periodic Notifications

At the end of my last post, I put in a plug for using Windows Azure to host periodic notification templates, so I’ll use this opportunity to delve into a bit more detail. If you want to follow along and don’t already have an Azure subscription, you can get a free 90-day trial account on Windows Azure in minutes.

The Big Picture

The concept of periodic notifications is a simple one: it’s a publication/subscription model. Some external application or service creates and exposes a badge or tile XML template for the Windows 8 application to consume on a regular cadence. The only insight the Windows 8 application has is the public HTTP/HTTPS endpoint that the notification provider exposes. What’s in the notification and how often it’s updated is completely within the purview of the notification provider. The Windows 8 application does need to specify the interval at which it will check for new content – selecting from a discrete set of values from 30 minutes to a day – and that should logically correspond to the frequency at which the notification content is updated, but there is no strict coupling of those two events.

Periodic notification workflow for Chez Contoso

Take for instance a Windows 8 application for Chez Contoso, a trendy local restaurant that displays its featured entrée on a daily basis within a Start screen tile, trying to lure you in for a scrumptious dinner. Of course, the chef's spotlight offering will vary daily, so it's not something that can be packaged up in the application itself; furthermore, the eatery may want some flexibility in the style of notification it provides – mixing it up across the several dozen tile styles to keep the app's presence on the Start screen looking fresh and alive.

Early each morning then, the restaurant management merely needs to update a public URI that returns the template with information on the daily special. The users of the restaurant’s application, which taps into that URI, will automatically see the updates, whether or not they even run the application that day.
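For concreteness, the XML behind that URI might be a stock tile template along these lines (a hypothetical payload: TileWideText01 is one of the standard Windows 8 wide tile templates, and the menu text is invented):

    <tile>
      <visual>
        <binding template="TileWideText01">
          <text id="1">Chez Contoso</text>
          <text id="2">Tonight's special: Coq au Vin</text>
          <text id="3">Served with roasted fingerling potatoes</text>
        </binding>
      </visual>
    </tile>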

Using Windows Azure Storage

Since the notification content that needs to be hosted is just a small snippet of XML, a simple approach is to leverage Windows Azure blob storage, which can expose all of its contents via HTTP/HTTPS GET requests to a blob endpoint URI.

In fact, if Chez Contoso owned the Windows Azure storage account, the restaurant manager could simply copy the XML file over to the cloud (using a file copy utility like CloudBerry Explorer), and with an access policy permitting public read access to the blob container, the Windows 8 application would merely need to register the URI of that blob in the call to startPeriodicUpdate (along with a polling period of a day and an initial poll time of 9 a.m.). There would be no requirement (or cost) for an explicit cloud service per se: the Windows Azure Storage service fills that need by supporting update and read operations on cloud storage.
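For illustration, that client-side registration might look like this in a C# Windows Store app (the blob URL is a placeholder; in JavaScript apps the same API appears camelCased as startPeriodicUpdate):

    using System;
    using Windows.UI.Notifications;

    // get the updater for the app's Start screen tile
    var updater = TileUpdateManager.CreateTileUpdaterForApplication();

    // placeholder URI of the publicly readable blob holding the tile XML
    var tileUri = new Uri("http://chezcontoso.blob.core.windows.net/tiles/dinner.xml");

    // first poll shortly after 9 a.m., then once a day thereafter
    var firstPoll = new DateTimeOffset(DateTime.Today.AddHours(9).AddMinutes(15));
    updater.StartPeriodicUpdate(tileUri, firstPoll, PeriodicUpdateRecurrence.Daily);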

Unfortunately, there is a catch! By default, the expiration period for a tile is three days; furthermore, if the client machine is not connected to the network at the time the poll for the update is made, that request is skipped. In the Chez Contoso scenario, this could mean that the daily dinner special for Monday might be what the user sees on his or her Start screen through Wednesday – not ideal for either the restaurant or the patron.

  • The good news is that to ensure content in a tile is not stale, an expiration time can be provided via a header in the HTTP response that returns the XML template payload; upon expiry, the tile will be replaced by the default image specified in the application manifest.
  • The bad news is that you can't set HTTP headers when using Azure blob storage alone; you need a service intermediary to serve up the XML and set the appropriate header… enter Windows Azure Cloud Services.

Using Windows Azure Cloud Services

A Windows Azure service for a periodic notification can be pretty lightweight – it need only read the XML file from Windows Azure storage and set the X-WNS-Expires header that specifies when the notification expires. Truth be told, you may not need a cloud storage account since the service could generate the XML content from some other data source. In fact, that same service could expose an interface or web form that the restaurant management uses to supply content for the tile. To keep this example simple and focused though, let’s assume that the XML template is stored directly in Windows Azure storage at a known URI location; how it got there is left to the reader :).

With Windows Azure there are three primary ways to set up a service:

  • Windows Azure Web Sites, a low-cost and quick option for setting up simple yet scalable services.

  • Windows Azure Virtual Machines, an Infrastructure-as-a-Service (IaaS) offering, enabling you to build and host a Linux or Windows VM with whatever service code and infrastructure you need.

  • Windows Azure Cloud Services, a Platform-as-a-Service (PaaS) offering, through which you can deploy a Web Role as a service running within IIS.

I previously covered the provisioning of Web Sites in my blog post on storing notification images in the cloud, so this time I’ll go with the Platform-as-a-Service offering using Windows Azure Cloud Services and ASP.NET.

Getting Set Up for Azure Development

If you haven’t used Windows Azure before, now’s a great time to get a 90-day free, no-risk trial.

As far as tooling goes, you can use Visual Studio 2012, Visual Studio 2010, or Visual Web Developer Express. I’m using Visual Studio 2012 for the rest of this post, but the experience is quite similar for Visual Studio 2010 and Visual Web Developer Express.

Since the service will be developed in ASP.NET, you'll also want to download the Windows Azure SDK for .NET specifically (via the Web Platform Installer). In fact, if you don't have Visual Studio already installed, this will give you Visual Web Developer Express automatically.

Creating the Service

In Visual Studio, create a new Solution (File>New Project…) of type Cloud:

Creating a new Cloud Service solution

When you create a new Cloud Service, you get the option to create a number of other related projects, each corresponding to a Web or Worker Role that is deployed under the umbrella (and to the same endpoint) of that Cloud Service. Web and Worker Roles typically coordinate with other Azure services like storage and the Service Bus to provide highly scalable, decoupled, and asynchronous implementations. In this simple scenario, you need only a single Web Role, which will host a simple service.

Adding a Web (Service) Role

For the type of ASP.NET MVC project, select Web API, a convenient and lightweight option for building a RESTful service:

Creating a Web API ASP.NET MVC application

Coding the Service

At a minimum, the service needs to do two things:

  1. return the tile XML content stored at a predetermined Windows Azure blob URI, the same one that is updated by the restaurant every morning at 9 a.m.
  2. set the X-WNS-Expires header to the time at which the requested tile should no longer be shown to the user. In this case, let’s assume the restaurant stops serving the special at midnight each day, and you don’t want to show the tile after that time.
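Put concretely, a successful response from the service will look something like this (header values and payload are illustrative):

    HTTP/1.1 200 OK
    Content-Type: text/xml
    X-WNS-Expires: Tue, 06 Nov 2012 05:00:00 GMT

    <tile>
      <visual>
        <binding template="TileWideText01"> ... </binding>
      </visual>
    </tile>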

To create the API for returning the XML content, add a new empty API Controller called Tiles on the ASP.NET MVC 4 project (or you can just modify the ValuesController that gets automatically created):

Creating a new API controller

New API Controller dialog

In that controller, you'll need a single Get method (corresponding to the HTTP GET request that the Windows 8 application will make on a regular basis to get the XML tile content). That Get method will accept a parameter indicating the name of the blob on Windows Azure that contains the XML. For example, a URL like

http://yourservice.cloudapp.net/api/tiles/chezcontoso_dinner

(with a placeholder service name) will access the blob called dinner in a Windows Azure blob storage container called chezcontoso. The api/tiles segment of the URL is controlled by the default ASP.NET Web API route, registered at application startup from the project's global.asax, so it too can be modified if you like.
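For reference, the relevant default route in a new ASP.NET MVC 4 Web API project looks like this (this is the stock template code from App_Start/WebApiConfig.cs, shown only to make the URL mapping concrete):

    using System.Web.Http;

    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            // maps /api/tiles/chezcontoso_dinner to TilesController.Get("chezcontoso_dinner")
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }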

The complete code for that Get method follows and is also available as a Gist on GitHub for easier viewing and cutting-and-pasting. Note, you will need to supply your own storage account credentials in Line 5ff (see below for a primer on setting up your storage account).

   1:  // /api/tiles/<tilename>
   2:  public HttpResponseMessage Get(String id)
   3:  {
   4:      // set cloud storage credentials
   5:      var client = new Microsoft.WindowsAzure.StorageClient.CloudBlobClient(
   6:          "http://YOUR_STORAGE_ACCOUNT_NAME.blob.core.windows.net",
   7:          new Microsoft.WindowsAzure.StorageCredentialsAccountAndKey(
   8:              "YOUR_STORAGE_ACCOUNT_NAME",
   9:              "YOUR_STORAGE_ACCOUNT_KEY"
  10:              )
  11:      );
  13:      // create HTTP response
  14:      var response = new HttpResponseMessage();
  15:      try
  16:      {
  17:          // get XML template from storage
  18:          var xml = client.GetBlobReference(id.Replace("_", "/")).DownloadText();
  20:          // format response
  21:          response.StatusCode = System.Net.HttpStatusCode.OK;
  22:          response.Content = new StringContent(xml);
  23:          response.Content.Headers.ContentType = 
  24:              new System.Net.Http.Headers.MediaTypeHeaderValue("text/xml");
  26:          // set expires header to invalidate tile when content is obsolete
  27:          response.Content.Headers.Add("X-WNS-Expires", GetExpiryTime().ToString("R"));
  28:      }
  29:      catch (Exception e)
  30:      {
  31:          // send a 400 if there's a problem
  32:          response.StatusCode = System.Net.HttpStatusCode.BadRequest;
  33:          response.Content = new StringContent(e.Message + "\r\n" + e.StackTrace);
  34:          response.Content.Headers.ContentType = 
  35:              new System.Net.Http.Headers.MediaTypeHeaderValue("text/plain");
  36:      }
  38:      // return response
  39:      return response;
  40:  }
  42:  private DateTime GetExpiryTime()
  43:  {
  44:      return DateTime.UtcNow.AddDays(1);
  45:  }

And now for the line-by-line examination of the code:

Line 2 The Get method is defined to accept a single argument (this will be of the format container_blob) which is automatically passed in by the ASP.NET MVC routing engine to the id parameter. This leverages the default route for APIs as defined in the App_Start/WebApiConfig.cs of the ASP.NET MVC project, but you do have full control over the specific path and format of the URL.
Lines 5-11 This instantiates the StorageClient class wrapper for interacting with Windows Azure blob storage. For simplicity, the credentials are hard-coded into this method (with placeholders you'll need to fill in). As a best practice, you should instead include your credentials in the role configuration (see Configuring a Connection String to a Storage Account in Configuring a Windows Azure Project) and then reference that setting in code via RoleEnvironment.GetConfigurationSettingValue; a sketch of that approach follows these line-by-line notes.
Line 14 A new HTTP response object is created, details for which are filled by the rest of the code in the method.
Line 18 The blob resource in Windows Azure is downloaded using the account set in Line 5ff. The id parameter is assumed to be of the format container_blob, with the underscore character separating the distinct Windows Azure blob construct references. Since references to containers and blobs in Windows Azure require the / separator, the underscore in the parameter value is replaced before the call is made to download the content.

You could alternatively create a new MVC routing rule that accepted two parameters to the Get method, container and blob, and then concatenate those values here. I opted for the implementation here so as not to require additional modifications in other parts of the ASP.NET MVC project.
Line 21 A success code is set for the HTTP response, since an exception would have occurred if the tile content were not found. It’s still possible that the content is not valid, but the notification engine in Windows 8 will handle that silently and transparently and simply not attempt to update the tile.
Line 22 The content payload for the tile, which should be just a snippet of XML subscribing to the Tile schema, is added to the HTTP response.
Line 23 The content type is, of course, XML.
Line 27 To guard against stale content, the X-WNS-Expires header is set to the time at which the tile should no longer be displayed on the user’s Start screen, and instead the default defined in the application’s manifest should be used. Here the time calculation is refactored into a separate method which just adds a day to the current time. That’s actually not the best implementation, but we’ll refine that a bit later in the post.

Note that the time is formatted using the “R” specifier, which applies the RFC1123 format required for HTTP dates.
Lines 31-35 The exception handling code is fairly simplistic but sufficient for the example. If there's any problem accessing the blob content, an HTTP 400 status code is returned. The notification processing component of Windows 8 will realize there's a problem and not attempt to modify the tile. The additional information provided (message and stack trace) is there for diagnostic purposes; it would never be visible to the user of the Windows 8 application.
Line 39 The HTTP response is returned as the result of the service request.
Lines 42-45 GetExpiryTime calculates when the tile should expire and the default tile defined in the application's manifest should be reinstated. In the code embedded above, the tile will expire one day after the content was polled, versus the default of three days. The Gist includes an updated version of that calculation that I'll discuss a bit later in the post.
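Speaking of the credential best practice mentioned for Lines 5-11, here's a minimal sketch of the configuration-based approach (it assumes you've defined a setting named StorageConnectionString in the Web Role's ServiceConfiguration.cscfg):

    // read the connection string from the role configuration rather than hard-coding it
    var connectionString = Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment
        .GetConfigurationSettingValue("StorageConnectionString");

    // parse the account and create the blob client (StorageClient-era API)
    var account = Microsoft.WindowsAzure.CloudStorageAccount.Parse(connectionString);
    var client = account.CreateCloudBlobClient();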

Testing the Service

The Windows Azure SDK includes an emulator that allows you to run a Windows Azure service on your local machine for testing. You can invoke the emulator when running Visual Studio (as an administrator) by hitting F5 (or Ctrl+F5). For the service application we're working with, that will open a browser to the default page for the site. That page isn't the one you want, of course, but since the tile is available via an HTTP GET request, simply supplying the appropriately formatted URI should bring up the XML content in the browser.

Browsing to Web API endpoint

After confirming the service works, it's time to give it a shot with a real Windows 8 application that leverages periodic notifications. If you've got one coded already, then all you need to do is provide the full URI to startPeriodicUpdate. If you haven't written your app yet, the Push and periodic notifications client-side sample offers a quick way to test.

When you run that sample, select the fourth scenario (Polling for tile updates) and enter the URL for the service you just created. Then press the Start periodic updates button.

Sample application demonstrating periodic updates

Tile resulting from periodic update

As with all periodic updates, when startPeriodicUpdate is called the endpoint is polled immediately, so you should see the tile updated on your Start screen along the lines of what is shown to the left. If you were to manually update the tile XML in blob storage, you should see the new content reflected in about a half-hour, since that's the default recurrence period specified in the sample application. For the actual application, a recurrence period of one day makes more sense; additionally, you'd set the startTime parameter in the call to startPeriodicUpdate to 9 a.m. or slightly after to be sure that the poll picks up the daily refreshed content.

Deploying the Service

Visual Studio Publish... option

When you're satisfied the application is working with the local service, you're ready to deploy it to the cloud with your 90-day free Windows Azure account (or any other account you may have). You can deploy the ASP.NET site to Windows Azure directly from Visual Studio by selecting the Publish option from the Cloud Service project (right).

If this is the first time you’ve published to Windows Azure from Visual Studio, you’ll need to download your publishing credentials, which you can access via the link provided on the Publish Windows Azure Application dialog below.

Downloading Windows Azure publication credentials

Selecting that link prompts you to log in to the Windows Azure portal with your credentials and download a file with the .publishsettings extension. Save that file to your local disk and click the Import button to link Visual Studio to your Azure subscription. You only need to do this once, and you should then secure (or remove) the .publishsettings file, since it contains information that would enable others to access your account. With the subscription now in place, you can continue to the next step of the wizard.

Windows Azure storage location prompt

You may next be prompted to create a storage account on Windows Azure. This account is used to store your deployment package during the publication process, and it can be used by all the services you ultimately end up deploying. As a result, you need only do this once, although you can change the default storage account used at a later stage in the wizard. To create the account, enter a storage account name, which must be unique across all of Windows Azure, and indicate which of the eight data centers should house the account.

Service creation dialog

If you have existing services deployed, you'll have the option to update one of those with the build currently in Visual Studio, or, as in this case, you can create a new Cloud Service on Windows Azure. Doing so requires supplying (1) a service name, which must be unique across Windows Azure since it becomes the third-level domain name prepended to cloudapp.net, the domain which all Windows Azure Cloud Services are a part of, and (2) the location of the data center where the service will run. As you can see to the left, I chose my service to reside in the East US data center and be addressable via a cloudapp.net URL.

If the service name is valid and not in use, the Common Settings and Advanced Settings options will be populated automatically and allow some customization of how the service runs and what capabilities you have (e.g., remote desktop to your VM in the cloud, whether to use a production or staging slot, whether IntelliTrace is enabled, etc.). The defaults are fine for our purposes here, but you can read more about the additional settings on the Windows Azure site.

Publish Settings

At this point the Next button brings you to a summary screen, which indicates that your choices will be saved for when you redeploy; or you can just click Publish to deploy your service to Windows Azure now. Within Visual Studio you can keep tabs on the deployment via the Windows Azure Activity Log (accessible via the View->Other Windows menu option in Visual Studio). Note that it may take 10 minutes or more the first time you deploy to a new service; subsequent updates are generally much quicker.

Windows Azure Activity Log

Now that the service is fully deployed to the cloud, you can revisit the sample tile application and provide the Azure hosted URI (versus the localhost one used earlier when testing) to see the complete end-to-end process in action!

Tidying Up

Earlier on, I mentioned I’d revisit the tile timeout logic, namely the implementation of GetExpiryTime in the code sample; it’s time to clean up that loose end!

Since Chez Contoso updates its menu daily at 9 a.m., it seems quite logical to have the tile expire on a daily basis (which was the default implementation I introduced above), but that creates a less than satisfactory experience:

When users fire up the application for the first time, the URI will be polled for the tile and by default the tile will expire 24 hours from then.  If a user should first access the app at 8:45 a.m. and then disconnect for the rest of the day, they will continue to see the previous day’s special on their Start screen! I’d say that if the restaurant stops serving at midnight, no one should continue to see that day’s special beyond that point. 

To address that scenario, here’s an updated implementation of GetExpiryTime:

   1:  private DateTime GetExpiryTime()
   2:  {
   3:      Int32 TimeZoneOffset = -4;  // EDT offset from UTC
   5:      // get representation of local time for restaurant
   6:      var requestLocalTime = DateTime.UtcNow.AddHours(TimeZoneOffset);
   8:      // if request is hitting before 9 a.m., information is stale
   9:      if (requestLocalTime.Hour <= 8)
  10:          return DateTime.UtcNow.AddDays(-1);
  12:      // else, set tile to expire at midnight local time
  13:      else
  14:      {
  15:          var minutesUntilExpiry = (24 - requestLocalTime.Hour) * 60 - requestLocalTime.Minute;
  16:          return DateTime.UtcNow.AddMinutes(minutesUntilExpiry);
  17:      }
  18:  }

The basic algorithm here is to convert the time the request hits the server (in UTC) to the equivalent local time. Here, I'm assuming the restaurant is on the East Coast of the United States, which is currently offset by four hours from UTC (Line 3). One disadvantage of this particular approach is that the switch back to Standard Time will require a change to the algorithm. That could be alleviated by moving the offset to the service's configuration file (ServiceConfiguration.cscfg), but it would still need to be updated twice a year. A more robust (and likely more complex) implementation is certainly possible – see the sketch at the end of this section – but I'm deeming this 'close enough' to illustrate the concept.

Once the representation of the time local to the restaurant is obtained (Line 6), a check is made to see if the time is before or after 9 a.m., the time we’re assuming management is prompt about updating the coming evening’s special.

If it’s before 9 a.m. that day (Line 9), then the tile is set to expire before it’s even delivered, because the current contents refer to the previous day’s special entrée. This does mean that the tile is delivered unnecessarily, but the payload is small, and the Windows 8 notification mechanism will honor the expiration date. An alternative would be to return an HTTP status code of 404, which may be more correct and elegant but would require updating a bit more of the other code.

If the request arrives after 9 a.m. Chez Contoso time (Line 13ff), then the expiration time is calculated by determining how many minutes are left until midnight. Once that time hits, the tile on the user’s Start screen will be replaced with the default supplied in the application’s manifest, which would probably include a stock photo or some other generic and time insensitive text.
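If you'd rather not touch the configuration twice a year, one possible variant uses TimeZoneInfo to handle daylight saving automatically. Here's a rough sketch (assuming the restaurant is in the US Eastern time zone; "Eastern Standard Time" is the Windows time zone id covering both EST and EDT):

    private DateTime GetExpiryTime()
    {
        // TimeZoneInfo applies the EST/EDT transition automatically
        var eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
        var requestLocalTime = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, eastern);

        // before 9 a.m. local time, the blob still holds yesterday's special
        if (requestLocalTime.Hour <= 8)
            return DateTime.UtcNow.AddDays(-1);

        // otherwise, expire at the restaurant's local midnight
        var localMidnight = requestLocalTime.Date.AddDays(1);
        return DateTime.UtcNow.Add(localMidnight - requestLocalTime);
    }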

Final Design Considerations

Congratulations! At this point you should have a cloud service set up that can serve one or many of your Windows 8 applications. Before signing off on this post, though, I wanted to reiterate a few things to keep in mind about periodic notifications.

  • If the machine is not connected to the internet when it’s time to poll the external URI, then that attempt is skipped and not retried until the next scheduled recurrence – which could be as long as a day away. The takeaway here is to consider expiration policies on your tiles so that stale content is removed from the user’s Start screen.
  • The polling interval is not precise; there can be as much as a 15-minute delay for the notification mechanism to poll the endpoint.
  • The URI is always polled immediately when a client first registers (via startPeriodicUpdate), but you can specify a start time of the next poll and then the recurrence interval for every request thereafter.
  • You can leverage the notification queue by providing up to five URIs to be polled at the given interval. Each of those can have a separate expiration policy and also provide a different tag value (via the X-WNS-Tag header) to determine which tile of the set should be replaced with new content; a client-side sketch follows this list.
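Here's a minimal client-side sketch of that last point (the URIs are hypothetical placeholders):

    using System;
    using System.Collections.Generic;
    using Windows.UI.Notifications;

    var updater = TileUpdateManager.CreateTileUpdaterForApplication();

    // allow up to five notifications to cycle through the tile
    updater.EnableNotificationQueue(true);

    // each endpoint's response can carry its own X-WNS-Expires and X-WNS-Tag headers
    var uris = new List<Uri>
    {
        new Uri("http://yourservice.cloudapp.net/api/tiles/chezcontoso_dinner"),
        new Uri("http://yourservice.cloudapp.net/api/tiles/chezcontoso_dessert")
    };
    updater.StartPeriodicUpdateBatch(uris, PeriodicUpdateRecurrence.Daily);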

This section is a primer for setting up a Windows Azure storage account in case you want to follow along with the blog post above and set up your own periodic notification endpoints in Windows Azure. If you’ve already worked with Windows Azure blob storage, you won’t miss anything by skipping this!

Getting Your Windows Azure Account

There are a number of options for quickly provisioning a Windows Azure account. Many of these are either free or have a free monthly allotment of services – more than enough to get your feet wet and explore how Windows Azure can enhance Windows 8 applications from a developer and an end-user perspective. The free 90-day trial mentioned at the top of this post is the quickest route.

Creating Your Storage Account

Once your subscription has been provisioned, log in to the Windows Azure Management Portal and select the NEW option at the bottom left.

From the list of options on the left, select STORAGE and then QUICK CREATE:

Creating a new storage account

You’ll need to provide two things before clicking CREATE STORAGE ACCOUNT:

    • a unique name for your storage account (3-24 lowercase alphanumeric characters), which also becomes the first element of the five-part hostname you’ll use to access blobs, tables, and queues within that storage account, and
    • the region where you’d like your data to be stored, namely one of Azure’s eight data center locations worldwide.
The geo-replication option offers the highest level of durability by copying your data asynchronously to another data center in the same region. For the purposes of a sample it’s fine to leave it checked, but do note that replication adds roughly 33% to your storage cost.

Once your account is created, select the account in the portal and click the MANAGE KEYS option at the bottom center to bring up your primary and secondary access keys. You’ll ultimately need one of these keys (either is fine) to create items in your storage account.

Storage Access Keys

Installing a Storage Account Client

The Windows Azure Management Portal doesn’t provide any options for creating, retrieving, or updating data within your storage account, so you’ll probably want to install one of the numerous storage client applications out there. There is a wide variety of options, from free to paid, from proprietary to open source. Here is a partial list of the ones I’m aware of – I tend to favor CloudBerry Explorer when working with blob storage.

Regardless of which you use, you’ll need to set up access to your storage account by providing the storage account name and either the primary or secondary account key. Here, for instance, is a screenshot of how to do it in CloudBerry Explorer:

Selecting Windows Azure Account within CloudBerry Explorer

Creating a Container

Windows Azure Blob storage is a two-level hierarchy of container and blobs:

  • a container is somewhat like a file folder, except containers cannot be nested. This is the level at which you can define an access policy applying to the container and the blobs within it,
  • a blob is simply an unstructured bit of data; think of it as a file with some metadata and a MIME type associated with it. Blobs always reside in containers and are accessed with a URI of the format https://<account>.blob.core.windows.net/<container>/<blob> (http is also supported).

You can create a container programmatically of course (the third-party notification service could do so; a sketch appears at the end of this section), but it’s likely you’ll set up the container beforehand and let the service simply deal with the blobs (XML files) in it. Depending on the storage client you’re using, creating a container will be very much like creating a file folder in Explorer. Here’s a snapshot of the user interface in CloudBerry Explorer.

Creating a new container

Note that I’ve set the container policy to Public read access for blobs only, which would allow the storage account to be a direct endpoint that a Windows 8 application can subscribe to for periodic notifications. That’s a supported, but less than ideal approach as the body of the blog post above explains. A service-based delivery approach would allow a more stringent access policy of “No public read access,” since the storage key could be used by and secured at the service host.
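For reference, here’s a rough sketch of creating the container and uploading the tile XML programmatically with the StorageClient library (connectionString and tileXml are assumed to be defined elsewhere; the container and blob names are the ones used throughout this post):

    // parse the account from a connection string and create a blob client
    var account = Microsoft.WindowsAzure.CloudStorageAccount.Parse(connectionString);
    var client = account.CreateCloudBlobClient();

    // create the container if needed and allow public read access to its blobs
    var container = client.GetContainerReference("chezcontoso");
    container.CreateIfNotExist();
    container.SetPermissions(new Microsoft.WindowsAzure.StorageClient.BlobContainerPermissions
    {
        PublicAccess = Microsoft.WindowsAzure.StorageClient.BlobContainerPublicAccessType.Blob
    });

    // upload the tile XML as the "dinner" blob with the right MIME type
    var blob = container.GetBlobReference("dinner");
    blob.Properties.ContentType = "text/xml";
    blob.UploadText(tileXml);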


More Stories By Jim O'Neil

Jim is a Technology Evangelist for Microsoft who covers the Northeast District, namely New England and upstate New York. He is focused on engaging with the development community in the area through user groups, code camps, BarCamps, Microsoft-sponsored events, etc., and in general serving as an ambassador for Microsoft. Since 2009, Jim has been focusing on software development scenarios using cloud computing and Windows Azure. You can follow Jim on Twitter at @jimoneil
