
Windows 8 Notifications: Using Azure for Periodic Notifications

At the end of my last post, I put in a plug for using Windows Azure to host periodic notification templates, so I’ll use this opportunity to delve into a bit more detail. If you want to follow along and don’t already have an Azure subscription, you can get a free 90-day trial account on Windows Azure in minutes.

The Big Picture

The concept of periodic notifications is a simple one: it’s a publication/subscription model. Some external application or service creates and exposes a badge or tile XML template for the Windows 8 application to consume on a regular cadence. The only insight the Windows 8 application has is the public HTTP/HTTPS endpoint that the notification provider exposes. What’s in the notification and how often it’s updated is completely within the purview of the notification provider. The Windows 8 application does need to specify the interval at which it will check for new content – selecting from a discrete set of values from 30 minutes to a day – and that should logically correspond to the frequency at which the notification content is updated, but there is no strict coupling of those two events.
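To make that concrete, here's the sort of XML such an endpoint might return – a minimal sketch using one of the built-in Windows 8 tile templates (TileSquareText04 here; the template choice and text are purely illustrative):

    <tile>
      <visual>
        <binding template="TileSquareText04">
          <text id="1">Tonight's special: Coq au Vin</text>
        </binding>
      </visual>
    </tile>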

Periodic notification workflow for Chez Contoso

Take, for instance, a Windows 8 application for Chez Contoso (right), a trendy local restaurant that displays its featured entrée on a daily basis within a Start screen tile, trying to lure you in for a scrumptious dinner. Of course the chef's spotlight offering will vary daily, so it's not something that can be packaged up in the application itself; furthermore, the eatery may want some flexibility in the style of notification it provides – mixing it up across the several dozen tile styles to keep the app's presence on the Start screen looking fresh and alive.

Early each morning then, the restaurant management merely needs to update a public URI that returns the template with information on the daily special. The users of the restaurant’s application, which taps into that URI, will automatically see the updates, whether or not they even run the application that day.

Using Windows Azure Storage

Since the notification content that needs to be hosted is just a small snippet of XML, a simple approach is to leverage Windows Azure blob storage, which can expose all of its contents via HTTP/HTTPS GET requests to a blob endpoint URI.

In fact, if Chez Contoso owned the Windows Azure storage account, the restaurant manager could simply copy the XML file over to the cloud (using a file copy utility like CloudBerry Explorer), and with an access policy permitting public read access to the blob container, the Windows 8 application would merely need to register the URI of that blob in the call to startPeriodicUpdate (along with a polling period of a day and an initial poll time of 9 a.m.). There would be no requirement (or cost) for an explicit cloud service per se: the Windows Azure Storage service fills that need by supporting update and read operations on cloud storage.
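As a rough sketch of that client-side registration (in C#; the storage account name is a placeholder, and the 9:05 a.m. start time is just this example's assumption):

    using System;
    using Windows.UI.Notifications;

    // Subscribe the app's tile directly to the public blob URI.
    var tileUri = new Uri(
        "http://YOUR_STORAGE_ACCOUNT_NAME.blob.core.windows.net/chezcontoso/dinner");

    // First poll shortly after the 9 a.m. content refresh, then daily thereafter.
    DateTimeOffset firstPoll = DateTime.Today.AddHours(9).AddMinutes(5);
    if (firstPoll < DateTimeOffset.Now)
        firstPoll = firstPoll.AddDays(1);

    TileUpdateManager.CreateTileUpdaterForApplication()
        .StartPeriodicUpdate(tileUri, firstPoll, PeriodicUpdateRecurrence.Daily);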

Unfortunately, there is a catch! By default, the expiration period for a tile is three days; furthermore, if the client machine is not connected to the network at the time the poll for the update is made, that request is skipped. In the Chez Contoso scenario, this could mean that the daily dinner special for Monday might be what the user sees on his or her Start screen through Wednesday – not ideal for either the restaurant or the patron.

  • The good news is that to ensure content in a tile is not stale, an expiration time can be provided via a header in the HTTP response that returns the XML template payload, and upon expiry, the tile will be replaced by the default image specified in the application manifest.
  • The bad news is that you can't set HTTP headers when using Azure blob storage alone; you need a service intermediary to serve up the XML and set the appropriate header… enter Windows Azure Cloud Services.

Using Windows Azure Cloud Services

A Windows Azure service for a periodic notification can be pretty lightweight – it need only read the XML file from Windows Azure storage and set the X-WNS-Expires header that specifies when the notification expires. Truth be told, you may not need a cloud storage account since the service could generate the XML content from some other data source. In fact, that same service could expose an interface or web form that the restaurant management uses to supply content for the tile. To keep this example simple and focused though, let’s assume that the XML template is stored directly in Windows Azure storage at a known URI location; how it got there is left to the reader :).

With Windows Azure there are three primary ways to set up a service:

Windows Azure Web Sites, a low-cost and quick option for setting up simple yet scalable services.

Windows Azure Virtual Machines, an Infrastructure-as-a-Service (IaaS) offering, enabling you to build and host a Linux or Windows VM with whatever service code and infrastructure you need.

Windows Azure Cloud Services, a Platform-as-a-Service offering, through which you can deploy a Web Role as a service running within IIS.

I previously covered the provisioning of Web Sites in my blog post on storing notification images in the cloud, so this time I’ll go with the Platform-as-a-Service offering using Windows Azure Cloud Services and ASP.NET.

Getting Set Up for Azure Development

If you haven’t used Windows Azure before, now’s a great time to get a 90-day free, no-risk trial.

As far as tooling goes, you can use Visual Studio 2012, Visual Studio 2010, or Visual Web Developer Express. I’m using Visual Studio 2012 for the rest of this post, but the experience is quite similar for Visual Studio 2010 and Visual Web Developer Express.

Since the service will be developed in ASP.NET, you'll specifically want to download the Windows Azure SDK for .NET (via the Web Platform Installer). In fact, if you don't have Visual Studio already installed, this will give you Visual Web Developer Express automatically.

Creating the Service

In Visual Studio, create a new Solution (File>New Project…) of type Cloud:

Creating a new Cloud Service solution

When you create a new Cloud Service, you get the option to create a number of other related projects, each corresponding to a Web or Worker Role that is deployed under the umbrella (and to the same endpoint) of that Cloud Service. Web and Worker Roles typically coordinate with other Azure services like storage and the Service Bus to provide highly scalable, decoupled, and asynchronous implementations. In this simple scenario, you need only a single Web Role, which will host a simple service.

Adding a Web (Service) Role

For the type of ASP.NET MVC project, select Web API, a convenient and lightweight option for building a RESTful service:

Creating a Web API ASP.NET MVC application

Coding the Service

At a minimum, the service needs to do two things:

  1. return the tile XML content from a predetermined Windows Azure blob URI, the same one that is updated by the restaurant every morning at 9 a.m.
  2. set the X-WNS-Expires header to the time at which the requested tile should no longer be shown to the user. In this case, let’s assume the restaurant stops serving the special at midnight each day, and you don’t want to show the tile after that time.

To create the API for returning the XML content, add a new empty API Controller called Tiles to the ASP.NET MVC 4 project (or you can just modify the ValuesController that gets created automatically):

Creating a new API controller

New API Controller dialog

In that controller, you’ll need a single Get method (corresponding to the HTTP GET request that the Windows 8 application will make on a regular basis to get the XML tile content). That Get method will access a parameter indicating the name of the blob on Windows Azure that contains the XML. For example, a URL like:

http://periodicnotifications.cloudapp.net/api/tiles/chezcontoso_dinner

will access the blob called dinner in a Windows Azure blob storage container called chezcontoso. The api/tiles segment of the URL is controlled by the ASP.NET MVC route that is set by default in the global.asax file of the project, so it too can be modified if you like.
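For reference, here's the stock route registration from the ASP.NET MVC 4 Web API template (defined in App_Start/WebApiConfig.cs and invoked from Global.asax at application start):

    using System.Web.Http;

    public static class WebApiConfig
    {
        public static void Register(HttpConfiguration config)
        {
            // "chezcontoso_dinner" binds to the {id} segment and is passed to Get(String id).
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional }
            );
        }
    }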

The complete code for that Get method follows and is also available as a Gist on GitHub for easier viewing and cutting-and-pasting. Note that you will need to supply your own storage account credentials in Line 5ff (see below for a primer on setting up your storage account).

   1:  // /api/tiles/<tilename>
   2:  public HttpResponseMessage Get(String id)
   3:  {
   4:      // set cloud storage credentials
   5:      var client = new Microsoft.WindowsAzure.StorageClient.CloudBlobClient(
   6:          "http://YOUR_STORAGE_ACCOUNT_NAME.blob.core.windows.net",
   7:          new Microsoft.WindowsAzure.StorageCredentialsAccountAndKey(
   8:              "YOUR_STORAGE_ACCOUNT_NAME",
   9:              "YOUR_STORAGE_ACCOUNT_KEY"
  10:              )
  11:      );
  12:  
  13:      // create HTTP response
  14:      var response = new HttpResponseMessage();
  15:      try
  16:      {
  17:          // get XML template from storage
  18:          var xml = client.GetBlobReference(id.Replace("_", "/")).DownloadText();
  19:  
  20:          // format response
  21:          response.StatusCode = System.Net.HttpStatusCode.OK;
  22:          response.Content = new StringContent(xml);
  23:          response.Content.Headers.ContentType = 
  24:              new System.Net.Http.Headers.MediaTypeHeaderValue("text/xml");
  25:  
  26:          // set expires header to invalidate tile when content is obsolete
  27:          response.Content.Headers.Add("X-WNS-Expires", GetExpiryTime().ToString("R"));
  28:      }
  29:      catch (Exception e)
  30:      {
  31:          // send a 400 if there's a problem
  32:          response.StatusCode = System.Net.HttpStatusCode.BadRequest;
  33:          response.Content = new StringContent(e.Message + "\r\n" + e.StackTrace);
  34:          response.Content.Headers.ContentType = 
  35:              new System.Net.Http.Headers.MediaTypeHeaderValue("text/plain");
  36:      }
  37:  
  38:      // return response
  39:      return response;
  40:  }
  41:  
  42:  private DateTime GetExpiryTime()
  43:  {
  44:      return DateTime.UtcNow.AddDays(1);
  45:  }

And now for the line-by-line examination of the code:

Line 2 The Get method is defined to accept a single argument (this will be of the format container_blob) which is automatically passed in by the ASP.NET MVC routing engine to the id parameter. This leverages the default route for APIs as defined in the App_Start/WebApiConfig.cs of the ASP.NET MVC project, but you do have full control over the specific path and format of the URL.
Lines 5-11 This instantiates the StorageClient class wrapper for interacting with Windows Azure blob storage. For simplicity, the credentials are hard-coded into this method. As a best practice, you should include your credentials in the role configuration (see Configuring a Connection String to a Storage Account in Configuring a Windows Azure Project) and then reference that setting in code via RoleEnvironment.GetConfigurationSettingValue.
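A minimal sketch of that best practice, assuming a hypothetical setting named StorageConnectionString has been defined in the role's configuration (ServiceConfiguration.cscfg):

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.StorageClient;

    // Read the connection string from the role configuration instead of hard-coding it.
    var account = CloudStorageAccount.Parse(
        RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString"));
    var client = account.CreateCloudBlobClient();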
Line 14 A new HTTP response object is created, details for which are filled by the rest of the code in the method.
Line 18 The blob resource in Windows Azure is downloaded using the account set in Line 5ff. The id parameter is assumed to be of the format container_blob, with the underscore character separating the distinct Windows Azure blob construct references. Since references to containers and blobs in Windows Azure require the / separator, the underscore in the parameter value is replaced before the call is made to download the content.

You could alternatively create a new MVC routing rule that accepts two parameters to the Get method, container and blob, and then concatenate those values here; a sketch of that alternative follows. I opted for the implementation above so as not to require additional modifications in other parts of the ASP.NET MVC project.
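That alternative might look roughly like this (the route name and the GetTileResponse helper – the body of the Get method above factored out – are hypothetical):

    // In App_Start/WebApiConfig.cs: a route with separate container and blob segments.
    config.Routes.MapHttpRoute(
        name: "TilesByContainerAndBlob",
        routeTemplate: "api/tiles/{container}/{blob}",
        defaults: new { controller = "Tiles" }
    );

    // In the controller: concatenate the segments with the '/' separator the SDK expects.
    public HttpResponseMessage Get(String container, String blob)
    {
        return GetTileResponse(container + "/" + blob);
    }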
Line 21 A success code is set for the HTTP response, since an exception would have occurred if the tile content were not found. It’s still possible that the content is not valid, but the notification engine in Windows 8 will handle that silently and transparently and simply not attempt to update the tile.
Line 22 The content payload for the tile, which should be just a snippet of XML subscribing to the Tile schema, is added to the HTTP response.
Line 23 The content type is, of course, XML.
Line 27 To guard against stale content, the X-WNS-Expires header is set to the time at which the tile should no longer be displayed on the user’s Start screen, and instead the default defined in the application’s manifest should be used. Here the time calculation is refactored into a separate method which just adds a day to the current time. That’s actually not the best implementation, but we’ll refine that a bit later in the post.

Note that the time is formatted using the “R” specifier, which applies the RFC1123 format required for HTTP dates.
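For example:

    // "R" emits an RFC 1123 HTTP-date, always expressed in GMT.
    Console.WriteLine(DateTime.UtcNow.AddDays(1).ToString("R"));
    // e.g. "Tue, 06 Nov 2012 14:30:00 GMT"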
Lines 31-35 The exception handling code is fairly simplistic but sufficient for the example. If there's any problem in accessing the blob content, an HTTP 400 status code is returned. The notification processing component of Windows 8 will realize there's a problem and not attempt to modify the tile. The additional information provided (message and stack trace) is there for diagnostic purposes; it would never be visible to the user of the Windows 8 application.
Line 39 The HTTP response is returned as the result of the service request.
Lines 42-45 GetExpiryTime calculates when the tile should expire and the default tile defined in the application's manifest should be reinstated. In the code embedded above, the tile will expire one day after the content was polled, versus the default of three days. The Gist includes an updated version of that calculation that I'll discuss a bit later in the post.

Testing the Service

The Windows Azure SDK includes an emulator that allows you to run a Windows Azure service on your local machine for testing. You can invoke the emulator when running Visual Studio (as an administrator) by hitting F5 (or Ctrl+F5). For the service application we're working with, that will open a browser to the default page for the site. That page isn't the one you want, of course, but since the tile is available via an HTTP GET request, simply supplying the appropriately formatted URI should bring up the XML content in the browser.

Browsing to Web API endpoint

After confirming the service works, it's time to give it a shot with a real Windows 8 application that leverages periodic notifications. If you've got one coded already, then all you need to do is provide the full URI to startPeriodicUpdate. If you haven't written your app yet, the Push and periodic notifications client-side sample offers a quick way to test.

When you run that sample, select the fourth scenario (Polling for tile updates) and enter the URL for the service you just created. Then press the Start periodic updates button.

Sample application demonstrating periodic updates

Tile resulting from periodic update

As with all periodic updates, when startPeriodicUpdate is called, the endpoint is polled immediately, so you should see the tile updated on your Start screen along the lines of what is shown to the left. If you were to manually update the tile XML in blob storage, you should see the new content reflected in about a half-hour, since that's the default recurrence period specified in the sample application. For the actual application, a recurrence period of one day makes more sense; additionally, you'd set the startTime parameter in the call to startPeriodicUpdate to 9 a.m. or slightly after to be sure that the poll picks up the daily refreshed content.

Deploying the Service

Visual Studio Publish... option

When you're satisfied the application is working with the local service, you're ready to deploy it to the cloud with your 90-day free Windows Azure account (or any other account you may have). You can deploy the ASP.NET site to Windows Azure directly from Visual Studio by selecting the Publish option from the Cloud Service project (right).

If this is the first time you’ve published to Windows Azure from Visual Studio, you’ll need to download your publishing credentials, which you can access via the link provided on the Publish Windows Azure Application dialog below.

Downloading Windows Azure publication credentials

Selecting that link prompts you to log in to the Windows Azure portal with your credentials and download a file with the .publishsettings extension. Save that file to your local disk and click the Import button to link Visual Studio to your Azure subscription. You only need to do this once, and you should then secure (or remove) the .publishsettings file, since it contains information that would enable others to access your account. With the subscription now in place, you can continue to the next step of the wizard.

Windows Azure storage location prompt

You may next be prompted to create a storage account on Windows Azure. This account is used to store your deployment package during the publication process, and it can be used by all the services you ultimately end up deploying. As a result, you need only do this once, although you can change the default storage account at a later stage in the wizard. To create the account, enter a storage account name, which must be unique across all of Windows Azure, and indicate which of the eight data centers should house the account.

Service creation dialog

If you have existing services deployed, you'll have the option to update one of those with the build currently in Visual Studio, or, as in this case, you can create a new Cloud Service on Windows Azure. Doing so requires supplying (1) a service name (which must be unique across Windows Azure, since it becomes the third-level domain name prepended to the cloudapp.net domain that all Windows Azure Cloud Services are a part of) and (2) the location of the data center where the service will run. As you can see to the left, I chose my service to reside in the East US and be addressable via periodicnotifications.cloudapp.net.

If the service name is valid and not in use, the Common Settings and Advanced Settings options will be populated automatically and allow some customization of how the service runs and what capabilities you have (e.g., remote desktop to your VM in the cloud, whether to use a production or staging slot, whether IntelliTrace is enabled, etc.). The defaults are fine for our purposes here, but you can read more about the additional settings on the Windows Azure site.

Publish Settings

At this point the Next button brings you to a summary screen which indicates that your choices will be saved for when you redeploy, or you can just click Publish to deploy your service to Windows Azure now. Within Visual Studio you can keep tabs on the deployment via the Windows Azure Activity log (accessible via the View->Other Windows menu option in Visual Studio) – note that it may take 10 minutes or more the first time you deploy to a new service; subsequent updates are generally much quicker.

Windows Azure Activity Log

Now that the service is fully deployed to the cloud, you can revisit the sample tile application and provide the Azure hosted URI (versus the localhost one used earlier when testing) to see the complete end-to-end process in action!

Tidying Up

Earlier on, I mentioned I’d revisit the tile timeout logic, namely the implementation of GetExpiryTime in the code sample; it’s time to clean up that loose end!

Since Chez Contoso updates its menu daily at 9 a.m., it seems quite logical to have the tile expire on a daily basis (which was the default implementation I introduced above), but that creates a less than satisfactory experience:

When users fire up the application for the first time, the URI will be polled for the tile, and by default the tile will expire 24 hours from then. If a user should first access the app at 8:45 a.m. and then disconnect for the rest of the day, they will continue to see the previous day's special on their Start screen! I'd say that if the restaurant stops serving at midnight, no one should continue to see that day's special beyond that point.

To address that scenario, here’s an updated implementation of GetExpiryTime:

   1:  private DateTime GetExpiryTime()
   2:  {
   3:      Int32 TimeZoneOffset = -4;  // EDT offset from UTC
   4:   
   5:      // get representation of local time for restaurant
   6:      var requestLocalTime = DateTime.UtcNow.AddHours(TimeZoneOffset);
   7:   
   8:      // if request is hitting before 9 a.m., information is stale
   9:      if (requestLocalTime.Hour <= 8)
  10:          return DateTime.UtcNow.AddDays(-1);
  11:   
  12:      // else, set tile to expire at midnight local time
  13:      else
  14:      {
  15:          var minutesUntilExpiry = (24 - requestLocalTime.Hour) * 60 - requestLocalTime.Minute;
  16:          return DateTime.UtcNow.AddMinutes(minutesUntilExpiry);
  17:      }
  18:  }

The basic algorithm here is to convert the time the request hits the server (in UTC) to the equivalent local time. Here, I'm assuming the restaurant is on the East Coast of the United States, which is currently offset by four hours from UTC (Line 3). One disadvantage of this particular approach is that the switch back to Standard Time will require a change to the algorithm. That could be alleviated by moving the offset to the service's configuration file (ServiceConfiguration.cscfg), but it would still need to be updated twice a year. A more robust (and likely more complex) implementation is almost certainly possible, but I'm deeming this 'close enough' to illustrate the concept.
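For what it's worth, here's a sketch of one such variant using TimeZoneInfo, which accounts for daylight saving time automatically (assuming the service runs on Windows, where the Eastern time zone's ID is "Eastern Standard Time"):

    private DateTime GetExpiryTime()
    {
        var eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
        var requestLocalTime = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, eastern);

        // before 9 a.m. local time, the content still refers to yesterday: expire immediately
        if (requestLocalTime.Hour <= 8)
            return DateTime.UtcNow.AddDays(-1);

        // otherwise, expire at midnight local time
        var midnightLocal = requestLocalTime.Date.AddDays(1);
        return DateTime.UtcNow.Add(midnightLocal - requestLocalTime);
    }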

Once the representation of the time local to the restaurant is obtained (Line 6), a check is made to see if the time is before or after 9 a.m., the time we’re assuming management is prompt about updating the coming evening’s special.

If it’s before 9 a.m. that day (Line 9), then the tile is set to expire before it’s even delivered, because the current contents refer to the previous day’s special entrée. This does mean that the tile is delivered unnecessarily, but the payload is small, and the Windows 8 notification mechanism will honor the expiration date. An alternative would be to return an HTTP status code of 404, which may be more correct and elegant, but require updating a bit more of the other code.

If the request arrives after 9 a.m. Chez Contoso time (Line 13ff), then the expiration time is calculated by determining how many minutes are left until midnight. Once that time hits, the tile on the user’s Start screen will be replaced with the default supplied in the application’s manifest, which would probably include a stock photo or some other generic and time insensitive text.

Final Design Considerations

Congratulations! At this point you should have a cloud service set up that can serve one or many of your Windows 8 applications. Before signing off on this post, though, I wanted to reiterate a few things to keep in mind about periodic notifications:

  • If the machine is not connected to the internet when it’s time to poll the external URI, then that attempt is skipped and not retried until the next scheduled recurrence – which could be as long as a day away. The takeaway here is to consider expiration policies on your tiles so that stale content is removed from the user’s Start screen.
  • The polling interval is not precise; there can be as much as a 15-minute delay for the notification mechanism to poll the endpoint.
  • The URI is always polled immediately when a client first registers (via startPeriodicUpdate), but you can specify a start time of the next poll and then the recurrence interval for every request thereafter.
  • You can leverage the notification queue by providing up to five URIs to be polled at the given interval. Each of those can have a separate expiration policy and also provide a different tag value (X-WNS-Tag header) to determine which tile of the set should be replaced with new content; a client-side sketch follows this list.
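Here's a rough sketch of that last scenario (the dessert endpoint is hypothetical, purely to illustrate polling multiple URIs):

    using System;
    using System.Collections.Generic;
    using Windows.UI.Notifications;

    var updater = TileUpdateManager.CreateTileUpdaterForApplication();
    updater.EnableNotificationQueue(true);

    // Up to five URIs; each response can carry its own X-WNS-Expires and X-WNS-Tag headers.
    updater.StartPeriodicUpdateBatch(
        new List<Uri>
        {
            new Uri("http://periodicnotifications.cloudapp.net/api/tiles/chezcontoso_dinner"),
            new Uri("http://periodicnotifications.cloudapp.net/api/tiles/chezcontoso_dessert")
        },
        PeriodicUpdateRecurrence.Daily);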

This section is a primer for setting up a Windows Azure storage account in case you want to follow along with the blog post above and set up your own periodic notification endpoints in Windows Azure. If you've already worked with Windows Azure blob storage, you won't miss anything by skipping this!

Getting Your Windows Azure Account

There are a number of options for quickly provisioning a Windows Azure account. Many of these are either free or include a free monthly allotment of services - more than enough to get your feet wet and explore how Windows Azure can enhance Windows 8 applications from a developer and an end-user perspective.

Creating Your Storage Account

Once your subscription has been provisioned, log in to the Windows Azure Management Portal, and select the NEW option at the bottom left.

From the list of options on the left, select STORAGE and then QUICK CREATE:

Creating a new storage account

You’ll need to provide two things before clicking CREATE STORAGE ACCOUNT:

    • a unique name for your storage account (3-24 lowercase alphanumeric characters), which also becomes the first element of the five-part hostname that you'll use to access blobs, tables, and queues within that storage account, and
    • the region where you'd like your data to be stored, namely one of Azure's eight data center locations worldwide.
The geo-replication option offers the highest level of durability by copying your data asynchronously to another data center in the same region. For the purposes of a sample it's fine to leave it checked, but do note that replication adds roughly 33% to your storage cost.

Once your account is created, select the account in the portal and click the MANAGE KEYS option at the bottom center to bring up your primary and secondary access keys. You'll ultimately need to use one of these keys (either is fine) to create items in your storage account.

Storage Access Keys

Installing a Storage Account Client

The Windows Azure Management Portal doesn't provide any options for creating, retrieving, or updating data within your storage account, so you'll probably want to install one of the numerous storage client applications out there. There is a wide variety of options, from free to paid, from proprietary to open source. Here is a partial list of the ones I'm aware of – I tend to favor CloudBerry Explorer when working with blob storage.

Regardless of which you use, you'll need to set up access to your storage account by providing the storage account name and either the primary or secondary account key. Here, for instance, is a screen shot of how to do it in CloudBerry Explorer:

Selecting Windows Azure Account within CloudBerry Explorer

Creating a Container

Windows Azure Blob storage is a two-level hierarchy of containers and blobs:

  • a container is somewhat like a file folder, except containers cannot be nested. This is the level at which you can define an access policy applying to the container and the blobs within it,
  • a blob is simply an unstructured bit of data; think of it as a file with some metadata and a MIME-type associated with it. Blobs always reside in containers and are accessed with a URI of the format https://accountname.blob.core.windows.net/container-name/blob-name (http is also supported).

You can create a container programmatically, of course, and the third-party notification service could do so, but it's likely you'll set up the container beforehand and let the service simply deal with the blobs (XML files) in it. Depending on the storage client you're using, creating a container will be very much like creating a file folder in Explorer. Here's a snapshot of the user interface in CloudBerry Explorer.

Creating a new container

Note that I’ve set the container policy to Public read access for blobs only, which would allow the storage account to be a direct endpoint that a Windows 8 application can subscribe to for periodic notifications. That’s a supported, but less than ideal approach as the body of the blog post above explains. A service-based delivery approach would allow a more stringent access policy of “No public read access,” since the storage key could be used by and secured at the service host.


