The Big Data Bottleneck: Uploading to the Cloud

If only we could get those gigando-bytes into the Cloud in the first place. And there’s the rub.

The problem with Big Data is that, well, Big Data are big. Really big. We’re talking terabytes. Petabytes. Zettabytes. Whatever’s-even-bigger-bytes. And of course, we want to solve all our Big Data challenges in the Cloud. If only we could get those gigando-bytes into the Cloud in the first place. And there’s the rub.

Uploading Big Data from our internal network to the Cloud via an Internet connection is as practical as filling a swimming pool through a drinking straw. It doesn’t matter how sophisticated our Big Data analytics, how super-duper our Hadoopers. If we can’t efficiently get our data where we need them when we need them, we’re stuck.

Optimize the Pipe
Fortunately, the Big Data upload problem isn’t new. In fact, it’s been around for years under the moniker Wide Area Network (WAN) Optimization. That’s fortunate for us, because vendors have been refining WAN Optimization techniques for years, and several of them are now repurposing those techniques to help with the Cloud.

For example, Aryaka has been peddling WAN Optimization appliances for several years. Put one appliance in your local data center, a second in the remote data center, and proprietary technology moves data from one to the other at a rapid clip. Now that the Cloud has turned their world upside down, they are providing a distributed service at the remote end, a “mesh of network connections” better suited to the Cloud. In other words, Aryaka is building an offering similar to Content Delivery Networks (CDNs) like Akamai.

RainStor, in contrast, focuses primarily on a proprietary compression algorithm that promises to squeeze data into one fortieth their original size. Furthermore, RainStor’s compressed data remain directly accessible using standard SQL or even MapReduce on Hadoop with no storage-eating, time-consuming reinflation.
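
RainStor’s algorithm is proprietary, so as a rough illustration of why compression matters to the upload bottleneck, here is a minimal sketch using Python’s standard gzip module as a stand-in; the 40:1 ratio RainStor claims is far beyond what gzip achieves on typical data, but the principle is the same:

import gzip

# gzip stands in here for RainStor's proprietary algorithm; the point
# is simply that structured, repetitive data compresses dramatically,
# shrinking what must squeeze through the upload pipe.
def compression_ratio(payload: bytes) -> float:
    """Return original_size / compressed_size for a payload."""
    return len(payload) / len(gzip.compress(payload, compresslevel=9))

# Repetitive structured records (logs, scans, sensor readings)
# compress far better than random bytes would.
records = b"2013-01-15,store-042,UPC-0012345678905,qty=1\n" * 100_000
print(f"Compression ratio: {compression_ratio(records):.1f}:1")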

Then there’s Aspera, who’s found a sophisticated way around the limitations of the Transmission Control Protocol (TCP) itself. After all, TCP’s tiny packets and penchant for resending them are a large part of the reason uploading Big Data over the Internet runs like such a dog in the first place. To teach this dog a new trick or two, Aspera’s FASP transfers use one TCP port for session initialization and control, and one User Datagram Protocol (UDP) port for data transfer.

UDP is a simpler, fire-and-forget protocol that doesn’t perform the retries that provide TCP’s reliability, but by combining the two protocols, FASP achieves nearly 100% error-free data throughput. In fact, FASP reaches the maximum transfer speed possible given the hardware on which you deploy it, and maintains maximum available throughput independent of network delay and packet loss. FASP also aggregates hundreds of concurrent transfers on commodity hardware, addressing the drinking straw problem in part by supporting hundreds of straws at once.
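
FASP itself is proprietary, but the control/data split is easy to sketch. The following minimal sketch shows the general pattern only, with a hypothetical host and ports; real FASP layers rate control, encryption, and selective retransmission on top of the raw UDP blast:

import socket

HOST = "transfer.example.com"          # hypothetical transfer server
CONTROL_PORT, DATA_PORT = 33001, 33002  # hypothetical ports
CHUNK = 1400  # keep each datagram under a typical Ethernet MTU

def send_file(path: str) -> None:
    # 1. Negotiate the session over reliable TCP.
    with socket.create_connection((HOST, CONTROL_PORT)) as ctrl:
        ctrl.sendall(b"START " + path.encode() + b"\n")

    # 2. Blast the payload over UDP, free of TCP's windowing and
    #    retransmission stalls (reliability logic omitted here).
    data = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            data.sendto(chunk, (HOST, DATA_PORT))
    data.close()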

CloudOpt is also a player worth mentioning. Their JetStream technology takes a soup-to-nuts approach that combines compression and transmission protocol optimization with advanced data deduplication, SSL acceleration, and an ingenious approach to getting the most performance out of cached data. Then there’s Attunity CloudBeam, which touts file-to-Cloud upload, file-to-Cloud replication, and Cloud-to-Cloud replication. Attunity’s Managed File Transfer (MFT) incorporates a secure DMZ architecture, security policy enforcement, guaranteed and accelerated transfers, process automation, and audit capabilities across each stage of the file transfer process.
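
JetStream’s internals aren’t public, but data deduplication in general is straightforward to sketch. Here’s a minimal illustration using fixed-size chunks and SHA-256 digests; production systems typically use content-defined chunking, which this simplification omits:

import hashlib

CHUNK_SIZE = 64 * 1024   # fixed-size chunks, for simplicity
seen: set[str] = set()   # digests the remote end already holds

def chunks_to_send(path: str):
    """Yield only the chunks the remote end hasn't stored yet."""
    with open(path, "rb") as f:
        while block := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(block).hexdigest()
            if digest not in seen:
                seen.add(digest)
                yield digest, block  # new data: ship it
            # else: duplicate chunk; only the short digest need
            # cross the wire, not the 64 KB of data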

Finally, there’s Amazon Web Services (AWS) itself. Yes, most if not all of the vendors discussed above can firehose data into AWS’s various storage services. But AWS also offers a simple, decidedly low-tech alternative: AWS Import/Export. Simply ship your big hard drives to Amazon. They’ll hook them up, copy the data to your Simple Storage Service (S3) bucket or other storage service, and ship the drives back when you’re done. This SneakerNet or “Forklifting” approach, believe it or not, can even be faster than some of the over-the-Internet optimizations for certain Big Data sets, even considering the time it takes to FedEx your drives to AWS.
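
How can trucks beat the Internet? Some back-of-the-envelope arithmetic makes it plain. The numbers below are illustrative assumptions (a 10 TB data set, a 100 Mbps uplink at 80% effective utilization, two days in transit), not AWS figures:

# When does FedEx beat the wire? All figures are assumptions.
DATA_TB = 10       # size of the data set
LINK_MBPS = 100    # uplink speed
UTILIZATION = 0.8  # realistic fraction of rated bandwidth

data_bits = DATA_TB * 1e12 * 8
seconds = data_bits / (LINK_MBPS * 1e6 * UTILIZATION)
print(f"Over the wire: {seconds / 86_400:.1f} days")  # ~11.6 days
print("Shipped drives: ~2 days in transit")

At those numbers, the drives win by more than a week, and the gap only widens as the data set grows.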

On Beyond Drinking Straws
The problem with most of the approaches above (excepting only Aspera and Amazon’s forklift) is that they make the drinking straw we’re using to fill that swimming pool better, faster, and bigger – but we’re still filling that damn pool with a straw. So what’s better than a straw? How about many straws? If any optimization technique improves a single connection to the Internet, then it stands to reason that establishing many connections to your Cloud provider in parallel would multiply your upload speed dramatically.
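
Cloud storage services already support this trick. As one illustration, here’s how you might open sixteen straws at once using S3’s multipart upload via boto3, AWS’s Python SDK; the bucket and file names are hypothetical:

import boto3
from boto3.s3.transfer import TransferConfig

# Split large files into parts and upload the parts in parallel.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # multipart for files > 64 MB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MB per part
    max_concurrency=16,                    # sixteen straws at once
)

s3 = boto3.client("s3")
s3.upload_file("clickstream.csv", "my-analytics-bucket",
               "raw/clickstream.csv", Config=config)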

Fair enough, but let’s think out of the box here. A fundamental Big Data best practice is to bring your analytics to your data. The reasoning is that it’s hard to move your data but easy to move your software, so once your data are in the Cloud, you should also run your analytics there.

But this argument also works in reverse. If your data aren’t in the Cloud, then it may not make sense to move them to the Cloud simply to run your software there. Instead, bring your software to your data, even if they’re on premise.

Perish the thought, you say! We’re sold on Big Data in the Cloud. We’ve crunched the numbers and we know it’s going to save us money, provide more capabilities, and facilitate sharing information across our organization and the world. Fair enough. Here’s another twist for you.

Why are your Big Data sets outside the Cloud to begin with? Sure, you’re stuck with existing, legacy data sets wherever they happen to be today. But as a rule, those don’t constitute Big Data, or will soon cease to be large enough to warrant the Big Data label. By definition, Big Data sets keep expanding exponentially, which means you keep creating them with each new generation of newfangled tools.

In fact, there are already multitudinous sources of raw Big Data, as varied as the Big Data challenges organizations struggle with today. But many such sources are already in the Cloud, or could easily be moved there. Take, for example, clickthrough data from your Web sites. Such data come from your Web servers, which should be in the Cloud anyway. If your Big Data come from Web servers scattered here and there in the Cloud, then moving the clickthrough data to a Big Data repository for processing can happen within the same Cloud. No uploading required.

What about data sources that aren’t already in the Cloud? Many Big Data streams come from instrumentation or sensors of some sort, from seismographs underground to EKGs in hospitals to UPC scanners in supermarkets. There’s no reason why such instrumentation shouldn’t pour its raw data feeds directly into the Cloud. What good is storing a week’s worth of supermarket purchasing data on premise anyway? You’ll want to store, process, manage, and analyze those data in the Cloud, so the sooner you get them there, the better.
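
Pouring a feed straight into the Cloud can be as simple as an HTTP POST per reading. A minimal sketch, assuming a hypothetical ingestion endpoint and skipping the authentication and batching a real deployment would need:

import json
import time
import urllib.request

ENDPOINT = "https://ingest.example.com/v1/scans"  # hypothetical

def post_scan(upc: str, store_id: str) -> None:
    """Send one UPC scan straight to the Cloud, bypassing local storage."""
    payload = json.dumps(
        {"upc": upc, "store": store_id, "ts": time.time()}
    ).encode()
    req = urllib.request.Request(
        ENDPOINT, data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

post_scan("0012345678905", "store-042")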

The ZapThink Take
The only reason we have to worry about uploading Big Data to the Cloud in the first place is because our Big Data aren’t already in the Cloud. And broadly speaking, the reason they’re not already in the Cloud is because the Cloud isn’t everywhere. Instead, we think of the Cloud as being locked away in data centers, those alien, air conditioned facilities packed full of racks of high tech equipment.

That may be true today, but as ZapThink has discussed before, there’s nothing in the definition of Cloud Computing that requires Cloud resources to live in data centers. You might have a bit of the Cloud in your pocket, on your laptop, in your car, or in your refrigerator. For now, this vision of the Internet of Things meeting the Cloud is mostly the stuff of science fiction. We’re only now figuring out what it means to have a ubiquitous global network of sensors, from the aforementioned EKGs and UPC scanners to traffic cameras to home thermostats. But the writing is on the wall. Just as we now don’t think twice about carrying supercomputers in our pockets, it’s only a matter of time until the Cloud itself is fully distributed and ubiquitous. When that happens, the question of moving Big Data to the Cloud will be moot. They will already be there.

Are you one of the vendors mentioned in this article with a correction, or a vendor who should have been mentioned but wasn’t? Please feel free to comment here.

Image Source: US Navy

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).
