The IT Dilemma Meets Cloud Computing

Technology Refresh and the Law of Plentitude

Emerging technologies have always forced business decision-makers to choose: embrace a new technology as a first-mover, or maintain their existing technologies. Each path carries a risk. Does the cost of keeping existing technology result in higher maintenance and operational expenses, or does the cost of acquiring new technology put an unwarranted capital and process-change burden on the organization?

Some fifteen years ago, the Northern Telecom (Nortel) DMS 100/250/300/500 line of digital telephone switches represented some of the finest technology available for digital communications. The cost was high, but the technology promised telecom carriers everything they would need to operate their networks well into the next generation, a horizon that was never pinned to an actual date, at least in the marketing PowerPoint slides. Buy a DMS 500, the pitch went, and you would be running it for a couple of decades.

Then, seemingly overnight, the Internet matured, bringing communications applications such as Voice over Internet Protocol (VoIP), Skype, Vonage, and other Internet-enabled utilities. Suddenly the DMS, 5ESS, 4ESS, NEAC, and DSC all became obsolete, replaced by simple Internet-friendly communication applications or Internet Protocol-based "soft switches" that managed telephony over IP in a form factor about the size of a mini-refrigerator, with 100 times the switching capacity.

And, as with all soon-to-be-obsolete technologies, the cost of maintaining the legacy system, finding spare parts for it, and even finding operators for it may be rapidly hitting a point of extreme risk. The old telephone switches are now most often found in landfills, gone forever.

Traditional telecommunication transmission protocols such as SDH and SONET began falling to Ethernet, and within the roughly five years from 2003 to 2008 the "legacy" telephone technologies began fading into historical Wikipedia entries.

The Cloud Computing Analogy

We are entering a period of "plentitude" in cloud computing. The "Law of Plentitude" is loosely defined as a threshold of acceptance (of a process, technology, system, etc.) beyond which not adopting puts an entity at greater risk than adopting at the point of emergence. In technology we normally place the "Law of Plentitude" at around 15~20% diffusion into a selected environment, community, industry, or organization.
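The 15~20% figure can be made concrete with a toy diffusion model. This is a minimal sketch, not from the article: the logistic curve and its growth and midpoint parameters are illustrative assumptions, used only to show how one might flag the moment adoption crosses the plentitude band.

```python
import math

def adoption(t, growth=0.5, midpoint=10):
    """Fraction of a community that has adopted by time t (toy logistic curve).

    growth and midpoint are illustrative parameters, not figures from the article.
    """
    return 1 / (1 + math.exp(-growth * (t - midpoint)))

# Find the first period in which adoption crosses the lower edge of the
# 15~20% "plentitude" band the article describes.
threshold = 0.15
crossing = next(t for t in range(50) if adoption(t) >= threshold)
print(f"adoption crosses {threshold:.0%} at t = {crossing}")
```

With these made-up parameters the band is crossed well before the curve's midpoint, which is the article's point: the decision window opens early, not at saturation.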

For example, when the fax machine was first introduced there was a single machine. By itself it was not useful, as there was nobody on the distant end to send a fax to. With two fax machines it became more useful, a community of two. The exponential growth kicks in at around 4 users, where the formula N*(N-1)/2 yields an addressable community of 6 potential relationships. And it continues growing.
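The fax-machine arithmetic above is just the pairwise-connection formula counted once per pair. A short sketch (the function name is mine, not the article's):

```python
def relationships(n: int) -> int:
    """Potential pairwise relationships among n fax machines: n * (n - 1) / 2."""
    return n * (n - 1) // 2

# 4 machines yield 6 potential relationships, as in the article,
# and the count grows roughly with the square of the community size.
for n in (1, 2, 4, 10, 100):
    print(n, relationships(n))
```

Each new machine adds a link to every existing machine, which is why the value of joining rises so steeply as the community grows.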

At "plentitude," you are at risk if you have not acquired a fax machine: your community has adopted fax machines to a level where you must be able to communicate by fax, or find yourself in jeopardy of losing your place in the community.

It can now be argued that cloud computing is quickly starting to reach a level of "plentitude." Communities of interest are emerging within clouds, allowing near zero-latency in user-to-user transaction time. Think of a financial trading community. Zero-latency means zero transaction delays. At some point if you are not in the zero-latency community, your operation is at risk of either losing business, or being expelled by other members of the community who do not want to deal with your latency.

Think of companies outsourcing their IT infrastructure to a commercial cloud service provider, or even building their own internal enterprise cloud infrastructure. If all else is equal, and the cloud-enabled company can recover the cost of building its own data center, reduce operational expenses, and scale its processing capacity up and down at will, then it may have more resources left over for research and development or product production.

Think of the operators still running DMS 500s in 2009, versus their competitors running far more powerful, and cheaper, soft switches. We could produce a roll call of regional telephone companies that closed their doors over the past few years simply because they could not compete with next-generation technology.

The Cloud Computing "Plentitude" Target

The trick, of course, is to plan your refresh, through a well-managed business case and review, to land as close to the plentitude "risk threshold" as possible. This ensures you do not fall prey to a bad technology, that you can see the industry trend toward adopting a new technology, and that your competition does not leave you suffering through a last-minute technology refresh.

Cloud computing and data center outsourcing may not be the ultimate technology refresh, and a number of issues remain to be resolved (security, compliance, data center stability, etc.). However, the trend is clear: companies are outsourcing to commercial cloud service providers, and enterprise virtualization is on the mind of every IT manager and CFO in the business community.

If your company or organization has not yet started the review process, the technology refresh process, and the planning to determine if and when cloud adoption is right for your company, we would strongly encourage that process to begin. Now.

If nothing else, you owe it to yourself and your organization to ensure they are not caught on the bad side of plentitude.

More Stories By John Savageau

John Savageau is a lifelong telecom and Internet geek, with a deep interest in the environment and all things green. Whether drilling into the technology of human communications, cloud computing, or describing a blue whale off Catalina Island, Savageau tries to present complex ideas in terms that are easily appreciated and understood.

Savageau is currently focusing efforts on data center consolidation strategies, enterprise architectures, and cloud computing migration planning in developing countries, including Azerbaijan, The Philippines, Palestine, Indonesia, Moldova, Egypt, and Vietnam.

John Savageau is President of Pacific-Tier Communications dividing time between Honolulu and Burbank, California.

A former career US Air Force officer, Savageau graduated with a Master of Science degree in Operations Management from the University of Arkansas and also received Bachelor of Arts degrees in Asian Studies and Information Systems Management from the University of Maryland.
