DevOps and Hybrid Infrastructure Synergy By @Kevin_Jackson | @DevOpsSummit #DevOps

The definition of DevOps emphasizes collaboration and communication between software developers and other IT professionals

(This post first appeared in IBM's Point B and Beyond)

The definition of DevOps emphasizes collaboration and communication between software developers and other IT professionals while automating the software delivery and infrastructure change process. While agile software development and the use of automated infrastructure configuration tools stand proudly in the DevOps spotlight, little has been said about the actual infrastructure that modern tools such as Puppet and Chef automate.
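Tools such as Puppet and Chef work by converging infrastructure toward a declared desired state. A toy Python sketch of that convergence idea (the resource names and state dictionaries here are invented for illustration and are not either tool's actual API):

```python
def converge(current, desired):
    """Compute the actions needed to move actual state to the declared state."""
    actions = []
    # Apply anything that is missing or differs from the declaration.
    for name, spec in desired.items():
        if current.get(name) != spec:
            actions.append(("apply", name, spec))
    # Remove anything present that the declaration no longer mentions.
    for name in current:
        if name not in desired:
            actions.append(("remove", name))
    return actions

current = {"nginx": {"ensure": "stopped"}}
desired = {"nginx": {"ensure": "running"}, "ntp": {"ensure": "running"}}
plan = converge(current, desired)
```

The point of the sketch is that the operator declares only the end state; the tool derives and executes the change plan, which is what makes infrastructure change automatable and repeatable.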

DevOps in Hybrid IT Environments
Much has been written about Chaos Monkey, a tool that tests the resilience of individual software components by randomly killing instances and services within Netflix's Amazon Web Services (AWS) infrastructure. This process deliberately stresses AWS infrastructure operations, as automation scripts reconfigure infrastructure components on the fly. Without taking anything away from the operational excellence this displays, how would an enterprise match this feat across a hybrid IT environment? How would you support the DevOps philosophy across a hybrid IT infrastructure?
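A minimal sketch of the chaos-testing idea in Python (this is an illustration of the concept, not Netflix's actual tool; the instance names and health rule are hypothetical):

```python
import random

def chaos_kill(instances, seed=None):
    """Terminate one randomly chosen instance from a running pool."""
    rng = random.Random(seed)
    victim = rng.choice(sorted(instances))  # sort so the draw is reproducible for a given seed
    instances.discard(victim)
    return victim

def service_healthy(instances, min_replicas=2):
    """A crude health check: the service survives only if enough replicas remain."""
    return len(instances) >= min_replicas

pool = {"web-1", "web-2", "web-3"}
killed = chaos_kill(pool, seed=7)
assert service_healthy(pool), "service should survive the loss of one instance"
```

The discipline the tool enforces is the interesting part: because any instance may die at any time, every component must be built, and the infrastructure automated, so that the service as a whole keeps running.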

The DevOps philosophy embodies the practice of operations and development engineers working together through the entire service life cycle, from design to development to production support. It's linked closely with agile and lean approaches and abandons the siloed view of the development team being solely focused on building and the operations team being exclusively centered on running an application.

As enterprises adopt both private and public clouds, they typically do not throw away their in-house infrastructure. Although consolidation, outsourcing and IT efficiencies may reduce the number of corporately owned data centers, a hybrid operational environment will remain. Extending the DevOps philosophy into such an environment requires active management of all of an organization's IT infrastructure, regardless of its source. This active IT management differs from the budget-and-forget management of the past and requires the following:

  • Active monitoring and metering of all IT services;
  • Continuous benchmarking and comparisons of similar services; and
  • Viable options for change among pre-vetted and approved IT infrastructure service options (IT supply chain management).
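The benchmarking and sourcing functions above can be sketched as a tiny broker catalog in Python (all provider names, prices and uptime figures are invented for illustration):

```python
# Hypothetical broker catalog: comparable, pre-vetted offerings from several sources.
CATALOG = [
    {"service": "vm.medium", "provider": "public-cloud-a", "unit_cost": 0.10, "uptime": 99.95},
    {"service": "vm.medium", "provider": "public-cloud-b", "unit_cost": 0.12, "uptime": 99.99},
    {"service": "vm.medium", "provider": "on-prem",        "unit_cost": 0.09, "uptime": 99.50},
]

def best_option(catalog, service, min_uptime=99.9):
    """Benchmark comparable options and pick the cheapest one meeting the uptime target."""
    candidates = [o for o in catalog
                  if o["service"] == service and o["uptime"] >= min_uptime]
    if not candidates:
        return None
    return min(candidates, key=lambda o: o["unit_cost"])
```

Even this toy version shows the supply-chain character of the broker function: the same workload can be continuously re-priced and re-sourced across providers as metering data and service levels change.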

These management functions are delivered by IT service broker enablement, which refers to the integration of platforms that aggregate, customize and/or integrate IT service offerings through a single platform. In transforming the traditional, mostly static infrastructure model into a multisourced IT service supply chain operation, these platforms also deliver financial management and hybrid IT solution design support. They uniquely enable the infrastructure dynamism needed to pursue DevOps across a hybrid IT environment.

A DevOps Mindset in the Dynamic World of Cloud
According to Gravitant, hybrid IT is more than just a catalog of public and private IT infrastructure resources. It is a strategic approach that unifies the hardware and software operational components of an end-to-end solution. With this approach, an organization standardizes the delivery of multisourced solutions by doing the following:

  • Leveraging existing tools and resources without disruption;
  • Offering additional, automated choices for users who need speed and agility; and
  • Addressing architecture holistically, with the optimal balance of technology investments - on-premises, off-premises, hosted, private or public.

This concept requires a shift in both structure and mindset: the dynamic world of the cloud demands a new organizational structure, one that helps organizations move from a technology mindset to a solution-based focus, building the skills and expertise required to support fast, flexible and cost-effective IT processes. The main objective is to transform static, technology-focused IT teams into brokers of IT services. When this happens, IT becomes a company asset, responding dynamically to the organization's needs.

The Value of IT Service Brokerage
The IT service broker function sits between the back office (operations) and the front office (user experience), creating a middle office that is responsible for much of the new business operations skills, such as sourcing, procurement, packaging and billing. The enablement platform defines and executes the technology and sourcing strategies and supports the creation of solution architectures that maximize the value of your multisourced investment.

IT service brokerage redefines the meaning of hybrid IT by introducing inherent provisioning, orchestration, portability and interoperability services. In fact, DevOps is to software as IT service brokerage is to infrastructure. To be successful in today's dynamic and global business environment, modern organizations need to build dynamic and agile infrastructures that can support agile and dynamic software development and deployment models. This is why IT service broker enablement is the key to DevOps and hybrid infrastructure synergy.

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)

Cloud Musings

( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2015)

Follow me at http://Twitter.com/Kevin_Jackson


More Stories By Kevin Jackson

Kevin Jackson, founder of the GovCloud Network, is an independent technology and business consultant specializing in mission critical solutions. He has served in various senior management positions including VP & GM Cloud Services NJVC, Worldwide Sales Executive for IBM and VP Program Management Office at JP Morgan Chase. His formal education includes MSEE (Computer Engineering), MA National Security & Strategic Studies and a BS Aerospace Engineering. Jackson graduated from the United States Naval Academy in 1979 and retired from the US Navy earning specialties in Space Systems Engineering, Airborne Logistics and Airborne Command and Control. He also served with the National Reconnaissance Office, Operational Support Office, providing tactical support to Navy and Marine Corps forces worldwide. Kevin is the founder and author of “Cloud Musings”, a widely followed blog that focuses on the use of cloud computing by the Federal government. He is also the editor and founder of “Government Cloud Computing” electronic magazine, published at Ulitzer.com.
