The DevOps Emperor Has No Clothes - When DevOps Is Defunct
By Ron Gidron

Why DevOps is not for everyone, and how release automation helps in a bimodal world

So am I really going to write a blog calling DevOps a big hoax?

Well, not exactly. But as a follow-up to my previous blog on the difference between DevOps and Continuous Delivery, I want to alert you to a few key facts:

A) DevOps is not always the right methodology for developing and delivering software.

B) Agility and speed can be achieved even in a waterfall method if the right tools are in place.

C) Release automation is bigger than DevOps (yeah, I said it!).

Say no to DevOps?
OK, so here goes: why is DevOps not always the right way to go? Simply because not all systems are born equal. For a back-end master customer data system, stability trumps iteration. "Fail fast and roll forward" simply isn't sustainable for many of today's core business applications in banking, retail, media, manufacturing or any other industry vertical. Clearly these enterprises still need to innovate and provide new and exciting ways for their customers to engage through web and mobile, but not at the risk of a terminal business failure at the back end.

Delivery for these back-end systems does need to speed up so that it doesn't become the bottleneck for the entire business. However, restructuring teams and breaking development, test and release into tens or hundreds of small parallel updates is not something a legacy technology stack (say Siebel or SAP at the back end) can support, and it is likely not even advisable.

This doesn't mean we should accept the realities of the last century, settle for the status quo and leave our release cycles at once or twice a year for key applications. That would slowly but surely cost us any competitive edge.

Luckily there is a middle ground, and that middle ground is release automation - which is why I believe release automation is bigger than DevOps. I will explain why, but I need to start by examining DevOps a little more first.

What is DevOps?
At its core, DevOps aims to get new ideas and functionality into the hands of users quickly and iterate fast. It achieves this by fixing two traditional weaknesses in the software delivery lifecycle, namely "overhead" and "rework".

Overhead is anything developers do that isn't directly related to developing new features or improving existing ones (manual deployments would be a big item in the overhead column for sure, as would debugging production deployment issues or handling configuration problems).

Rework occurs any time a developer has to go back over a piece of code they have already worked on. This does not include routine bug fixing; rather, rework happens when test cycles take so long that developers have already moved on to new functionality, which then has to be revisited to account for a newly found problem.

DevOps tackles these challenges by shortening the cycles; for example, by shipping smaller changes that developers and testers can verify in near real time, or by using tools like Puppet or Chef to provision test environments quickly while ensuring consistency across systems.
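
To make the "overhead" point concrete, here is a minimal, purely illustrative Python sketch - not the author's tooling or any vendor's product - of the kind of step that gets automated: an idempotent deployment routine plus a consistency check across environments, in the spirit of what Puppet- or Chef-style convergence provides. Every name in it (Environment, deploy_artifact, the package and versions) is hypothetical.

```python
# Hypothetical sketch: automating a repeatable deployment step so it stops
# being developer overhead. Not a real deployment tool or API.
from dataclasses import dataclass, field


@dataclass
class Environment:
    """A target environment whose configuration we want to keep consistent."""
    name: str
    packages: dict[str, str] = field(default_factory=dict)  # package -> installed version


def deploy_artifact(env: Environment, package: str, version: str) -> None:
    """Idempotently 'install' a package version; re-running it causes no drift."""
    current = env.packages.get(package)
    if current == version:
        print(f"[{env.name}] {package} {version} already present, nothing to do")
        return
    print(f"[{env.name}] deploying {package}: {current or 'none'} -> {version}")
    env.packages[package] = version


def consistent(envs: list[Environment], package: str) -> bool:
    """True when every environment runs the same version of the package."""
    return len({env.packages.get(package) for env in envs}) == 1


if __name__ == "__main__":
    test, prod = Environment("test"), Environment("prod")

    # One repeatable, automated step instead of a manual deployment per box.
    for env in (test, prod):
        deploy_artifact(env, "billing-service", "2.4.1")

    print("environments consistent:", consistent([test, prod], "billing-service"))
```

The point of the sketch is simply that once a step is scripted and idempotent, it stops showing up in the overhead column and stops being a source of configuration drift.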

This relies on developers owning updates and rollouts across the lifecycle, including production, which challenges the way enterprise teams are traditionally organized. It calls for completely new processes that are often a great fit for newly developed applications and modern architectures, but can be a complete misfit (and a regulatory obstacle) for many other core systems.

I'm sure you can see where I am going with this. It is for all of these reasons that I am calling DevOps "the emperor with no clothes". It isn't the panacea people want to believe it is.

Why release automation is bigger than DevOps
The goals of DevOps are sound, however, which leads me to my last point: release automation is bigger than DevOps. I say this because release automation done right can accommodate diverse organizational needs and different types of applications in this new, bimodal world. It improves the software delivery process and its final product by rooting out manual work, improving cycle times, and reducing overhead and rework.
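
As a rough illustration of what "accommodating a bimodal world" could look like, here is a hypothetical Python sketch of one release pipeline model serving two modes: a fast-moving front end that promotes automatically, and a core back-end system that keeps a manual approval gate while everything between the gates stays automated. The apps, stages and gate names are invented for illustration; this is not any specific release automation product.

```python
# Hypothetical sketch of a bimodal release pipeline model. Names are invented.
from dataclasses import dataclass


@dataclass
class Stage:
    name: str
    requires_approval: bool = False  # mode-2 governance gate


@dataclass
class Pipeline:
    app: str
    stages: list[Stage]

    def release(self, version: str, approvals: set[str]) -> None:
        print(f"Releasing {self.app} {version}")
        for stage in self.stages:
            if stage.requires_approval and stage.name not in approvals:
                print(f"  {stage.name}: waiting for approval, release paused")
                return
            print(f"  {stage.name}: automated step executed")
        print(f"  {self.app} {version} is live")


if __name__ == "__main__":
    # Mode 1: a new mobile front end, every stage automated end to end.
    mobile = Pipeline("mobile-frontend",
                      [Stage("build"), Stage("test"), Stage("deploy")])

    # Mode 2: a core back-end system keeps its approval gate, but everything
    # between the gates is still automated, so cycle times still shrink.
    backend = Pipeline("customer-master",
                       [Stage("build"), Stage("test"),
                        Stage("change-approval", requires_approval=True),
                        Stage("deploy")])

    mobile.release("1.8.0", approvals=set())
    backend.release("2024.1", approvals=set())                 # pauses at the gate
    backend.release("2024.1", approvals={"change-approval"})   # proceeds once approved
```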

Crucially, release automation enables incumbent businesses to stay competitive and ensure survival without forcing them to completely abandon their DNA across the board in a "big bang" fashion. Such a process could put the very existence of a large enterprise at risk.

Before signing off on this blog entry, I want to make one last thing clear - I am not trying to downplay the benefits of DevOps as a valid methodology, nor am I denouncing any of the DevOps toolchain technologies out there, which are all welcome advances of course. 2016 is (according to most industry experts) the year in which DevOps adoption goes mainstream, and I fundamentally believe that it is release automation that drives this adoption, not just pure DevOps conversions.
