Devops: Model First, Automate Later

Modeling should be the first step for devops when automating a deployment process

When I was a young software developer I had an interview at a large transportation company. This was when object-oriented principles were still the "thing" and Java hadn't quite yet become the language du jour - but it soon would. Sitting in a rather large conference room with a fairly nice white board I was asked to perform a fairly simple (or so it sounds) task: model a zoo.

Like the much discussed interview puzzle questions of many technology giants today, the exercise was not so much about getting it right (you really can't model an entire zoo in software during an interview) as about giving the interviewer insight into whether or not you understand the basic principles of modeling an environment. Are you able to identify the major "objects" and, more importantly, their relationships to other objects in the system? Are you cognizant of the minor objects that interact with the major objects, and what role they play in daily operations? Can you correctly identify not only the attributes of each object but also the role it performs?

These are the kinds of questions you answer when you're actually modeling a system, and the skill is not unique to software development. In fact, it's probably one of the more important aspects of devops, and one that is often overlooked in favor of focusing on individual tasks.

I had a chance to talk with Dan Gordon at Electric Cloud about "Fail-safe Application Deployments" before the holidays and in reviewing Electric Cloud's white paper on the topic I was reminded how important modeling is - or should be - to devops.

You might recall that Electric Cloud conducted a survey of app developers in June 2012, in which 50% said they had missed an application release date because of issues arising in the deployment process. When asked why, a majority pointed to the complexity of deployment flows (69%) combined with the continued practice of manual configuration (62%) as the culprit.

We know automation can help reduce deployment time and ultimately address errors by enabling more testing, more often. But automating a poor or incomplete process can be as disastrous as not automating at all; it is akin to encrypting application data with SSL or TLS while ignoring that encrypted malicious code or data is still malicious. What devops needs to do, beyond adopting the agile methodologies of development to improve the deployment process, is adopt more of development's principles around design and modeling.

Modeling as a Pre-Requisite

One of the five steps to fail-safe application deployments in Electric Cloud's paper on the topic is automation, of course, but it's not just about automation - it's also about modeling. The paper suggests that the automation technology chosen to assist devops should offer a number of modeling capabilities:

It should offer extensive process modeling capabilities. There are three essential models to consider:

• Application – the 'what'
• Environment – the 'where'
• Workflow execution – the 'how'

The environment(s) should be modeled as well, with details such as:

• Server configuration
• Associated parameters
• Environment configurations
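The three models above can be sketched as plain data structures. This is an illustrative sketch only, with hypothetical names and values; it is not Electric Cloud's actual schema or any particular tool's format:

```python
from dataclasses import dataclass, field

@dataclass
class Application:
    """The 'what': the software being deployed and its artifacts."""
    name: str
    version: str
    artifacts: list[str] = field(default_factory=list)

@dataclass
class Environment:
    """The 'where': servers plus their configuration parameters."""
    name: str
    servers: list[str] = field(default_factory=list)
    parameters: dict[str, str] = field(default_factory=dict)

@dataclass
class Workflow:
    """The 'how': the ordered steps of the deployment process."""
    steps: list[str] = field(default_factory=list)

@dataclass
class Deployment:
    app: Application
    env: Environment
    flow: Workflow

# A hypothetical deployment assembled from the three models.
deploy = Deployment(
    app=Application("billing", "2.1.0", ["billing-2.1.0.war"]),
    env=Environment("staging", ["app01", "app02"],
                    {"db_host": "10.0.0.5", "heap": "2g"}),
    flow=Workflow(["provision", "configure", "install", "smoke-test"]),
)
print(deploy.app.name, "->", deploy.env.name)
```

Even if your tooling never consumes a model like this directly, writing one forces the what/where/how questions to be answered explicitly rather than left implicit in a script.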

Of course Electric Cloud's solutions offer such modeling capabilities. While being able to translate a model into a concrete implementation is always a bonus, it's more important to go through the modeling exercise than anything else. Whether you're using a tool capable of modeling the model, as it were, or you're using scripts or custom developed systems is not nearly as important as actually modeling the deployment process and systems.

Being able to recognize the minutiae in a deployment that are often forgotten is the first step to eliminating missing steps in the deployment process that can cause it to fail. Applications are not islands; they rely on other applications, services, and networking to be deployed successfully. It is often the case that configurations rely upon IP addresses or other options that must be addressed late in the process - well after the actual application is "deployed" on its platform. Modeling the "objects" in a deployment - as well as their relationships - will help ensure that as the process is automated, those relationships and dependent tasks are not overlooked.
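One way to make those relationships explicit is to model deployment tasks as a dependency graph and derive the execution order from it. The task names below are hypothetical; the point is that a late-binding step like configuration (which needs the database's IP address) cannot be overlooked once its dependencies are written down:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must complete before it.
# Configuring the app needs the database's IP, which only exists
# after the database is provisioned - the graph captures that.
tasks = {
    "provision_db":     set(),
    "provision_app_vm": set(),
    "deploy_app":       {"provision_app_vm"},
    "configure_app":    {"deploy_app", "provision_db"},  # needs db IP
    "update_lb_pool":   {"configure_app"},
    "smoke_test":       {"update_lb_pool"},
}

order = list(TopologicalSorter(tasks).static_order())
print(order)
```

Any valid ordering places provisioning before configuration and the smoke test last; an automation tool walking this graph cannot "forget" the dependency the way a hand-maintained script can.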

Modeling doesn't have to be a formal exercise. Though many developers use UML tools or other formalized processes to conduct modeling exercises, devops should feel free to discover tools or processes for modeling that best fit their needs.

A rather large conference room and a whiteboard can be a revealing tool, after all.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
