Back to Square One

A better way to build virtual images

I grew up playing quite a few different sports, both team and individual. For me, there was little else I would rather have done than compete on the field, court, diamond, or course. I loved sports, and I loved to compete (still do, as a matter of fact), so it was a great fit. Partially motivated by a desire to get me to focus on something other than annoying the crap out of my little brother, my parents strongly encouraged my involvement in sports of all kinds. For that, I will always be grateful. Not because I parlayed my athletic experience into a seven-figure contract, flashy cars, and a private yacht (I am still open to those things, though), but because sports taught me many, many lessons. These lessons went far beyond how to make a shot, hit a ball, or return a serve. Many of them applied equally to sports and to 'real life', even if I did not know it then.

Now, there's a chance you grew up in a similar fashion, but if you have gotten this far in the post (I'm sure I dropped some readers after that opening act), you are probably asking yourself what this has to do with anything remotely related to cloud computing. Well, it all goes back to those lessons that sports taught me. When I look at ongoing work that, in many cases, is laying the foundation for cloud-based environments, one of those old lessons jumps out at me: sometimes, you just have to go back to the basics!

While this lesson probably applies to many things going on in the cloud space right now, I want to home in on virtual image construction in particular. Virtual images are nothing new, and many companies have been making use of them for quite some time. Given that, you may be thinking that users and image providers have mastered the art of image construction. If that is your belief, I can only tell you that you are not seeing the same things I am.

In a significant number of cases, the users I talk with who rely on virtual images as the basis for their cloud or enterprise-wide virtualization efforts are flat out struggling to manage their virtual image inventory. Virtual images make environment deployment dramatically more consumable and, relatively speaking, are easy to create. That has proven to be the perfect combination for an explosion in the volume of images a company needs to manage and maintain. Over time, this kind of virtual image sprawl can cripple or completely derail a company's cloud or virtualization efforts.

Now, you may be asking: if virtual image sprawl is the eventual outcome, why adopt virtual images at all? The answer is that sprawl does not have to be the outcome. If you go back to the basics, the basics of effective virtual image construction that is, you can put your company in a good position to avoid a potentially crippling increase in virtual image inventory.

There is an important realization to make when building a virtual image: you cannot capture every piece of an environment's configuration and preserve it in the image. This may seem basic, but it is often the fundamental mistake users make when constructing virtual images. For example, if a user is constructing a virtual image containing a web server, their initial reaction may be to preserve configuration information all the way down to the proxy directives. Baking in that level of detail may make the image highly consumable, in that it requires zero configuration actions after deployment, but it also restricts its use to cases where those exact proxy directives apply. If someone wants to deploy that image with a different set of proxy directives, they either deploy it and perform manual updates or, worse yet, follow the lead of the original image's author and create a new image with the proxy directives they need. Now the company has two images that provide the same basic functionality. Clearly, we have a problem.
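To make that contrast concrete, here is a minimal sketch in Python. The names are hypothetical and the Apache-style directive is used purely for illustration; the point is the difference between a proxy directive frozen into the image and one rendered from deploy-time values.

```python
# Hypothetical illustration: a baked-in directive versus one rendered at deployment.

# What a fully baked image effectively freezes inside its web server config:
BAKED_IN_DIRECTIVE = "ProxyPass /app http://10.0.0.15:9080/app"

# What a reusable image ships instead: a template plus a tiny renderer that the
# activation process runs with deployer-supplied values.
PROXY_TEMPLATE = "ProxyPass {context_root} {backend_url}{context_root}"

def render_proxy_directive(context_root: str, backend_url: str) -> str:
    """Produce the proxy directive for one particular deployment."""
    return PROXY_TEMPLATE.format(context_root=context_root, backend_url=backend_url)

if __name__ == "__main__":
    # Two deployments of the same image, two different directives, zero new images.
    print(render_proxy_directive("/app", "http://backend-a.example.com:9080"))
    print(render_proxy_directive("/shop", "http://backend-b.example.com:9080"))
```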

With that said, the first step in constructing a virtual image should be deciding what to install and capture directly in the image. These things are often obvious: large binaries, software with long-running installations, content common to most classes of the image's users, and so on. The key here is fighting the temptation to stuff more and more content into the image, because doing so usually just restricts its applicability to a narrower set of use cases.

The next step is a bit trickier and takes a little more design work on the part of the image author. Based on what you install into the image, you need to decide what configuration variations image deployers may need. For example, going back to our web server virtual image, different deployers may need different proxy directives in their deployed environments. That amount of variance does not warrant the creation of a unique virtual image, but you do not want to push all of that configuration work onto the user either.
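As a rough illustration rather than a prescription, the deploy-time parameters for a web server image like ours might be sketched as follows; the parameter names and defaults are assumptions on my part.

```python
# Hypothetical sketch of the deploy-time parameters a web server image author
# might choose to expose; names and defaults are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WebServerDeployParams:
    # Proxy directives the deployer wants rendered into the server config.
    proxy_rules: List[str] = field(default_factory=list)
    # Cache behavior; sensible defaults keep a zero-input deployment working.
    cache_enabled: bool = True
    cache_size_mb: int = 256
    # Authentication configuration (for example, an LDAP URL); empty means none.
    auth_provider_url: str = ""
```

A deployer who accepts every default still gets a working environment; a deployer who needs different proxy rules passes them in at deployment time instead of requesting a brand-new image.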

To allow those configuration variations in the deployed environment, you need to identify the input parameters that deployers should be able to pass into the image deployment process. Once they are identified, you need a set of scripts that run at image activation time, act on those input parameters, and apply the desired configuration to the deployed environment. This is not a radical idea. In fact, it is the kind of activation framework model enabled by the Open Virtualization Format (OVF), which lets an image declare properties in its envelope and have their values delivered to the guest at activation time.
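To give a flavor of what "act on those input parameters" can look like, here is a minimal sketch that reads key/value properties from an OVF environment document. It assumes the deployment platform has already made that document available inside the guest; the file path below is illustrative.

```python
# Minimal sketch: read deploy-time properties from an OVF environment document.
# The file path is an assumption; how the document reaches the guest varies by platform.
import xml.etree.ElementTree as ET

OVF_ENV_NS = "{http://schemas.dmtf.org/ovf/environment/1}"

def read_ovf_properties(path: str = "/mnt/ovf/ovf-env.xml") -> dict:
    """Return the key/value properties the deployer supplied at deployment time."""
    root = ET.parse(path).getroot()
    props = {}
    for prop in root.iter(OVF_ENV_NS + "Property"):
        key = prop.get(OVF_ENV_NS + "key")
        value = prop.get(OVF_ENV_NS + "value")
        if key is not None:
            props[key] = value
    return props

# An activation script would call read_ovf_properties() once at first boot and hand
# the resulting dictionary to whatever configuration logic the image author wrote.
```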

For completeness, let's look at what a web server virtual image might look like if constructed according to these concepts. First, we install the operating system and the web server binaries. We may extend this to include other necessary components (e.g., enterprise-wide firewall software), but we do not capture much beyond the basic binaries (little to no configuration). Once the basic components are installed, we identify the input parameters deployers should be able to specify. These may include proxy directives, cache directives, authentication configuration, and more. Once they are identified, we write a few simple scripts that act on the input and configure the web server. We then wrap all of this up in a framework (like the one enabled by OVF). The framework's job is to automatically call our scripts during image activation and ensure that user input flows down to that execution process.
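Pulling those pieces together, an activation script for this hypothetical image could look roughly like the sketch below. The config path, property names, directives, and restart command are all assumptions, not a definitive implementation.

```python
# Hedged end-to-end sketch of an activation script: take deploy-time properties
# (for example, the dictionary returned by the earlier OVF-reading sketch) and
# turn them into web server configuration. All names and paths are illustrative.
import subprocess

HTTPD_EXTRA_CONF = "/etc/httpd/conf.d/deploy-time.conf"  # assumed location

def activate(props: dict) -> None:
    """Apply deployer-supplied properties to the web server configuration."""
    lines = []

    # Proxy directives: one directive per "context=backend" pair, if any were passed.
    for rule in props.get("proxy_rules", "").split(";"):
        if "=" in rule:
            context_root, backend_url = rule.split("=", 1)
            lines.append(f"ProxyPass {context_root} {backend_url}{context_root}")

    # Cache directive, applied unless the deployer switched caching off.
    if props.get("cache_enabled", "true").lower() == "true":
        lines.append("CacheRoot /var/cache/httpd")

    # Write the deployment-specific configuration the base image deliberately omits.
    with open(HTTPD_EXTRA_CONF, "w") as conf:
        conf.write("\n".join(lines) + "\n")

    # Restart the web server so the new configuration takes effect.
    subprocess.run(["systemctl", "restart", "httpd"], check=False)

# Example:
# activate({"proxy_rules": "/app=http://backend-a.example.com:9080", "cache_enabled": "true"})
```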

That is admittedly a very simple look at the process, but I think it provides a nice overview of an effective methodology for virtual image construction. If you are out there creating virtual images, take precautions against the curse of sprawl. I hope that these tips provide you with some ammo in that effort!

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
