Continuous Delivery and Legacy Software | @DevOpsSummit #DevOps #Microservices

Don't Leave Legacy Software Systems Out of the Equation

The future of software releases is clear. Continuous delivery is here to stay. But does that mean that legacy software systems and infrastructure need to be altogether abandoned?

If you polled the people busy redefining best practices today, they'd agree that in a decade we're going to be effortlessly collaborating on highly complex systems that are updated continuously. QA will be fully automated, deployment and infrastructure tasks will be immediate, and large enterprise-wide software projects will be able to move quickly.

In the future, software releases won't feel so "manual." We won't be sitting on 50-person conference calls talking about release-related downtime and asking teams to stay up all night to deliver software. As the software industry continues to mature, creating tools designed to facilitate rapid software delivery, we'll get there, but it's going to take some time.

For Some: The Future is Now

We can see this reality already in a few widely publicized, exceptional cases. Companies like Flickr and Etsy laid the foundations for rapid, continuous delivery of software to production, and as larger businesses adopt DevOps practices we're now a few years into an industry-wide rush toward more agile approaches to release management.

Teams at Twitter and Facebook are pushing to production every single day (and often more frequently than that), but the common attribute among many of the companies moving faster is that they don't have to support the legacy software systems present throughout most established businesses. While everyone agrees that continuous delivery is the goal, it remains to be seen not only how organizations will migrate their newer, greenfield projects, but how they will accelerate their existing, legacy systems.

Don't Leave Legacy Software Systems Out of the Equation

Small and medium-sized businesses in the technology and media industries lead the way toward agile software delivery because they don't have to deal with the technologies you might find at a multi-national bank or a global insurance company. Legacy applications carry with them legacy databases built on technologies that don't lend themselves to instant deployment models, but there are tools and technologies designed to bridge these gaps.

If you run a large, revenue-generating application there's a good chance that you deal with massive databases from IBM or Oracle. It's even more likely if your system supports business at scale that your application depends on several levels of middleware and multiple databases spanning a whole range of vendors and technologies. Until you can bring these systems along for the daily release schedule there's still work to be done.

Databases Can be Agile Too

I'm picking on databases because they are always the toughest aspect of enterprise release management, and they often present the most difficult challenges to moving toward fast-paced release cadences.

When your release timelines become constrained it's always the databases that complicate everything. In a complex release you need to account for setup and teardown time when planning your testing environment's database requirements. In a production-facing software release it is always the database operations that present the most risk. Databases need to be migrated in one-off processes with little room for error, and during some releases database changes almost invariably call for production downtime.
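The "one-off process with little room for error" above is exactly what versioned migration tooling tries to tame. As a minimal sketch (the tables, SQL, and version numbering here are illustrative, not taken from any particular release process or vendor tool), each schema change is applied in its own transaction and recorded, so re-running the script after a partial failure is safe:

```python
import sqlite3

# Hypothetical migrations: each version is applied exactly once, in order.
MIGRATIONS = {
    1: "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)",
    2: "ALTER TABLE orders ADD COLUMN status TEXT DEFAULT 'new'",
}

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any pending migrations; return the resulting schema version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (v INTEGER)")
    current = conn.execute("SELECT MAX(v) FROM schema_version").fetchone()[0] or 0
    for version in sorted(MIGRATIONS):
        if version <= current:
            continue  # already applied; a re-run is a no-op
        with conn:  # each step commits atomically or rolls back on error
            conn.execute(MIGRATIONS[version])
            conn.execute("INSERT INTO schema_version (v) VALUES (?)", (version,))
        current = version
    return current

conn = sqlite3.connect(":memory:")
print(migrate(conn))  # prints 2 once both steps have been applied
```

Real tools such as Flyway or Liquibase follow the same basic shape (a schema history table plus ordered, transactional steps), with far more machinery around rollback and locking.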

The good news is that database vendors and other companies supporting platforms such as Oracle and DB2 are developing novel solutions to make databases more agile. Test environments can now use data that is continuously masked from production data, and vendors are innovating with various approaches to block storage that make it easier to track database changes and perform rollbacks instantly.
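The masking idea above can be sketched in a few lines. This is a toy illustration only, assuming hypothetical field names and masking rules (not the behavior of any specific vendor product): sensitive values are replaced before rows reach a test environment, while deterministic hashing keeps keys joinable across tables:

```python
import hashlib

def mask_email(value: str) -> str:
    """Replace the local part with a deterministic hash so joins still work."""
    local = value.split("@", 1)[0]
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{digest}@example.com"

def mask_record(record: dict) -> dict:
    """Return a copy of a production row that is safe for a test environment."""
    masked = dict(record)
    if "email" in masked:
        masked["email"] = mask_email(masked["email"])
    if "ssn" in masked:
        # Keep the last four digits so support workflows remain testable.
        masked["ssn"] = "***-**-" + masked["ssn"][-4:]
    return masked

row = {"id": 42, "email": "alice@corp.com", "ssn": "123-45-6789"}
print(mask_record(row))
```

Production-grade masking tools add format-preserving encryption, referential-integrity guarantees across databases, and continuous refresh from production, but the core transformation is the same.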

Conclusion: Don't discount legacy databases when it comes to continuous delivery. All signs are pointing toward a future that incorporates more agile data management practices that will facilitate continuous delivery models.

Follow Old Advice: Divide and Conquer

Anyone in charge of an IT budget understands that you have to choose your battles wisely. Upgrading every system in the department at once would be easy if we had an infinite budget and the ability to hire at the snap of a finger. Needless to say, this isn't how most organizations work. In reality our resources are constrained, and, budget aside, the limiting resource is often people. It's tough to hire good enterprise release managers, and once you hire a good ERM it takes months (or years) for them to gain enough experience to truly manage across your enterprise.

You are not going to move your entire enterprise to a continuous delivery cadence in a year. You are going to move your enterprise to a more continuous delivery cadence over a number of years, and you should create a multi-year plan to do this in phases. In the first stage, select the most difficult component (often your relational databases) along with the easiest component (one of your newer web applications). Focus your efforts on creating a model for other teams to follow once you've successfully moved to a continuous release cadence, and use these initial projects as a chance to experiment with what works and what doesn't.

Conclusion: Don't try to move everything to continuous delivery at once. Choose your battles and understand how to support mixed release models (with Plutora).

Don't Fight Uphill: Better Yet, Don't Fight at All

When you are acting as a change-agent in the enterprise it will be tempting to talk about continuous delivery as a "revolution." This is an easy trap to fall into because technology is in an almost constant state of reinvention. At almost every large company the pattern is the same: every few years the "old" platform is replaced by a "new" platform. In the process of selling a rewrite, the organization may have brought in new management or gone through a reorg in an effort to get it right.

The secret of almost every large IT organization is that the people maintaining those legacy databases occupy the "high ground," and they've seen your rewrite before. The technical experts in charge of your company's critical Oracle databases have heard the arguments for moving faster several times over several years, and they are more than willing to help... but not if you keep talking about them as "old and antiquated."

Don't frame your movement toward continuous delivery as a "revolution" aimed at "fixing bad, legacy practices." In fact, don't use the word "legacy" at all. When you use that term you immediately tell the bulk of your IT department that what they are working on is worthless, and you'll be fighting an uphill battle against people determined to prove you wrong.

Conclusion: Don't use battle metaphors. Don't fight. Enlist the help of your existing staff and engage them in an effort to improve an existing system to make it more amenable to continuous delivery.

Future: The Right Tools, The Right Approach

This future of rapid software delivery supported by pervasive automation is a goal we'll achieve in the next decade, but getting there is going to require a lot of organization and planning. To support the transition, use a tool like Plutora to manage projects as you move to continuous delivery while taking your legacy software systems along for the ride.


More Stories By Plutora Blog

Plutora provides Enterprise Release and Test Environment Management SaaS solutions aligning process, technology, and information to solve release orchestration challenges for the enterprise.

Plutora’s SaaS solution enables organizations to model release management and test environment management activities as a bridge between agile project teams and an enterprise’s ITSM initiatives. Using Plutora, you can orchestrate parallel releases from several independent DevOps groups all while giving your executives as well as change management specialists insight into overall risk.

Supporting the largest releases for the largest organizations throughout North America, EMEA, and Asia Pacific, Plutora provides proof that large companies can adopt DevOps while managing the risks that come with wider adoption of self-service and agile software development in the enterprise. Aligning process, technology, and information to solve increasingly complex release orchestration challenges, this Gartner "Cool Vendor in IT DevOps" upgrades enterprise release management from spreadsheets, meetings, and email to an integrated dashboard giving release managers insight and control over large software releases.
