Your Car and the Future of Software Delivery | @ThingsExpo #IoT #M2M #ML #DevOps

The volume of code running in cars is representative of the central & growing role software plays in the automotive industry

If you're looking to predict how things should work in the future, start by looking in the right places in the present. Innovations in technology, management and collaboration that will change the way we work are already up and running in visionary organizations. Ever since reading Robert Charette's IEEE article "This Car Runs on Code," I've been fascinated by the fact that the 100 million lines of code in a modern car represent one of the most complex software artifacts we interact with day-to-day. That falls just above the number of lines of code you will find in Mac OS X, and in the same order of magnitude of complexity as the DNA of a mouse - as artfully illustrated by David McCandless on the Information Is Beautiful blog.

The volume of code running in cars is representative of the central and growing role that software plays in the automotive industry. The 2009 IEEE article highlighted a new "BMW Assist" system, which uses data from the car's airbag, engine and other control units, along with cellular communication and GPS, to inform emergency response teams of crash location and injury severity. Recently, I saw a post by a mother praising the BMW Assist system (along with other technological innovations in the i8) for saving her son's life.

Likewise, Volvo has taken the potential that software and sensors provide to promise "death-proof cars" by 2020. These trends, along with autonomous driving, are creating the need for a whole new scale of software-centric innovation and are expanding the software in tomorrow's cars beyond 200 million lines of code. Software is becoming the most expensive part of the car. And this trend goes beyond cars to increasingly smarter devices across the board.

Large-scale software delivery is one of the most challenging and most important endeavors an organization can undertake. We've seen the most trivial software delivery mistakes cause business calamities. Things get even more interesting when hardware and software are mixed. For example, a software update to the Nest thermostat left people unable to heat their homes during one of the coldest weekends of 2015. The bottom line is that organizations that master large-scale software delivery will thrive, while those that get trapped in its pitfalls will fall further and further behind.

Visibility into the software supply chain, automated reporting across individual boundaries and real-time flow across the software delivery value stream are critical to delivering the benefits of lean manufacturing to software. The "Industry 4.0" initiative is starting to force a connection between lean manufacturing and lean software delivery. It's at this intersection of a mature lean discipline and a new one that we're learning some of the key lessons for the future of software delivery. I've summarized a few below.

Connect the Software Supply Chain
The world around us is transforming into a set of Internet of Things (IoT) devices with microprocessors and sensors, including the world within the car. Automobile parts come from dozens of suppliers, and all of those microprocessors are running more and more code. This has transformed a hardware and part-centric supply chain, which the world learned to manage via lean manufacturing principles originating from the Toyota Production System (TPS), to a software supply chain.

Managing a software supply chain requires managing the lifecycles of numerous applications across company boundaries. To make this management possible, you need tool support. Although sometimes it's feasible to make everyone use the same supply chain management system (demonstrated by the success of the Android ecosystem, which led its members to standardize on Git), it is not possible to make suppliers use the same requirement, defect and issue tracking tools, because those tools tend to be specific to the development platform and the size of the company.

As a result, a new layer of integration infrastructure is required to connect the planning and tracking layer of the ecosystem. Without it, the speed of delivery is limited by the inefficiency of sending spreadsheets of requirements and defects around via email. When an integration hub is put in place to connect suppliers, a lean software supply chain becomes possible. For instance, as soon as a defect is found in a test drive or simulation, that defect can be routed in real-time to the right software supplier. As soon as the supplier commits a fix, and updates the workflow status of the defect in its issue tracker, the simulation or test drive can be rescheduled.
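The routing step described above can be sketched in a few lines. This is a hypothetical illustration, not Tasktop's actual product: the component names, tracker identifiers and the `Defect` shape are all invented, and a real hub would also translate field schemas and workflow states between tools.

```python
# Hypothetical sketch of an integration hub routing a defect to the right
# supplier's tracker. All names and the routing table are illustrative.

from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: str
    component: str
    summary: str

# Illustrative mapping: which supplier's tracker owns each software component.
ROUTING_TABLE = {
    "infotainment": "supplier-a-jira",
    "airbag-ecu": "supplier-b-polarion",
}

def route_defect(defect: Defect, routing_table: dict) -> str:
    """Return the tracker that should receive this defect, falling back
    to an internal triage queue for unmapped components."""
    return routing_table.get(defect.component, "internal-triage")

defect = Defect("D-1042", "airbag-ecu", "Sensor timeout during cold start")
print(route_defect(defect, ROUTING_TABLE))  # supplier-b-polarion
```

The point of the fallback queue is that a lean flow must never silently drop an artifact: a defect for an unmapped component still lands somewhere visible for triage.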

When you consider the bottleneck that managing tens of thousands of requirements across millions of lines of code via email and spreadsheets creates, there's a clear 10x efficiency and speed gain to be had.

Gain Visibility Across the Software Supply Chain
Connecting the software value stream across suppliers enables efficiency because artifacts like requirements and defects are moving in real-time, instead of being batched up and becoming bottlenecks. Equally important as gaining efficiency is a gain in visibility - to show how software development is proceeding across the software supply chain. Without visibility, it is impossible to identify bottlenecks and apply the same continuous improvement that transformed manufacturing to the world of software.

When your software suppliers are not connected to your organization's lifecycle, you are relying on slow, manual, error- and opinion-prone methods of reporting. When that connectivity is automated, it becomes easy to see that a particular software component is causing a disproportionate number of defects or performance problems, and to quickly adapt. This is as important for traditional supplier relationships as it is for open source dependencies.

For example, if a security issue is raised in the issue tracker of an open source component that you depend on, and that security defect does not immediately appear within your organization's own lifecycle tools, you are much more likely to release a component with that vulnerability. Forward-thinking automotive manufacturers are teaching us that visibility and continuous improvement are needed not just across an organization's developers and IT staff, but across the entire software supply chain.
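The check described in the paragraph above can be sketched as a simple sync step. This is a hedged illustration only: the issue and link structures are invented, and a real implementation would query the upstream tracker's API rather than operate on in-memory lists.

```python
# Hypothetical sketch: find security-labeled issues in an upstream open
# source tracker that have no mirrored counterpart in the internal
# lifecycle tool. Data shapes are invented for illustration.

def unmirrored_security_issues(upstream_issues, internal_links):
    """Return upstream issues tagged 'security' with no internal counterpart."""
    mirrored = {link["upstream_id"] for link in internal_links}
    return [
        issue for issue in upstream_issues
        if "security" in issue["labels"] and issue["id"] not in mirrored
    ]

upstream = [
    {"id": "OSS-77", "labels": ["security"], "title": "Buffer overflow in parser"},
    {"id": "OSS-78", "labels": ["enhancement"], "title": "Faster startup"},
]
links = [{"upstream_id": "OSS-50"}]  # OSS-77 is not yet mirrored internally

for issue in unmirrored_security_issues(upstream, links):
    print(issue["id"], issue["title"])  # OSS-77 Buffer overflow in parser
```

Run on a schedule, a step like this is what turns "we should have known about that CVE" into a defect that appears on the right backlog the day it is disclosed.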

Automate Requirements Traceability
Requirements traceability is a critical, and often regulatory, requirement for devices that we fly or drive around in. But it is notoriously difficult and expensive to achieve, resulting in the "traceability gap." This gap causes additional non-value-add work and rework to connect requirements to defects to tests to builds, and so on, as things change. With the pace of software delivery today, change is the only constant.

The problem is not the change itself, but the disconnected nature of the change. For example, when developers change a line of code, they generally know the requirement they are working on and the release the change will go into. But they tend not to update three or more systems voluntarily when making that change. By creating an integration layer that connects the creation or update of any artifact to the downstream and upstream artifacts, such as requirements and builds, it is possible to completely automate requirements traceability.
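One common way to automate a piece of this, sketched below under stated assumptions: derive requirement-to-commit trace links by scanning commit messages for requirement IDs. The "REQ-<number>" convention and the commit structure are assumptions for illustration; an integration layer would typically do this at commit time via the version control system's hooks or API.

```python
# Hypothetical sketch: build a requirement -> commits trace matrix by
# scanning commit messages for IDs. The "REQ-<number>" pattern is an
# assumed convention, not a standard.

import re

REQ_PATTERN = re.compile(r"\bREQ-\d+\b")

def trace_links(commits):
    """Map each requirement ID mentioned in a commit message to the
    commits that reference it."""
    links = {}
    for commit in commits:
        for req_id in REQ_PATTERN.findall(commit["message"]):
            links.setdefault(req_id, []).append(commit["sha"])
    return links

commits = [
    {"sha": "a1b2c3", "message": "REQ-101: clamp brake sensor input"},
    {"sha": "d4e5f6", "message": "Refactor logging (no requirement)"},
    {"sha": "0a9b8c", "message": "REQ-101 REQ-230: update diagnostics"},
]
print(trace_links(commits))
# {'REQ-101': ['a1b2c3', '0a9b8c'], 'REQ-230': ['0a9b8c']}
```

Because the links are derived from data developers already produce, no one has to remember to update a second or third system - which is exactly why this kind of traceability can be complete where manual traceability never is.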

I know this firsthand, as it's exactly what we've done at Tasktop. Not only does that mean that audits of our R&D require almost no effort, but our delivery is actually much more productive because developers can instantly access the code relevant to a changed requirement, for example, from one of our 10 OEM partners. What's even more exciting is that we are now starting to apply that same traceability and linking automation to large-scale automotive and manufacturing delivery, where the gains will be even larger.

Apply DevOps Principles to Systems Engineering
The lessons discussed here have been about applying scaled Agile and lean principles to software delivery. The aspect not discussed yet is connecting the build, test and deployment parts of the software lifecycle to reduce not just the development time, but the overall cycle time. One challenge with manufacturing is how different the development environment is from the environment in which the deployed product is tested and used. If you're developing a web application, your operational and test environments are almost identical. A combination of VMs or, better yet, Docker containers, along with some service virtualization and test data automation, means that you can have an automated layer for finding defects and then deploying to production.

Contrast that with a car, where the production environment could be flying down the road at 100 miles per hour, disconnected from any network. But the principles of DevOps still apply, in that the more you can automate the connectivity and process of testing and deployment, the more successful you will be. The grand challenge becomes creating a virtual environment where the principles of DevOps can apply to manufacturing. This has to go well beyond the test automation that we do for IT projects, which is why simulation is such an important trend in manufacturing. Once the production environment is simulated, it is possible to gain the velocity and cycle-time gains of DevOps for embedded systems and devices. At Tasktop, for example, when a build fails because an Agile tool vendor has just changed the semantics of an API call in its latest point release, a defect is instantly created on the backlog of the team that supports that connector.

By virtue of having created our Integration Factory, a simulation environment for Agile/SDLC/DevOps data and tools, we now measure a three-day Mean Time to Resolution (MTTR) from a defect being discovered in a customer's on-premises environment to an updated build being in the customer's hands. The potential that this kind of simulation and connected lifecycle integration has for transforming complex manufacturing is tremendous.
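The MTTR metric cited above is straightforward to compute once defect open and resolution timestamps live in connected tools. A minimal sketch, with invented field names and sample data (not Tasktop's actual numbers):

```python
# Hypothetical sketch: compute Mean Time to Resolution (MTTR) in days from
# defect open/close timestamps. Field names and data are illustrative.

from datetime import datetime

def mttr_days(defects):
    """Average (resolved - opened) across resolved defects, in days.
    Defects still open are excluded from the average."""
    durations = [
        (d["resolved"] - d["opened"]).total_seconds() / 86400
        for d in defects if d.get("resolved")
    ]
    return sum(durations) / len(durations) if durations else 0.0

defects = [
    {"opened": datetime(2016, 3, 1), "resolved": datetime(2016, 3, 4)},
    {"opened": datetime(2016, 3, 2), "resolved": datetime(2016, 3, 5)},
    {"opened": datetime(2016, 3, 7), "resolved": None},  # still open
]
print(mttr_days(defects))  # 3.0
```

The value of the metric comes less from the arithmetic than from the connected lifecycle behind it: without automated links between the customer environment, the tracker and the build system, the timestamps themselves would be unreliable.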

The automotive industry is once again back at the forefront of technological innovation, and poised to have a very positive impact on everything from our safety to the shape of our cities in the coming decades. Effective large-scale software delivery is the discipline that will determine the success and the timing of these changes.

More Stories By Mik Kersten

Dr. Kersten is the CEO of Tasktop Technologies, creator and leader of the Eclipse Mylyn open source project, and inventor of the task-focused interface. His goal is to create the collaborative infrastructure to connect knowledge workers in the new world of software delivery. At Tasktop, Mik drives Tasktop’s strategic direction, key partnerships, and culture of customer-focused innovation. Prior to Tasktop, Mik launched a series of open source tools that changed the way software developers collaborate. As a research scientist at Xerox PARC, he created the first aspect-oriented development tools for AspectJ. He then created the task-focused interface during his PhD thesis and validated it with the release of Mylyn, now downloaded 2 million times per month. Building on the success of Mylyn, he created the Tasktop Dev and Sync product lines.

Mik's ideas on Application Lifecycle Management (ALM) and focus on individual knowledge worker needs make him a popular keynote speaker; he has been recognized with awards such as the JavaOne Rock Star and the IBM developerWorks Java top 10 writers of the decade. Mik's entrepreneurial contributions have been acknowledged by the 2012 Business in Vancouver 40 under 40, and as a World Technology Awards finalist in the IT Software category. Building on his contributions as one of the most prolific committers to Eclipse, he serves on the Eclipse Foundation's Board of Directors and web service standards bodies.
