
Virtualization for Deeply Embedded Applications

Virtualization has penetrated far into the enterprise; now it's begun the march into portable electronics:

Readers of Virtualization Journal know that virtualization provides enormous benefits to makers and users of computing platforms ranging from desktops to servers and even supercomputers. The reasons for this are now obvious: cost savings through server consolidation, reduced administrative costs, and greater flexibility. Less obvious may be the degree to which virtualization can benefit deeply embedded applications such as cell phones, networking equipment, and point-of-sale terminals.
 
While there are similarities in some of the value propositions involved, there are also substantial differences due to the more challenging timing and resource budgets of embedded devices. Real-time processing in embedded applications puts a premium on low-latency, highly deterministic approaches to hypervisor design, while the available volatile and non-volatile memory is smaller, often by orders of magnitude, than that available in even a low-end desktop machine.
 
The virtualization technique most often used in enterprise or desktop computing is known as "full" or "native" virtualization. In this approach, the guest OS and its applications run unmodified; each privileged instruction they execute, instead of being executed directly by the underlying hardware platform, is trapped and processed by software that fully emulates that hardware. This allows for the greatest flexibility in hosted software, as essentially any and all software should, in theory at least, run unmodified.
 
Unfortunately, this approach carries a relatively large memory and processing overhead. In the enterprise space some of that overhead has been reduced by Intel's and AMD's inclusion of hardware virtualization support, but the system overhead is still significant. In the embedded space such hardware support is considerably less mature, and the spare processing headroom is typically not there. While it is typical in an enterprise computing context to have "room for growth" by virtue of more memory or speed than is strictly required at the time of purchase, in an embedded context this is more often than not labeled "waste" and not tolerated.
 
In order to get around this issue, most commercial embedded virtualization vendors have adopted a technique known as "paravirtualization." In paravirtualization, the operating system and device drivers must be modified to take advantage of the characteristics of the hypervisor or Virtual Machine Monitor (VMM). In this modification, direct accesses to hardware are replaced by API calls to the hypervisor. Since the analysis of which instructions must be managed, and just how they should be managed, has all been done during the system's design and development phase, no run-time instruction trapping or analysis is required. As a result, the performance overhead of operating virtual machines in a paravirtualized system is often lower by orders of magnitude than with full or native virtualization. It also means that, since the hypervisor essentially owns hardware access, security between different virtualized domains is much greater, and systems can be built in a more robust fashion.
 
Why Should I Virtualize My Cell Phone?
I often wonder what the conversations were like years ago when microcontrollers were a new concept and customers would ask just what could be done with such a thing. Most of the now-common applications such as engine controls, GPS units, and cell phones would have seemed like so much science fiction. But once the basic building blocks were well understood by designers, applications began to come out of the woodwork and the microcontroller became just another generally accepted tool, leading by stages to exactly those applications.
 
With virtualization we're essentially at that same very early stage, where designers may have heard of the technology but haven't fully internalized that they have another tool in their toolkit. The question now is more along the lines of "what can be done with lots of virtual processors?"
 
When looking at the architecture of a cell phone, as often as not there's a baseband processor that runs the actual communications, and a separate applications processor that handles graphical display, multimedia, and other processing that's not core to the phone's basic functionality. Using virtualization, it's very straightforward to integrate both the apps processing and the radio stack on the same physical device, saving BOM cost and considerable development time.
 
Another area of study is how to support handset functionality in a robust fashion while still providing a degree of openness. The Open Handset Alliance's "Android" platform attempts to answer the "openness" aspect, but does little to nothing to preserve the integrity of the handset, a critical issue for carriers. Using virtualization, it is possible to create highly secure and independent profiles for the basic phone function and for the user, creating flexibility and preserving the integrity of the handset against malware or simple user error. The Open and Secure Terminal Initiative (OSTI) is a good example of this approach (http://www.nttdocomo.co.jp/english/corporate/technology/osti/).

More Stories By Frank Altschuler

Frank Altschuler is in charge of marketing for Trango Virtual Processors, a leading provider of embedded virtualization IP. He has just recently joined Trango from Newisys where he was in charge of marketing for their X86 scaling solutions. He has previously held marketing positions at Starcore LLC, a DSP Intellectual property firm, and Cirrus Logic, a fabless semiconductor company. Prior to moving into marketing, Altschuler spent 15 years in engineering design and development in areas such as communications and electro-optics.
He has earned a bachelor's degree in electrical engineering from North Carolina State University. For more information on Trango Virtual Processors, please visit http://www.trango-vp.com or email [email protected]


