
Virtual Computer Releases NxTop 1.1 with System Workbench™

Virtual Computer Inc., the company redefining PC lifecycle management through virtualization, today announced the availability of NxTop 1.1. Since it was first released for sale in April 2009, NxTop has differentiated itself with its ability to deploy a single, centrally managed Windows desktop environment to all users while maintaining user-specific personalization on each PC. Release 1.1 brings further innovation in this area by introducing System Workbench™. This new element of NxTop’s award-winning management system employs file system layering to isolate elements of the system, improving backup performance, retaining end-user installed programs and settings, and giving desktop IT managers tools to manipulate the file system and registry. NxTop 1.1 is available immediately through the company’s NxTop Now! program, which gives early adopters access to the product prior to general availability.

“Adoption of desktop virtualization is hampered if the IT department cost savings come at the expense of end-user flexibility and convenience,” said Dan McCall, president and CEO, Virtual Computer. “Techniques such as virtual disk segmentation, user profile virtualization, and file system layering are the preferred methods of overcoming this challenge. With the release of System Workbench, Virtual Computer leads the market in retaining end-user personalization in shared virtual images.”

Shared Image Management

Since its inception, NxTop has allowed IT administrators to build a single Windows virtual image that can be shared across their entire organization. Updates to this shared image are performed centrally on the NxTop Center management console. At boot time, NxTop presents a Windows desktop to the end-user that is a composite of the latest shared system image, user-specific profiles and settings, and any non-permanent PC data such as caches and index files.

With System Workbench, IT administrators can now control which aspects of their “gold” operating system image may be customized and retained by the end-user. Through a policy-based interface with a simple XML-based authoring language, System Workbench provides powerful new capabilities, such as:

  • A framework for whitelisting applications that end-users may install into a persistent layer of a shared image, so that those applications survive a self-cleaning reboot or an IT-generated system update.
  • Granular, policy-based control to map files and directories onto different layers of the virtual file system. This allows, for example, large but non-essential user files, such as Outlook OST cache files and Windows index files, to be excluded from NxTop’s automated user data backups. It also lets the system retain customized user settings for poorly designed programs that store such information in system folders instead of the user’s profile area.
  • Manipulation of programs, data files and settings for system features such as offline folders, file sharing, and antivirus databases, so that they survive patching of shared operating system images.
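The press release does not publish System Workbench’s actual policy schema, but a layering policy of the kind described above might look something like the following sketch. Every element and attribute name here is hypothetical, invented purely to illustrate the ideas of whitelisting into a persistent layer, excluding caches from backup, and remapping misplaced settings onto the user layer:

```xml
<!-- Hypothetical sketch only: element and attribute names are invented
     for illustration and are not the actual System Workbench schema. -->
<policy name="example-layer-policy">

  <!-- Keep a user-installed application in a persistent layer that
       survives self-cleaning reboots and IT-generated system updates. -->
  <whitelist>
    <application path="C:\Program Files\ExampleApp\" layer="persistent"/>
  </whitelist>

  <!-- Exclude large, rebuildable caches from automated user data backups. -->
  <backup>
    <exclude path="%LOCALAPPDATA%\Microsoft\Outlook\*.ost"/>
    <exclude path="C:\ProgramData\Microsoft\Search\Data\"/>
  </backup>

  <!-- Map settings that a poorly designed program writes into a system
       folder onto the user layer, so they persist across image updates. -->
  <map path="C:\Windows\LegacyApp.ini" layer="user"/>

</policy>
```

The value of a declarative policy like this is that the layer mapping lives alongside the shared image: IT can patch the “gold” image centrally while the policy preserves per-user additions and discards disposable data.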

In addition to System Workbench, the NxTop 1.1 release includes:

  • NxTop Engine performance enhancements, including near-native network speed via paravirtualized networking.
  • Simplified Microsoft Active Directory configuration and testing, making integration into Microsoft environments a snap.
  • Enhanced user backup capabilities, including optimized data transfer, management of restore points, and compression of virtual hard disks to improve performance and disk utilization.
  • Improved wireless support, including WPA/WPA2 Personal across all major wireless chip sets.
  • Improved scalability and performance in NxTop Center, including background task processing and a 50 percent reduction in image preparation time.

“While client-side hypervisor technology has become one of the hottest IT topics of 2009, true adoption will only occur once management solutions begin using client hypervisors to enable new use cases for IT administrators and PC end-users,” said Michael Rose, industry analyst, enterprise virtualization software, IDC. “NxTop was the first to deliver on this promise, and version 1.1 provides an even broader set of capabilities aimed at reducing PC management costs.”

About Virtual Computer, Inc.

Virtual Computer, Inc. is redefining PC lifecycle management by making it as easy to manage a thousand PCs as it is to manage one. NxTop™, the company’s flagship PC management product, combines a bare-metal client virtualization platform with a powerful central management system to dramatically reduce PC management costs, while improving reliability, security, and the end-user experience. NxTop uses advanced virtualization technology to isolate the main components of a PC: the hardware, operating system, applications, and user data, allowing each to be managed independently. Founded in 2007, Virtual Computer is privately held and headquartered in Westford, MA. For more information visit us at http://www.virtualcomputer.com.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
