Is application virtualization now a necessity?

by James Rankin



Microsoft’s move to Windows 10 and Server 2016 puts enterprises into a state of “perpetual migration” that brings a whole raft of challenges to enterprise IT departments – among them the question of whether application virtualization is now a necessity. In a previous article, I discussed this state and some of the ways it can be dealt with.

One of the approaches that allow you to cope with the “perpetual migration” model is the adoption of a toolset that assists with the migration process. From an enterprise standpoint, this method has the advantage of being possibly the least resource-intensive, once the initial process has been successfully laid down and tested.

The toolset mainly consists of solutions that abstract the most important areas of the user experience away from the operating system, decoupling them so that in a “wipe-and-reload” situation, they are quickly and readily accessible. The three main areas that users need access to are: applications, data, and the various components that make up the user profile. If all of these can be reliably provisioned to a new endpoint, then the user will suffer little or no disruption during the regular operating system upgrades that all of Windows 10’s servicing branches bring.

Indeed, even on the Long Term Servicing Branch, where each release is supported for up to ten years, a new release is expected roughly every two to three years, and if you wish to take advantage of the new features, you will be adopting a regular upgrade process. Safe to say – on any of the servicing branches, there is the potential for a lot of churn, and a lot of operating system upgrades and reloads.



Applications, data and user profiles – which is most important?

It’s a tie between applications and data – the profile, whilst essential to providing a smooth and non-disruptive user experience, cannot function without the other two. Applications and data exist in very much a symbiotic relationship – each is pretty useless without the other. We will discuss methods of persisting data and data access in a later article – for now, let’s talk about ways of abstracting the applications from the operating system.

You could simply use a tool such as System Center Configuration Manager (SCCM) to reapply the user applications as part of a task sequence, but that not only increases the time and complexity of the upgrade, it also relies on setting up the required variables to deliver particular applications to particular users. Also, each application may require substantial “first-run” configuration, or may not work alongside other natively-installed applications.

A more precise tool than a traditional Windows deployment system is usually needed to ensure smooth and pain-free application delivery.

Different approaches to application virtualization


There are tools like App-V available, which has been around for a long time and finally, with the Windows 10 Anniversary Update, is baked into the operating system, requiring only a simple activation. However, technologies like App-V were born in the days when application isolation was paramount – the days of “DLL hell”, where compatibility issues were as common as houseflies.
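As a quick illustration – assuming a Windows 10 Anniversary Update (version 1607) or later Enterprise machine – that “simple activation” is a couple of PowerShell cmdlets run from an elevated session:

```powershell
# Check whether the in-box App-V client is currently enabled
Get-AppvStatus

# Enable the in-box client (Windows 10 1607+ Enterprise/Education; a restart is required)
Enable-Appv

# Confirm the change after the restart
Get-AppvStatus
```

No separate client installer or sequencer download is needed for the client side – the bits ship with the OS and simply need switching on.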

Although version 5 of App-V has improved integration with the operating system significantly, isolation technologies such as App-V feel slightly antiquated when deployed into modern enterprise IT environments. There is also the not-insubstantial point that App-V packaging – the process of wrapping up and delivering applications – can be exceedingly complicated, particularly if the initial packaging attempt fails. In these cases, App-V administrators can find themselves in a “deep troubleshooting” mode that can go on for days or even weeks, with no guarantee that the application will ever be successfully packaged.


Newer technologies such as VMware AppVolumes and Unidesk go completely to the other end of the scale, being concerned with deployment, not isolation. “Stacks” or “layers” of applications are folded into the operating system, and can be attached or detached on-the-fly, making provisioning of new applications (or removal of old ones) an absolute breeze.

Logoffs and reboots around application delivery become a thing of the past – updates to applications can be applied much faster. But whereas these technologies are fantastic for modern Windows applications, just about every enterprise has a reliance on at least some legacy applications (or just plain badly-developed ones!) that don’t play nicely with the rest, and allowing them to integrate with the rest of the layer or stack isn’t a great idea from a stability perspective.

In these cases, tech like AppVolumes or Unidesk is often combined with some isolation or siloing technology, such as the aforementioned App-V or Citrix XenApp. But this doesn’t provide a satisfactory or scalable solution, as the enterprise applications are split into two delivery methods, both requiring maintenance, infrastructure and resource.

Are there any application delivery methods that can sit in the middle of isolation and integration, and give us the best of both?

Application delivery methods

There are indeed, and the three worth mentioning are Numecent Cloudpaging (formerly Application Jukebox), Cloudhouse, and Turbo (formerly Spoon). Each of these can mix and match isolation and integration, and each makes producing application packages incredibly easy compared with other packaging technologies – even for old, incompatible or downright poorly-coded legacy apps. Indeed, Cloudpaging offers four separate levels of isolation/integration – and you can even mix and match these levels within the packages themselves.

Cloudhouse and Turbo also support saving packages into online repositories – in the case of Cloudhouse, directly into Microsoft Azure, if required. There are on-premises versions of their offerings as well, if online isn’t suitable. All of these technologies support pre-staging of applications onto endpoints, and once there, launch times are the same as natively-installed versions.

Which solution is best for you?

Application virtualization is one of the key technologies you can use to smooth the road when dealing with Windows 10 and its aggressive OS update schedule. Indeed, when the enterprise of the future means onboarding new features and technologies very quickly, I’d say it has almost become a requirement – for all but the most ubiquitous of applications. Outside of the Microsoft Office suite, pretty much every other application at the customers I work with is a good candidate for virtualization.

There are solutions at both ends of the scale when it comes to application virtualization, but for your average enterprise with a mix of old and new applications, adopting one of the “mid-range” vendors has always been a sound bet. Numecent, Cloudhouse and Turbo all provide compelling technology in this space, and using one of them to handle application delivery is a step towards the agile Windows-10-based enterprise of the future.

James Rankin


James is an independent consultant specializing in user and application virtualization technologies. When not working on projects for customers across the globe, he can be found writing technical articles, blogging, and speaking at conferences and user groups. He regularly writes for Source One Technology in Milwaukee.
