2.1 A brief history
When considering the history of hosted desktops, it is important to examine the birth of virtual machines within their larger cloud computing context. Computer development gained considerable ground over the course of the 1960s, but the machines of the era could still only execute one program at a time. Computational tasks therefore had to be queued and processed in batches, a time-consuming endeavour.
In 1963 the Massachusetts Institute of Technology (MIT) began developing Project MAC (Multiple Access Computer) with a $2 million grant from the Advanced Research Projects Agency (ARPA, now DARPA). MAC aimed to research computer processes in greater depth, particularly in the areas of artificial intelligence, computational theory, and operating systems. A portion of this research was directed at creating computers that could complete multiple tasks at once and support more than one simultaneous user. In response to this research, International Business Machines (IBM) developed the CP-67 system, the control program that made its mainframes the first to support virtualisation.
Running a user's desktop operating system inside a virtual machine did not appear until the late 1990s, when VMware introduced it in 1999 with its first product, VMware Workstation. The technology as we know it today did not become mainstream, however, until VMware released its dedicated virtual desktop product line in 2007. Hosted virtual desktops have since gone on to be utilised by thousands of businesses worldwide.
2.2 Why were they introduced to the market?
When virtualisation was first conceptualised in the mid-1960s, computers were useful but, as previously explored, limited: they could only perform one task at a time, with work queued in batches. Software and hardware virtualisation was developed over the following decades to divide large mainframe computers into manageable entities, allowing multiple users to connect to networks and shared sets of data at the same time. It was believed that by partitioning the mainframe and making use of virtual machines, businesses could better focus resources and increase efficiency.
Virtualisation was also intended to increase security and stability by removing dependence on a single device. Previously, data had to be stored on individual machines, and if a machine was compromised, whether by human error or natural disaster, the data was lost. By storing data on a secure shared network and granting access through virtual machines, businesses could let employees reach that data from their separate operating systems, which all but removed the likelihood of corruption or loss.
2.3 How have hosted virtual desktops improved?
The technology underpinning machine and desktop virtualisation saw no significant changes between the 1960s and the 1990s, beyond the ability to access centrally stored data from multiple devices.
When the technology became somewhat more popular in the late 1990s and early 2000s, it still had to be painstakingly managed at every level, down to the code that processed individual actions. In 2001, the growing migration of business files and processes online created complex new security threats with potentially lasting consequences. Many businesses rushed to adopt cloud technology as a security tool, since files stored in the cloud could be protected from external and internal threats by controlling and securing access centrally over an internet connection, rather than leaving data exposed on individual machines.
Virtualisation gained real traction after the Sarbanes-Oxley Act was passed in 2002. The law, a response to a series of high-profile corporate accounting scandals, introduced a strict set of management responsibilities surrounding data retention and security. Providers therefore expanded hosted desktop functionality as a security and compliance tool, and marketed it as such.
Today, industry-leading companies such as Citrix and VMware continue to release product revisions and updates, with the two companies releasing six major versions between them since 2009. This reflects an ongoing need to adapt the technology to the needs and wants of the customer base. Friendlier user interfaces, competitive pricing, disaster recovery technology, and bring your own device (BYOD) integration have all featured in recent versions of hosted desktop software.