Anyone who has been in this industry for the last 20-25 years knows what it was like to spend your Sunday morning running complex setup routines to build classrooms in which each student had a unique system, the instructor had shared resources, and everyone played a unique role. Setup took a long time. Resetting a room was impossible, and sharing a room was difficult. I remember one training centre using detachable hard drives to swap class images, hoping they did not break between classes.
The use of virtual machines changed all that. Back in 2001, I was part of a small team tasked by what was then Microsoft Learning with ‘transforming’ a few high-demand courses, moving them from the traditional distributed deployment model to virtual machines as the delivery vehicle. At the time, we used the newly acquired Connectix Virtual PC and rewrote a few courses to be delivered on virtual machines.
Instructors, training centres, and students loved it, and the move toward virtual machines as the delivery vehicle for software training was in full swing. The following years saw great creativity on the part of content authors. With the ability to build lab-specific images, the ability to reset broken labs, and deployment as simple as a file copy, the overall quality of the learning experience went up. Large events adopted this model, and we saw an explosion of new content. Virtual machines made it easy to put very complex software in front of a student.
Companies invested in rich UI experiences, simplifying the ability to access libraries of hundreds of locally installed labs. Our own company reinvented the way labs were delivered at conferences with the creation of our holLaunchPad platform.
But it was not all without challenge. For all the good brought on by the move to virtual machines, there were some big problems. First, how do you license them? A virtual machine (VM) is portable and easily copied, so students were walking away with production software. Evaluation editions expire, so labs simply stopped working. Phone activation? Remember that?
What’s more, as people got creative with content, size and complexity went up too. 32GB of RAM on a desktop? Those systems are not cheap. Download 200GB the night before a class? Want to keep five classes on a student system? Now you need a lot of disk. The investment in hardware quickly offset the savings in setup time.
Then there’s the issue that not everything can run in a VM. Nested virtualisation was difficult, slow, and buggy until very recently.
The race was on amongst lab developers to create the best virtual learning environments. Who could make the smallest, fastest, quickest VMs that were easy to redistribute and just worked well? A handful of authors excelled at this and quickly became the go-to choices for creating virtual environments. I can still list most of these folks by name and the products they specialised in. Some are still around, but most have faded away and moved on to other things.
It was inevitable that the required investment in desktop hardware would soon drive the industry to look for more cost-effective hardware-sharing solutions. Enter the age of the hosted lab. Companies specialised in putting lab environments on central servers, whether in their own datacentres, at conventions, or inside individual companies. Instead of investing in hardware, training companies could rent space with hosting providers and use simple browser-based clients to access and manage virtual machines. Farms of servers ran hundreds of students at the same time, from datacentres around the globe.
Two platforms, the holSystems Platform and the Lab on Demand Platform, dominated the industry: one in the event and conference space, the other in the commercial training space. It was from the combination of these two platforms that the OneLearn Platform was born.
This entire hosted lab architecture is built on the premise that you are learning some piece of software that is installed and configured in an operating system, and that operating system runs on a piece of hardware you own and manage. The operating system is the container for what you learn. The virtual machine is the delivery vehicle for that container.
Think about it. Exchange. SQL. Linux. Office. Windows. C#. Java. Every one of those technologies has traditionally required a server and/or a workstation with locally installed software.
This entire industry, with its billing models, service models, and software platforms, was built around the notion of replicating something you would put in your own office or datacentre, using virtual machines and centralised hardware.
Software today is no longer something you fundamentally own. It is something to which you subscribe. Software as a service is the notion of paying for software as you use it, only when you use it, and only for how much you use it. Upgrades are automatic. Current versions are assumed. Change is constant. On top of this, much of this software has moved from being desktop-based to being browser-based. Our once-powerful laptops, with tons of disk space and lots of RAM, can be replaced with web browsers. Office 365. QuickBooks Online. Salesforce. CRM. Even real-time communications products are based on the notion of having a subscription, an account, and consuming the software in real time, from a browser.
If you no longer have to install, deploy, or secure something in the traditional way, by first doing the same with an operating system, then how do you learn it? Do virtual machines and the platforms we have created to support them still have a role when we no longer need an operating system to contain the software?
If the software we consume is no longer contained within the context of an operating system, where do we find it? The answer is simple: we find it in the cloud. Cloud providers are replacing the operating system we traditionally install on hardware with one that spans billions of dollars in hardware, hundreds of datacentres, and enough cable to go to the sun and back. Amazon, Azure, Google: they all operate on the premise that you tell them which parts of an operating system your application needs, and they will provide you with exactly that and nothing more.
For example, do you need a website? The cloud will give you a web server and some storage, but hide the operating system from you. You don’t have to worry about the tasks of installing, administering, securing, and maintaining what an entire ITPro industry was built around back in the early part of the century.
This changes how we think about hands-on training. If the unit we deploy and manage to put software in front of a user is no longer a VM, then what is it?
Cloud slice – the next container for hands-on learning
A cloud slice is exactly what it sounds like: take a cloud provider, carve out a small slice of its capacity in a highly managed way, and serve it to a user so they can learn. The mechanics vary from provider to provider; in Microsoft Azure, the slice is a resource group. Regardless of the term, the concept is the same: a service provider, upon request, reaches into a cloud provider and gives you the pieces you need to learn, in a highly controlled way, shielding you from all the complexities of getting started.
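In Azure terms, the carve-out can be sketched with the Azure CLI. This is a hedged illustration only, not the platform’s actual provisioning code; the group name, student account, and subscription ID are hypothetical placeholders.

```shell
# Hypothetical sketch: a cloud slice as an Azure resource group.
# Create the resource group that will contain the student's slice.
az group create --name lab-slice-001 --location eastus

# Delegate access: the student can create and manage resources
# inside this group only, and sees nothing else in the subscription.
az role assignment create \
  --assignee student001@example.com \
  --role Contributor \
  --scope "/subscriptions/<subscription-id>/resourceGroups/lab-slice-001"
```

Scoping the role assignment to the resource group, rather than the subscription, is what makes the slice a slice: the student gets real Azure, but only their piece of it.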
The notion of cloud slicing is core to Lab on Demand. We have been building our cloud slice technology and expertise for approximately three years, dating back to 2014, when we ran our first event lab deliveries on Azure. We are focused on delivering this on Azure first, followed by AWS, and then Google Cloud. To that end, we include cloud slicing as part of our Azure Virtual Datacenter strategy.
Here is where the real power of Lab on Demand comes to bear. A lab in which the user is expected to use a cloud platform such as Azure might require the user to have an account, log on to a subscription, and create a series of objects. They then typically administer those objects through the Azure portal or with tools such as those installed on a creator’s workstation.
Lab on Demand understands the notion of a “lab without a virtual machine” in which the content of the lab is a cloud slice. A cloud slice might contain any number of the following: a set of pre-created user credentials with delegated access; a JSON based ARM template describing what is needed to start the lab; a PowerShell or CLI script to configure the lab environment; or a scoring script to run when completed.
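To make the ARM-template piece concrete, a minimal template for such a slice might look like the following. This is a generic sketch (a single storage account with a parameterised name), not an actual Lab on Demand artefact:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2016-01-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```

Deploying a template like this into the slice’s resource group (for example with `az group deployment create`) hands the student a pre-wired starting point without ever exposing the rest of the subscription.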
Interestingly, cloud slices can be combined with Azure as a Fabric to provide 100% cloud-only lab experiences, pairing a cloud slice with a managed VM creator workstation.
The biggest business risk in the world of cloud slicing is that users can create things which cost a lot of money, with those charges landing on the provider’s credit card. Solving this problem is key to providing a robust cloud slicing service. Lab on Demand incorporates a set of technologies that allow customers to safely provide students with slices of Azure without the worry of unexpected charges. This set of controls all but eliminates the need for users to create trial accounts, create Microsoft IDs, redeem Azure Passes, and perform many of the initial setup tasks that plague the ‘Monday morning’ of a typical training class. It brings the getting-started experience of the cloud on par with that of a virtual machine, eliminating the last barrier to broad adoption of the cloud slice as the de facto content container of the future.
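One Azure-native mechanism that can underpin this kind of guardrail is Azure Policy. The rule below is an illustrative sketch, not a description of Lab on Demand’s internal controls: assigned at the resource group scope, it denies the creation of any resource type outside a small whitelist, capping what a student can create and therefore what a slice can cost.

```json
{
  "if": {
    "not": {
      "field": "type",
      "in": [
        "Microsoft.Storage/storageAccounts",
        "Microsoft.Web/sites"
      ]
    }
  },
  "then": {
    "effect": "deny"
  }
}
```

Layering a policy like this on top of a scoped role assignment means the expensive resource types (large VMs, for instance) simply cannot be provisioned from inside the slice.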
The industry is ready to transform. This transformation will take place over the next two years as cloud capability matures and cloud training providers innovate, introducing features and capabilities designed to simplify and commoditise learning the cloud.
Learn on Demand Systems is ready. We have invested heavily in this area and are committed to leading the industry: offering the richest and most relevant features, setting the patterns and design of cloud slice training, and providing the same “simplification of consumption” that we brought to the world of virtual machines.
We are ready. Are you?