This is the fourth instalment in the Building Networks for the Future series, written by Stuart Wilkie at Plymouth Academy. Stuart takes you through the stages of the new laptop deployment and the route he took to give every user the same, consistent experience throughout the Academy.
So, in the earlier parts of this blog series we covered your “traditional” ICT suite machine and how virtualisation has the power to improve your server system. We also touched on how you can virtualise applications using the App-V framework to add further flexibility to your desktop deployment.
Thinking right back to the first article, where we were planning what to do, the decision was made to deploy new laptops (kindly provided by Stone) to remove the need for classroom “teacher computers”. This helped in one way, as it gave us a good quantity of legacy equipment. The problem was that, although some of it was “good specification” legacy, it was still legacy, and the last thing we wanted was a split Windows XP/Windows 7 estate; after all, XP is coming to the end of its supported life.
“Consistency was one of the big changes I wanted to make – to unify the experience users had, no matter where on the system they were”.
The answer came from discussions through the TechNet membership held by the Academy, and earlier “beta” work that had been done. Because of these links with Microsoft, we had access to a test programme for a new product called Windows Thin PC. This was previously known as Windows Fundamentals for Legacy PCs, when it was essentially a cut-down version of Windows XP. The new version is based on Windows 7, which is ideal as it maintains the same look and feel and contains all the same core features, crucially including support for domain joining and Group Policy. It is cut down and limited in its capabilities, so you can't use it as a true standalone operating system. Instead, it is designed to “connect” to something else, such as Citrix or Terminal Services.
Terminal Services, now called “Remote Desktop Services” (RDS), is not new technology. In fact, neither is thin client computing! What is new is using either of them in schools for anything other than server administration by techies. RDS has been part of Windows Server since the NT days, when it was an extra install; since Server 2003 it has simply been a “role” you can choose. The Server 2008 R2 version, though, adds a whole heap of extra functionality and changes the playing field in terms of deployment and scalability.
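If you prefer the command line to Server Manager's wizard, the RDS role services can be added with PowerShell on Server 2008 R2. This is only a sketch: the feature names below are the standard ones from the ServerManager module, but check `Get-WindowsFeature` on your own box before running anything, as available role services vary by edition.

```powershell
# Load the ServerManager module (built in on Server 2008 R2)
Import-Module ServerManager

# List the Remote Desktop Services role services available on this server
Get-WindowsFeature RDS*

# Install the Session Host role service (where the programs actually run);
# -Restart reboots automatically if the install requires it
Add-WindowsFeature RDS-RD-Server -Restart
```

The other pieces of the deployment (Connection Broker, Web Access, Licensing) are separate role services added the same way, which is what makes it easy to spread them across several servers.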
RDS is designed to be split out into its component parts and spread across a number of servers. Teamed with the virtualisation power of Hyper-V (see the earlier article), you have something truly scalable. You split out the hosting (where all the programs run) from the web access (yes, you can do that too, but more on that later) and the “brokering” (deciding who connects where). Licensing is also handled as a separate role service. A typical deployment splits the roles in exactly this way, and it is not too far from what we did. Leveraging Hyper-V to host the Remote Desktop Session Hosts (in fact, pretty much all of the system) has two significant benefits. One is the snapshotting feature built into Hyper-V, which is an obvious backup route. The second is the way you can let Hyper-V manage memory usage. Dynamic Memory is a killer feature, allowing the hosted OS to “claim” more RAM as it needs it and release it when it doesn't. This is ideal for a varying workload such as RDS.
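To give a feel for the Dynamic Memory settings involved, here is a hedged sketch. On 2008 R2 SP1 you configure this through the Hyper-V Manager GUI; the `Set-VMMemory` cmdlet shown below arrived with the built-in Hyper-V PowerShell module in Server 2012, and the VM name and memory figures are illustrative, not our actual values.

```powershell
# Sketch only: enable Dynamic Memory on a Session Host VM so it can claim
# RAM under load and release it again when quiet (requires the Hyper-V
# module from Server 2012 onwards; "RDSH-01" is a made-up VM name)
Set-VMMemory -VMName "RDSH-01" `
    -DynamicMemoryEnabled $true `
    -MinimumBytes 1GB `    # floor the VM can shrink to when idle
    -StartupBytes 2GB `    # what it boots with
    -MaximumBytes 8GB      # ceiling it can claim under heavy RDS load
```

The startup figure should cover a freshly booted Session Host; the maximum is what matters for busy lessons, when dozens of sessions pile on at once.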
Now we’ve done a quick overview, let’s deep dive into some of the setup. The basic Windows Thin PC and Session Host build is obvious from earlier posts: you can just let SCCM (System Center Configuration Manager) deal with it. It will do the OS install and drop our basic application set on as well. Even the App-V “bubble” installations work on Remote Desktop Servers; there is a special App-V for Remote Desktop Services installer on the Microsoft Download Center. The power of App-V is even more attractive on RDS, where your OS is 64-bit only. Because you are separating the application from the OS, compatibility and stability are much improved with frequently troublesome education applications!
The Session Host itself essentially provides a full desktop, delivered to the user through the Remote Desktop client on whatever hardware they are connecting from. More than that, the new version can also do some clever things with RemoteApp. We will come back to that in the next instalment of this series.
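From the client side, the connection itself is nothing exotic. A sketch of what a Thin PC client actually runs, with a made-up server name standing in for the real Session Host (or the broker's farm name, in a multi-server deployment):

```powershell
# Launch the Remote Desktop client full-screen against the Session Host;
# "rds.academy.local" is an example hostname, not our real one
mstsc /v:rds.academy.local /f

# Alternatively, distribute a saved .rdp file containing settings such as:
#   full address:s:rds.academy.local
#   screen mode id:i:2
# so the shell on the Thin PC can launch straight into the session
```

In practice you point the Thin PC at a saved connection so users never see the client's own dialog, which keeps the experience consistent with a local desktop.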
If you have not yet read any of the previous posts from Stuart or would just like a recap, here they are –