This is the fourth instalment in the Building Networks for the Future series, written with Stuart Wilkie from Marine Academy Plymouth. Stuart takes us through the use of a technology that has been used in industry for years, and is now making an impact in the education sector.
In the earlier parts of this blog series we covered the upgrade of "traditional" ICT suite machines, and how virtualisation has the power to improve your server system. We also touched on how you can virtualise applications using the App-V framework, to add further flexibility to your desktop deployment.
Thinking right back to the first article, where we were planning what to do - the decision was made to deploy new laptops (kindly provided by Stone), to negate the need for classroom “teacher computers”. This gave us a good quantity of legacy equipment. The problem was that now, although we had some “good specification” legacy, it was still legacy - and the last thing we wanted was to have a split Windows XP/Windows 7 estate. After all, XP is coming to the end of its supported life.
“Consistency was one of the big changes I wanted to make – to unify the experience users had, no matter where on the system they were. We wanted the same look and feel, with the same program set, and settings that followed you”.
The answer came from discussions through the TechNet Membership held by the Academy, and earlier "Beta" work that had been done. Because of these links with Microsoft, a test programme for a new product called Windows Thin PC was suggested. This was previously known as Windows Fundamentals for Legacy PCs, when it was essentially a cut-down version of Windows XP. The new version is based on Windows 7 - ideal, as it maintains the same look and feel and contains all the same core features, including, crucially, support for domain joining and Group Policy. However, Thin PC is a cut-down system, limited in its capabilities - so you can't use it as a true standalone operating system. Instead, it is designed to "connect" to something else, such as Citrix or Terminal Services.
Terminal Services, now called "Remote Desktop Services" (RDS), is not new technology. In fact, neither is Thin Client! What is new is their use in schools for anything other than server administration by techies. RDS has been a part of the Windows Server system since the NT days, when it was an extra install. Since Server 2003 it has simply been a "role" that you can choose. The Server 2008 R2 version, though, adds a whole raft of extra functionality, and changes the playing field in terms of deployment and scalability.
RDS is designed to be split into its component parts and spread across a number of servers. Teamed with the virtualisation power of Hyper-V (see earlier article), you have something truly scalable. You split the hosting (where all the programs run) from the web accessibility (yes, you can do that too, but more on that later) and the "brokering" (who connects where). Licensing is also handled as a separate role. A typical implementation often looks as shown. Leveraging Hyper-V to host the Remote Desktop Session Hosts (in fact, pretty much all of the system) has two significant benefits. The first is snapshotting, a feature built into Hyper-V, which can serve as an obvious backup route. The second is letting Hyper-V manage memory usage. Dynamic Memory is a killer feature, allowing the hosted OS to "claim" more RAM as it needs it and release it when it doesn't - ideal for a varying workload such as RDS.
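To make the role split concrete, here is a sketch of how the roles might be installed on Server 2008 R2 using the ServerManager PowerShell module. The server split in the comments is illustrative, and it is worth confirming the exact feature names on your build with Get-WindowsFeature before relying on them.

```powershell
# Sketch only: installing the split RDS roles on Server 2008 R2.
# In a real farm each role typically lives on its own (virtual) server;
# the grouping below is just for illustration.
Import-Module ServerManager

# On each Session Host - where the users' programs actually run:
Add-WindowsFeature RDS-RD-Server

# On the broker - decides which user session goes to which host:
Add-WindowsFeature RDS-Connection-Broker

# On the web front end and the licensing server respectively:
Add-WindowsFeature RDS-Web-Access
Add-WindowsFeature RDS-Licensing
```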
Now we've done a quick overview, let's dive into some of the setup. The basic Windows Thin PC and Session Host part is obvious from earlier posts: you can just let SCCM (System Center Configuration Manager) deal with it. It will do the OS install, and drop our basic application set on as well. Even the App-V "bubble" installations work on Remote Desktop servers - there is a special App-V installation pack on the Microsoft Download Center. The power of App-V is even more attractive when combined with RDS. You are separating the application from the OS (which, don't forget, is 64-bit), so compatibility and stability are much improved - handy with frequently troublesome education applications!
What next? Well, it is time to sort out your "Broker" service. The broker deals with the "which user session connects to which server" question - when you have more than one Session Host. You are using more than one Session Host server, aren't you? If not, it is well worth running at least two, even on the smallest of deployments: you can then perform maintenance on one while your users carry on using the other. Rather than repeat a how-to guide on setting this up, here are a few great resources for this process…
You may be thinking: why do I need to worry about setting up a Web Access server? Well, the joy of Web Access is in its title! Just think of what a VLE is supposed to be - the ability for students (and staff) to collaborate and work from anywhere, at any time. Nothing says true anywhere, anytime like being able to log on and get the same desktop and application set from any internet-connected computer.
Assuming you have followed the above guides, you will now have a fully functional RDS farm. There are some "gotchas" here though. The first is naming: make sure that you have chosen a farm name - something like RDS-FARM - which is used for all client access, and that you have set up DNS round robin and the broker to use this name. The second is certificates. Certificates can be a bit of a pain if you are not careful. By default, when you set up a Session Host server, the connection certificate uses the server name. Because you are using a broker and a single farm name, your actual connection could go to any of the servers behind the farm name (which isn't an actual server). This causes a certificate warning to show on the client, which is a bit ugly. To fix it, you will need to set up a local certificate authority (CA) on your domain. This is really easy to do though - and here are two great guides to get you going…
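For the farm-name side of this, the round robin itself is just a set of identical A records, one per Session Host, all answering to the farm name. Here is a sketch using the dnscmd tool - the server, zone, farm name and addresses are all hypothetical placeholders for your own values.

```powershell
# Sketch: round-robin A records for a farm name. "DC01", "school.local",
# "RDS-FARM" and the addresses are hypothetical - substitute your own.
dnscmd.exe DC01 /RecordAdd school.local RDS-FARM A 10.0.0.21
dnscmd.exe DC01 /RecordAdd school.local RDS-FARM A 10.0.0.22
```

With both records in place, successive lookups of the farm name rotate between the two Session Hosts, and the broker then sorts out which existing session belongs where.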
Then, once you have done this, a little bit of tweaking is needed to get your new CA to dish out a certificate for each of your Session Host servers. The process is nicely covered in these articles, although you will find plenty more around the internet too.
Following these will save you a whole world of pain with user adoption. This is particularly relevant when dealing with RDS from Windows Thin PC or Windows 7. Of course, one of the key benefits of Windows Thin PC was the domain joining ability. With this, we could configure Single Sign On for the machines and Remote Desktop. What does this mean in plain English? When the user goes to the Thin Client, it looks exactly like a normal logon screen - and it is! They enter their username and password, and the system then automatically logs in to the Remote Desktop using the same details, without prompting them to enter anything again.
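Under the hood, Single Sign On rests on the "Allow Delegating Default Credentials" Group Policy being applied to the client machines. As a rough sketch, the registry values that policy writes look something like this - in practice you would set it through Group Policy rather than by hand:

```
Windows Registry Editor Version 5.00

; Sketch of the credential delegation settings behind Single Sign On.
; Normally written by the "Allow Delegating Default Credentials" GPO.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\CredentialsDelegation]
"AllowDefaultCredentials"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\CredentialsDelegation\AllowDefaultCredentials]
"1"="TERMSRV/*"
```

The TERMSRV/* entry is what allows the cached logon credentials to be handed on to any Remote Desktop server; you can scope it to specific server names if you prefer.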
To take this even further, how about the machine automatically launching the Remote Desktop straight after login, without ever showing the local desktop? Well, yes - this is exactly what Stuart has done. Back to the power of System Center for this one: part of the build of the machine runs a bit of script that changes the way the system starts, "replacing the Windows Explorer" shell. How do you do this? Well, the details can be found here...
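As an indication of what such a script ends up doing, the usual approach is to swap the Shell value that Winlogon reads, so the Remote Desktop client starts instead of Explorer. A sketch - the path to the .rdp file is a made-up example:

```
Windows Registry Editor Version 5.00

; Sketch: boot the Thin PC straight into the Remote Desktop client by
; replacing the shell. The .rdp file path is hypothetical.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"Shell"="mstsc.exe C:\\rdsfarm.rdp"
```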
That’s not the end of the story though. What do you do about giving access to printers - for example? Normally, you would assign printers by Room – but of course, the Remote Desktop farm has no way of knowing which room the users are in (well – not unless you do something fancy with the connecting machine name). Well – this is where the changes to Group Policy in R2 can help you out. Here is the link you need - http://www.edugeek.net/blogs/thescarfedone/1012-managing-printers-remote-desktop-environment.html
This will work nicely for your Windows or non-Windows machines, as all the processing happens on the server system. For the desktop machines, since they are Windows based, we can do something even simpler. Just as you would normally script the connection of printers at start-up or logon, the same approach still applies to these machines. Remote Desktop options in Group Policy let you control whether locally connected devices and printers are transferred and made available in the session. Perfect! This is also explained in the same article.
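A typical per-room logon script is only a few lines. Here is a sketch using the WScript.Network COM object from PowerShell, with a made-up print server and share name:

```powershell
# Sketch: map a room's printer at logon and make it the default.
# "\\printserver\Room12-Laser" is a hypothetical share name.
$net = New-Object -ComObject WScript.Network
$net.AddWindowsPrinterConnection("\\printserver\Room12-Laser")
$net.SetDefaultPrinter("\\printserver\Room12-Laser")
```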
Of course – these same settings will allow your users connecting from anywhere to access their USB sticks and home printers through your Remote Desktop session too. Before the security conscious jump in – you do of course have control over this (for example what can be run) because all your usual Group Policies also apply in the connected session.
So - let's get back to the new feature: RemoteApp. What is it? Well… think about wanting to quickly run one single application from a Remote Desktop session, without getting the full desktop of the remote session - i.e., for it to look as if the program is running directly on your computer. Welcome to RemoteApp!
And how do you set it up? Well, that is remarkably simple. You need to have already completed all the previous Remote Desktop steps, including Session Host and Broker – and for off-site working, you also need Web Access to be fully functional.
Then, all the rest of the real work happens on your Session Hosts - and this Microsoft TechNet guide tells you all.
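For a flavour of what you end up with, a published RemoteApp boils down to a handful of settings in a .rdp file. The farm address and application alias below are illustrative examples, not values from the Academy's setup:

```
remoteapplicationmode:i:1
remoteapplicationprogram:s:||wordpad
remoteapplicationname:s:WordPad
full address:s:RDS-FARM.school.local
```

With remoteapplicationmode switched on, the client draws just the program's window rather than the full remote desktop.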
So there you have it - a complete overview, and pointers to how-tos, on Remote Desktop in both the traditional desktop user experience sense and the new-fangled single-application sense.
If you have not yet read any of the previous posts from Stuart or would just like a recap, here they are -