
Remote End User Experience Benchmarking for Windows Server 2016 Remote Desktop Services

September 29, 2016

We’ve got something cool to show you at Microsoft Ignite. If you’re there, make sure you see “Get an Independent Insider’s View of Desktop Virtualization and Session Remoting” (BRK3280) on Friday morning. I think you’ll find it worth your while, as we’re showing how we can empirically measure the remote end user experience on Windows Server 2016. Using tools that we created, Benny Tritsch, Ruben Spruijt and I benchmarked Remote Desktop Services in Windows Server 2016 to learn about the performance and application compatibility advances in this version of Windows Server. Using the same tools, we can show you the results. See for yourself how Windows Server 2016 Remote Desktop Services performs when configured for WARP, RemoteFX and the new Discrete Device Assignment (DDA), and check out the performance of the new Azure N-Series VMs configured with DDA.

Why Test on Windows Server 2016?

There are several new features and capabilities in RDS in Windows Server 2016. Discrete Device Assignment (DDA) is brand new, and RemoteFX also gains some new capabilities:

  • Support for OpenGL 4.4 & OpenCL 1.1
  • Up to 1GB dedicated VRAM can be assigned to a VM guest
  • Up to 4K resolution (up from 2560×1600 in 2012 R2)
  • Guest VMs can now run Server OS

We wanted to see how the improvements in RemoteFX and the addition of DDA affected the remote end user experience – after all, this is what makes or breaks an RDS deployment.

Background: Improving Remote End User Experience with GPUs

Windows Server 2016 handles the graphics needs of virtual desktops and remote sessions in one of three ways (a quick way to check which mode a guest VM actually sees is sketched just after this list):

  • WARP (Windows Advanced Rasterization Platform): There is no physical GPU involved; the hypervisor’s CPU emulates a GPU in software, so applications think a GPU is present. All VMs on the hypervisor share the host CPU. WARP is the default configuration for RD Session Host server VMs.
  • RemoteFX: The VMs on the hypervisor share the physical GPU(s). The hypervisor uses the GPU’s IHV (Independent Hardware Vendor) graphics driver and maps the GPU to the VMs through the Hyper-V shared graphics framework.
  • DDA (Discrete Device Assignment): Each VM gets a direct mapping to its own GPU using the IHV graphics driver inside the guest. VMs do not share GPUs. For an RD Session Host server, all user sessions share the GPU assigned to that RD Session Host.
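If you want a rough sense of which of these modes a guest VM ended up with, the display adapter name the guest reports is a decent clue: RemoteFX exposes a “Microsoft RemoteFX Graphics Device”, DDA exposes the physical card by its own name, and a WARP-only session falls back to Microsoft’s software adapters. Here is a minimal sketch (not part of REX Tracker) that queries the adapter names via WMI from inside the guest; the name-matching strings are assumptions and may need adjusting for your GPU model and driver version:

```python
# check_gpu_mode.py - rough guess at the graphics mode a guest VM is using.
# Run inside a Windows guest. The adapter-name matching below is an assumption
# and may need tweaking for your GPU model and driver version.
import subprocess

def video_controllers():
    """Return the display adapter names reported by WMI."""
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "Name"],
        capture_output=True, text=True, check=True
    ).stdout
    # First line is the "Name" header; the rest are adapter names.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

def guess_mode(names):
    joined = " ".join(names).lower()
    if "remotefx" in joined:
        return "RemoteFX vGPU"
    if any(vendor in joined for vendor in ("nvidia", "amd", "firepro", "tesla")):
        return "DDA (physical GPU passed through)"
    return "No GPU visible - likely WARP software rendering"

if __name__ == "__main__":
    adapters = video_controllers()
    print("Adapters:", adapters)
    print("Best guess:", guess_mode(adapters))
```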

[Image: gpu-types – the three GPU configuration options: WARP, RemoteFX and DDA]

Remote End User Experience Benchmarking with REX Tracker

To test these new capabilities in Windows Server 2016, we used our Remote End User Experience (REX) benchmarking system, dubbed “REX Tracker”. (REX Tracker is still under development and not yet publicly available, but it will be in the near future, so keep checking back here for updates!)

Using REX Tracker, we create different deployment scenarios (on-premises, WAN, cloud and hybrids) and add in common network environment stressors (e.g., latency or packet loss). We then run test sequences that simulate typical user workloads using pre-installed applications and application files. Each sequence runs a program that exercises a specific graphics format such as OpenGL, DirectX or HTML5.
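REX Tracker itself isn’t public yet, but the basic shape of a scripted test sequence is easy to picture. The sketch below is a simplified, hypothetical example (not our actual tooling): it launches a workload application that runs to completion, times each pass, and writes the timings to a CSV so they can later be compared across scenarios. The workload path and run count are placeholders:

```python
# run_sequence.py - simplified, hypothetical test-sequence runner (not REX Tracker).
import csv
import subprocess
import time

WORKLOAD = r"C:\RexDemo\dx10_benchmark.exe"   # hypothetical path to a graphics workload
RUNS = 5                                      # arbitrary number of passes

def run_once(run_id):
    """Launch the workload and record how long it takes to run to completion."""
    start = time.perf_counter()
    subprocess.run([WORKLOAD], check=True)
    elapsed = time.perf_counter() - start
    return {"run": run_id, "elapsed_s": round(elapsed, 2)}

if __name__ == "__main__":
    results = [run_once(i) for i in range(1, RUNS + 1)]
    with open("sequence_results.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["run", "elapsed_s"])
        writer.writeheader()
        writer.writerows(results)
    print("Wrote", len(results), "runs to sequence_results.csv")
```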

Once the test sequences are complete, we collect all the resulting data (screen recordings, telemetry data, network data and response times) and compare it in a series of 4-quadrant displays. These displays show how each system handles a given graphics format under given conditions, from the end user’s perspective.
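To give a feel for the 4-quadrant idea (again, an illustration rather than REX Tracker output), per-run timing CSVs from four scenarios could be lined up in a 2×2 grid with matplotlib. The file names and scenario labels below are assumptions that match the hypothetical runner sketched above:

```python
# four_up.py - illustrative 2x2 comparison of per-run timings (not REX Tracker output).
import csv
import matplotlib.pyplot as plt

# Hypothetical result files, one per scenario.
SCENARIOS = {
    "WARP": "warp.csv",
    "RemoteFX": "remotefx.csv",
    "DDA on-premises": "dda_onprem.csv",
    "Azure N-Series + DDA": "azure_dda.csv",
}

def load_timings(path):
    """Read the elapsed_s column written by the sequence runner."""
    with open(path, newline="") as fh:
        return [float(row["elapsed_s"]) for row in csv.DictReader(fh)]

fig, axes = plt.subplots(2, 2, figsize=(10, 8), sharey=True)
for ax, (label, path) in zip(axes.flat, SCENARIOS.items()):
    timings = load_timings(path)
    ax.bar(range(1, len(timings) + 1), timings)
    ax.set_title(label)
    ax.set_xlabel("Run")
    ax.set_ylabel("Elapsed (s)")
fig.suptitle("Workload run time per scenario")
fig.tight_layout()
plt.show()
```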

REX Tracker can show several experiences, such as:

  • What the end user experience is like for a user running OpenGL applications on an Azure N-Series VM configured with DDA. Does an added 50ms of latency change this user experience? What about 150ms of latency? (A simple way to sanity-check the latency the client actually sees is sketched after this list.)
  • How HTML5, Flash or DX10 applications behave on Windows Server 2016 virtualized guest VMs using RemoteFX.
  • How the end user experience using Azure N-Series VMs and DDA compares to using bare metal.
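For the latency questions above, it helps to confirm what the client is actually seeing on the wire before judging the experience. One rough way to spot-check this (outside of REX Tracker) is to time TCP connections from the client to the session host’s RDP port; connect time is only a coarse proxy for round-trip latency, and the host name below is a placeholder:

```python
# rtt_check.py - rough check of client-to-host latency before a test pass.
import socket
import statistics
import time

HOST = "rdsh.example.com"   # placeholder: your RD Session Host or Azure VM
PORT = 3389                 # default RDP port
SAMPLES = 10

def connect_time_ms():
    """Time one TCP connection to the host as a coarse latency estimate."""
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000

samples = [connect_time_ms() for _ in range(SAMPLES)]
print(f"median connect time: {statistics.median(samples):.1f} ms "
      f"(min {min(samples):.1f}, max {max(samples):.1f})")
```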

For this test run, we concentrated specifically on graphics performance in these deployment scenarios:

  • On-premises deployment using RemoteFX (we ran RD Virtualization Hosts, not RD Session Hosts, but RemoteFX is fully supported in both scenarios)
  • On-premises deployment using DDA (Discrete Device Assignment)
  • Azure N-Series VM configured with DDA

Here’s a Teaser

Be sure to see Benny’s Ignite session Friday morning, in person or online. After Ignite is over we will be releasing our results and test methodology in a series of blog posts here, so check back for more. For now, here is a four-up comparison of an Azure N-Series VM configured to use DDA, running a DX10 application as latency is introduced. We also compare it to bare-metal performance:

 
