Hi,
I am a PhD student investigating how to deliver a low-latency, high-quality multimedia experience to remote clients. The research context is using cloud technology to store multimedia content (video, audio, games) and letting a user access and interact with that content remotely, with all rendering done on the cloud side. This would allow the content to be consumed on a thin client that would otherwise be incapable of displaying it.
For the last couple of days I have been experimenting with this concept by running a game on a server and accessing the server via RDC. The result so far is an unplayable game. The server is connected to the LAN via Gigabit Ethernet, and the laptop is on Wi-Fi with a strong signal. A couple of things made me curious about how RDC operates. The game appears to see a virtual GPU attached to the remote desktop session rather than the server's physical GPU. Is this normal? Furthermore, CPU utilisation by the RDC client on the laptop was very high, which made me assume that rendering is done on the client side. Is that the case? The documentation says that RemoteFX does the rendering on the server, so is the high client CPU utilisation just a consequence of the large amount of data being processed?
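To illustrate what I mean by the game seeing a virtual GPU: inside the remote session, the display adapters the session exposes can be enumerated with something like the following (a minimal Python/ctypes sketch around the Win32 EnumDisplayDevices call; the printed names are simply whatever Windows reports, I am not assuming any particular adapter is present):

```python
# Sketch: list the display adapters visible to the current Windows session,
# to check whether the RDP session reports a virtual adapter instead of the
# physical GPU. Run inside the remote session (Windows only).
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32

def list_adapters():
    i = 0
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        # DeviceString is the human-readable adapter name
        # (e.g. a physical GPU or a virtual/remote display adapter).
        print(f"{dev.DeviceName}: {dev.DeviceString}")
        i += 1
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)

if __name__ == "__main__":
    list_adapters()
```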
I am also experiencing an odd glitch that makes the mouse unusable in the game. Any mouse movement in the RDC client results in completely unpredictable, jittery motion of the pointer on the server. I do not have this problem when interacting with the server's desktop; it only appears when the game is active, which again makes me think the issue lies with the rendering path rather than with the mouse itself.
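If concrete numbers would help, a crude way to log what the server-side pointer does while I move the mouse in the client would be something like this (a Python/ctypes sketch polling GetCursorPos on the server; the 10 ms interval and 10 s duration are arbitrary choices on my part, not measured values):

```python
# Sketch: poll the cursor position on the server while the mouse is moved
# through the RDC client, and log the per-sample displacement so the jitter
# can be seen as numbers. Windows only.
import ctypes
import time
from ctypes import wintypes

user32 = ctypes.windll.user32

def log_cursor(duration_s=10.0, interval_s=0.01):
    pt = wintypes.POINT()
    user32.GetCursorPos(ctypes.byref(pt))
    prev = (pt.x, pt.y)
    end = time.time() + duration_s
    while time.time() < end:
        time.sleep(interval_s)
        user32.GetCursorPos(ctypes.byref(pt))
        dx, dy = pt.x - prev[0], pt.y - prev[1]
        if dx or dy:
            print(f"{time.time():.3f}  pos=({pt.x},{pt.y})  delta=({dx},{dy})")
        prev = (pt.x, pt.y)

if __name__ == "__main__":
    log_cursor()
```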
On the server side I am using an Intel Core i7-920 with 12 GB of RAM and an AMD Radeon HD 3870. The client is a laptop with an Intel Core i5-3612QM, integrated graphics, and 4 GB of RAM. The game I used is Unreal Tournament '99, because it is old and I thought it would not put a big load on the server.
Thanks in advance for any help.