Sunday, April 13, 2014

Why do high-end AMD GPUs and Second Life not match?

I have had too many bad experiences with my premium AMD GPU over the last months. Since 2012 I have owned a Radeon HD 7970 GHz Edition, which currently sells for $325. Under DirectX on Windows, the Radeon outperforms a GeForce GTX 680. When it comes to performance, the Radeon HD 7970 is generally the Porsche among graphics cards.

With my Radeon I can run all of the newest 2014 games in ultra mode with every possible gimmick switched on, and they consistently give me a framerate of 35 frames per second (fps) or higher. It is a delight to play with. But when I log into Second Life, this "Porsche" of a graphics card deteriorates into a lower-end "compact car". At least it was still a "compact car" half a year ago: I was getting around 20 fps with shadows enabled on highly detailed sims like Madcity or Escapades, with the draw distance reduced to 128 meters. That was, by the way, equivalent to the performance of NVIDIA cards that cost around $75 and are several categories less powerful than my Radeon HD.

But the story gets worse! Over the last months the performance has deteriorated from a "compact car" to a "clunker". Where I used to get a framerate of 20 fps, I now get 10 fps with the exact same settings. I now either have to switch off shadows, or I need to overclock my GPU AND my CPU (an Intel i7-2700K @ 3.5 GHz with 16 GB RAM) to their absolute limits and reduce my draw distance to 64 meters. Only with all of these measures am I back at around 20 fps. And the low draw distance really hurts.

So shouldn't Linden Lab remove AMD graphics cards from their System Requirements page, or at least add a note saying that those cards will run with significantly lower performance than could be expected? Is it fair to list both GPU vendors equally in the System Requirements?

Let me illustrate this with an analogy: imagine an ice cream parlor with a sign outside the store saying "We accept cash and credit cards". Only after ordering do the people who paid with cash (AMD) realize that they get just a third of the ice cream that those who paid by credit card (NVIDIA) get, for the same money. Would that be acceptable? No! The least the parlor could do would be to write on the sign: "We accept cash and credit cards, but with cash you will only get a third of the value for your money." Then people would know what they are getting if they only have cash with them.

Strangely, I couldn't find much information on the internet about the recent, extremely poor AMD performance. There are a few forum posts (1, 2, 3), but they do not make clear how big the performance gap really is. Second Life fanboys and fangirls seem to prefer putting the blame on AMD, while gamers in an interesting discussion blame Linden Lab for using a legacy OpenGL codebase that is no longer actively maintained and that neither company works on anymore; back when it was still in development, NVIDIA had the better-optimized implementation. Second Life is also poorly optimized in that it still depends heavily on single-thread performance, while GPU makers are sacrificing single-thread performance for multithreaded performance. In that discussion it is also said that no user should expect AMD to waste their time optimizing their drivers for that "garbage".

So is it the fault of AMD or of Linden Lab that my graphics card is performing so poorly? And why is the performance still deteriorating over the last months? There might be some hope, because AMD has reportedly resumed its OpenGL work now that Valve is opening up options on Linux. AMD published an OpenGL 4.3 driver in July 2013, but I am not seeing any positive results so far. I am starting to get very frustrated.
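One way to at least check whether that newer driver is actually the one in use is to ask OpenGL directly what it reports. The following is a minimal sketch, assuming GLFW 3 is installed (the build command and program are my own illustration, not anything from the viewer); it creates a hidden OpenGL context and prints the vendor, renderer, and version strings exposed by the installed driver.

```cpp
// Minimal sketch: query what the installed OpenGL driver reports.
// Assumes GLFW 3 is available; build with e.g.
//   g++ glinfo.cpp -lglfw -lGL   (Linux)
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) {
        std::fprintf(stderr, "GLFW initialization failed\n");
        return 1;
    }
    glfwWindowHint(GLFW_VISIBLE, GL_FALSE);   // we only need a context, not a window
    GLFWwindow* window = glfwCreateWindow(64, 64, "gl-info", NULL, NULL);
    if (!window) {
        std::fprintf(stderr, "Could not create an OpenGL context\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    // These strings identify the driver actually in use, e.g. whether the
    // new AMD OpenGL 4.3 driver has replaced an older one.
    std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```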

4 comments:

  1. For one thing, there is code in the viewer that deliberately de-rates the texture memory size by 75% if an AMD chip is detected, because "AMD texture memory doesn't perform well". (See the sketch below the comments.)

    1. Wait, say that again. There is DELIBERATE code, in the base viewer, that DEGRADES user experience if a certain chip is detected? Why on earth would any company DO that?!?

  2. This is interesting. Texture memory is limited to 512 MB in SL anyway, am I correct? If it is de-rated by 75%, that would mean an AMD GPU runs on only 128 MB of texture memory? That would be insane, since my GPU, for example, supports 3 GB of texture memory, and even the 512 MB cap seems like an unnecessary obstacle to me. Do you have any further sources that refer to this?

  3. The Lab has NEVER played well with AMD (I have an old but still good 6950). I will say it is better than it was a few years ago. I will most likely switch brands on the next computer. It is sometimes a pain. I honestly think that it is somewhat unfair to take readings at Mad City as that (and other Mad Pea sims) has always been slower than molasses. LOL

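Regarding the de-rating mentioned in the first comment: I have not verified the viewer source myself, so the following is purely a hypothetical sketch of the kind of vendor check being described, with an illustrative function name and vendor strings. It only shows how a 75% de-rate would turn a 512 MB budget into 128 MB when an ATI/AMD vendor string is detected.

```cpp
// Hypothetical illustration only -- NOT the actual Second Life viewer code.
// Sketch of the vendor-based de-rating described in the first comment:
// if the GL vendor string looks like ATI/AMD, keep only 25% of the
// texture memory budget (a 75% de-rate).
#include <cstdio>
#include <string>

int textureMemoryBudgetMB(const std::string& glVendor, int baseBudgetMB) {
    bool isAMD = glVendor.find("ATI") != std::string::npos ||
                 glVendor.find("AMD") != std::string::npos;
    return isAMD ? baseBudgetMB / 4 : baseBudgetMB;   // 512 MB -> 128 MB for AMD
}

int main() {
    std::printf("NVIDIA card: %d MB\n", textureMemoryBudgetMB("NVIDIA Corporation", 512));
    std::printf("AMD card:    %d MB\n", textureMemoryBudgetMB("ATI Technologies Inc.", 512));
    return 0;
}
```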