Radeon HD Cards Suffer From Spiky Frame Latency

While doing a review comparing Nvidia's GeForce GTX 660 Ti to AMD's Radeon HD 7950, TechReport reviewers were puzzled to find that although both cards had comparable FPS (frames per second), the Radeon HD 7950 felt much jerkier than the GTX 660 Ti.

After closer inspection, TechReport's experts found that this jitteriness is due to the Radeon HD's uneven frame latency. Simply put, frame latency is the time it takes the graphics card to produce a single frame. Ideally, frame latency should be nearly constant for a given frame rate, but this doesn't seem to be the case with AMD's graphics cards.

After investigating the performance of the Radeon HD 7950 more closely, TechReport discovered that the card outputs a high-latency frame every few low-latency ones, in effect producing the visual equivalent of stuttering or jitteriness while maintaining a high average frames per second.

Such frame-latency spikes are not visible even in minimum-FPS figures, because they are "smoothed out" by being averaged over a full second. Nonetheless, the jitteriness they produce can be felt by most people.
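To see why averaging hides the problem, here is a minimal sketch (the frame times below are made up for illustration, not TechReport's measurements) showing how a second of frames with a few nasty spikes still reports a healthy-looking average FPS:

```python
# Illustrative sketch: per-second FPS averaging hides frame-latency spikes.
# Frame times are in milliseconds; these values are invented, not measured.

frame_times_ms = [16.7] * 57 + [50.0] * 3  # 57 smooth frames plus 3 spikes, ~1.1 s total

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s    # the number most benchmarks report
worst_ms = max(frame_times_ms)             # the number your eyes actually notice

print(f"average FPS:      {avg_fps:.1f}")   # still looks like a fluid ~54 FPS
print(f"worst frame time: {worst_ms:.1f} ms")  # a 50 ms frame feels like 20 FPS
```

The average stays in the mid-50s even though three frames each took three times as long as the rest, which is exactly the stutter the reviewers were describing.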

When contacted for clarification, an AMD representative said that the report had "raised some alarms" at the company and that they hoped to have some answers before the holidays.

Maybe it is about time minimum, maximum, and average frame latencies became part of all new graphics card benchmarks. What do you think? If you have a Radeon HD card, please tell us whether you've felt the jerkiness described here.





So that explains it. I was playing XCOM on a GTX 560 Ti and it was so smooth. After upgrading to an HD 7970 I was getting micro-stutters during scrolling and panning, which really annoyed the hell out of me, so I thought it was just Nvidia drivers being better. In the end I sold the 7970 and purchased a GTX 680, and it's been smooth sailing. Great article, as it helps bring the problem — frame latency — to light.


I'm gonna have to agree with the Win8 comment at the top: they probably did the benchmark on Win8 and are blaming the card when it's the POS OS. I've got a 5-6 year old HD 4850, and I have never had a single problem with it working as it should. There was a factory defect in the cooling, but that's Sapphire's fault, not ATI's.


The article talks about latency problems with the drivers. I can only surmise it must be related to DX11.1 — that, and the fact that nothing even supports it. Dunno, guess I was just taking a shot at it. What the article doesn't mention is which drivers they used for the benchmark.


The hardware review sites I read do min/max/avg FPS with respective graphs, and have done so for as long as I can remember. I wouldn't read a hardware review without this comparison, along with a very in-depth description of the testing hardware configuration. I've never noticed "jerkiness" in any AMD graphics card I have owned, unless of course I push them too hard.

I am inclined to believe TechReport has monitors with refresh rates of 60 Hz, thereby losing some frames above 60 FPS. They clearly stated in their review that the AMD card had more overall frames rendered. I would describe the effect of losing rendered frames on the output signal as jerkiness. Remember, HDMI is limited to 60 Hz. Good monitors can output their maximum resolution at 75 Hz or more through DVI. Great monitors do 100 Hz and above. They can be overclocked with some work too, which you would only need if you are running 3 or 4 cards in parallel.


Maybe you should read the article a little closer before posting a wall of text? "Maybe it is about time minimum, maximum and average FRAME LATENCIES..." Not min, max, avg FPS. Most reviewers don't cover this. Also, a lot of people on many different forums have complained about this.


I've noticed it with every Radeon card I owned, BUT also with every modern game. I just thought it was an artifact of modern games. There's so much streamed from the hard drive these days (instead of loading complete levels) — they stream animations, textures, geometry, everything — so as soon as one thing gets a hitch you're going to get a spike.


Correction: An HDMI connection can be either single-link (type A/C) or dual-link (type B), and can have a video pixel rate of 25 MHz to 340 MHz (single-link) or 25 MHz to 680 MHz (dual-link).
