AMD: DirectX Is Holding PC Graphics Back

Speaking to Bit-Tech, AMD GPU division worldwide developer relations manager Richard Huddy explained why PC games' graphics aren't much better than console graphics, even though the GPUs available for PCs are an order of magnitude more powerful than those found in consoles.

"It's funny, we often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good," he noted. "To a significant extent, that's because, one way or another, for good reasons and bad -- mostly good -- DirectX is getting in the way."

Huddy then explained that the overhead imposed by the DirectX implementation is the reason PC games are limited to 2,000 or 3,000 draw calls per frame, while the average console game can issue 10,000 to 20,000 draw calls per frame.

"Wrapping it up in a software layer gives you safety and security," he explained. "But it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate."

The problem with the DirectX API is that each draw call must be processed by the CPU, which translates it before passing it on to the GPU. This approach lets developers write their code once for the DirectX API, which handles translation to whatever instruction set the installed GPU uses. Console developers, on the other hand, can avoid all translation overhead by addressing the console's GPU directly, since it never changes.
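The draw-call budgets quoted above can be sketched with simple arithmetic. The following is a purely illustrative Python model; the per-call overhead figures are invented assumptions for the example, not measured DirectX or console numbers:

```python
# Illustrative only: estimate how many draw calls fit into one frame
# if each call burns a fixed amount of CPU time in the API/driver layer.

def max_draw_calls(frame_budget_ms, per_call_overhead_us):
    """Upper bound on draw calls if the CPU did nothing else all frame."""
    return int(frame_budget_ms * 1000 / per_call_overhead_us)

frame_ms = 1000 / 30  # a 30 fps frame budget, about 33.3 ms

# Hypothetical per-call costs: a thick, validating PC API stack
# versus a thin console layer that talks to one fixed GPU.
pc_calls = max_draw_calls(frame_ms, per_call_overhead_us=10)
console_calls = max_draw_calls(frame_ms, per_call_overhead_us=2)

print(pc_calls, console_calls)
```

With these made-up overheads the PC side lands in the low thousands of calls and the console side in the tens of thousands, which is roughly the shape of the gap Huddy describes; real costs vary by driver, CPU, and engine.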

3D cards and APIs: more competition, better products

I remember the time when there were several GPU makers and each had its own API (3dfx Glide, S3, NVIDIA Riva, Matrox, SGI's OpenGL, and Microsoft's DirectX). I miss that; it seemed the more competition we had, the better the products we received. I am strictly a PC gamer, but I don't have anything against console gamers; I welcome them into the gaming community as brothers. And I have always been of the camp that would gladly give up a few FPS to make my games look better.

DirectX

This statement is rubbish, as everyone knows that consoles are still based on the same DX tech that PCs run on. The only difference is that the PS3 and Xbox 360 run on DirectX 9.0c while PCs run DirectX 11, giving PCs more visual and processing power.

I am a PC and console player, and I find that PCs have the better graphics by far. If developers concentrated on PC games they would win hands down; the reason they don't is that they have to keep PC games backwards compatible with DX9.0c for older PCs, and of course port to consoles.

That, in turn, keeps PC gaming from taking full advantage of the better DX.

Wii and PS3 don't use DX as

Wii and PS3 don't use DX, as they're not made by Microsoft, and DX on the Xbox is a stripped-down version that supports only the Xbox hardware, thereby removing a lot of the overhead. Clearly, from what you say, you don't really know just how significant the overhead is in Windows and DirectX.


Agreed, he and most of the people posting here don't seem to know what they're talking about.

Some suggested a gaming OS; that would solve nothing. The whole problem is how many different GPUs we have. As long as we can choose which components we want in our PCs, this performance issue will remain.

DirectX isn't evil, though. If you ask any PC dev whether he'd rather work with DirectX or code directly for GPUs, the answer will be a resounding 'DirectX!', because he can code graphics once instead of coding them for every major GPU model out there.

The guy in the article is just an ass who doesn't care about the people who use his tech to make things. He just wants shinier graphics to show off his products.

Looks like none of you has any knowledge of DirectX or games

I am a games developer, an indie developer to be precise...

First of all, in case you're forgetting, DirectX 11 (DX11) is DirectX. Windows' own graphics system is built on DirectX. OpenGL is mostly found in applications rather than games; however, if you were an indie games developer for the PC, you would find DirectX an excellent place to start. It has extremely powerful functions and a magnificent graphics architecture...

I don't want to do what crappy ATI does and point the finger... but if they can't recognize their own poor developer work, then please, for the love of games, ignore this guy's BS and move on...

And FYI, console games look good and perform well because they are designed for a select few, if not just one, resolution (480i/p, 720p, 1080i/p), so the textures can be optimized and accessed from a ready-to-go data bank. Bear that in mind before you're foolish enough to believe console games are better than PC games.

And remember this: console games perform just one task, whereas PC games, which far outsell console games, do much more in terms of functionality. They even manage to work well with a whole OS running in the background, an OS that is managing the memory being accessed, channeling your controls back and forth, and syncing the sound. Oh yes, consoles have that too (how do you think you can browse the web on your PS3? Remember, these console OSes are not designed to be full OSes, just a backbone).
If your PC is crap and costs less than a console, then don't expect a great gaming experience.

argh end of.

I agree!

Get rid of DirectX and start designing games around OpenGL! OpenGL always looked better, with better lighting, and never crashed (unlike DirectX). For those of you who aren't up to speed, take a look at some DirectX vs. OpenGL screenshots, then make up your own mind.


I would, but I find OpenGL far too glitchy while working in 3ds Max, and I'm also hard pressed by the lack of detail I can get out of GL; meshes look like a blobby mess. Also, I have seen most of the screenshots, and they are nothing more than stills or renders. If you want a better understanding, look at videos showing the latest GL and DX side by side.


If you use 3ds Max to evaluate OpenGL, then you're relying on Autodesk's implementation and usage of said API.

You can only claim it's glitchy if you've used the OpenGL API yourself.


It will happen eventually: PC sales outnumber console sales, and game developers will eventually cater to the PC crowd again. EA has said the PC will be the biggest platform in the future. Soon it will be easier to pirate console games than PC games, and the developers know this. There are many PC-exclusive titles coming; IL-2 Cliffs of Dover is just one. I have stats to back this up if anyone disagrees or thinks differently, and they come straight from Intel. Also, soon the processor will be part of the copy protection. Good luck getting around that.

big whoop

There have always been a ton of PC-exclusive games; that doesn't mean they're any good.

But hardware anti-piracy? It might be effective, but pirates are crafty; it'll probably just make things more complicated, which will drive away the lazy, but the hardcore will find a way, as they always have.

No Good????

So Crysis, StarCraft II, IL-2, Civilization, Tribes, BF1942, BF2, and numerous adventure, RTS, and strategy games with Metascores above 85 are no good? Whatever. All systems produce crap games, and there are far more good games for the PC than for consoles, especially if strategy and thinking are your kind of thing. The PC has the most diverse game base, and it will be impossible to get around processor-based DRM unless you have a clean room, which costs millions of dollars. I've also heard of the possibility of hard drive, motherboard, and video card based DRM. I think software-based DRM is on the way out. Either way, serial codes for all games linked to specific hardware are only a few years away across all platforms. Goodbye piracy, it's about time. I can honestly say I've never pirated a game; I own over a thousand PC and console titles. Software pirates are no better than shoplifters. All this "why pay for a crap title" talk is just nonsense. Read the reviews and decide whether to buy, just like with any other product. Stealing software is horrible; people deserve to be paid for their work, whether good or bad. Our responsibility is to be smart consumers. If a product is bad, don't buy it, instead of giving companies reasons to blame piracy for bad sales. That will improve game quality and lower prices.


I read something about that. I heard Steam is looking to make their accounts more secure by linking them to a specific processor in order to change passwords and such; right now it's through email. Intel is working hard with Microsoft to make copy protection hardware-based rather than software-based. More than half of Microsoft's OS installs are over license limits or pirated. Linking a license to a processor will definitely decrease piracy, especially if it involves continuously generated security/authorization packets, but that would require a constant online connection. Combined with AMD's integrated CPU/GPU, PC gaming will see huge advances in the future.

Limited sight...

Steam already implemented it. No, it's not through email; the email part is just so they can verify with you whether a PC can be added to the list of PCs allowed to log in to your account. They don't use Intel's hardware protection yet, though.

"Linking a license to a processor will definitely decrease piracy"

Will it now? Consider other possibilities. What would you do if you wanted to pirate Windows and Intel CPUs had built-in copy protection? Go AMD, maybe? Or go for a server CPU that doesn't have this crap?

"especially if it involves continuously generated security/authorization packets"

Yeah, Ubisoft is a great example of how well that works and how much it pleases customers. Forget it; companies (which are Windows' bread and butter) sometimes require offline PCs and wouldn't buy into this.

The rest of your post is just gibberish.

Just my opinion, as in everyone's case

I love it when people comment on and degrade other people's posts without having the slightest clue what they're talking about.

I'm not going to chime in on much except these few points:

1. Try changing your Steam password and note how it asks you to confirm through your email.

2. Hardware DRM WILL decrease piracy; after all, who has a clean room in their basement?

3. Sure, there may be ways "around" DRM, but if those ways rely on hardware that does not support DRM technology, then see how long programmers will support that hardware.

4. Try playing a game on a "server CPU" and tell me how that works out, considering how much money you'd spend on it and the motherboard just to get good frame rates, rather than just buying the 40 or so games you could have bought at 60 bucks each.

5. I believe that within 10 years, most of urban America will be "always on" with wireless internet, so always-on DRM is not such a hassle (I remember the uproar over serial keys; now they're expected and accommodated). Japan and Sweden already have free always-on wireless at speeds 10x faster than in the US. The Asian market for video games is much larger than the US market; in some countries, it's considered a sport. If people in the US are so intent on pirating games, then in the future we'll be lucky to get English versions of them, along with cultural references to our society.

Just my opinion....


"While Crytek has put on a brave face in the fallout of the Crysis 2 leak, it has admitted that piracy is still a big problem for PC developers. The PC Gaming Alliance, however, reckons that the practice is on the decline."

Due to Steam and more complex software. When hardware DRM arrives that is processor-based, HD-based, and GPU-based, piracy will be eliminated. To modify a processor, you'd have to open it up, and to do that you would need a clean room: 100% dust free, 100% microwave free, 100% static free. Totally infeasible. Remember all those Intel commercials with people dressed in clean-room suits? That's what you would need, on top of the clean room itself. Not gonna happen.

Bingo is on to it

They like to throw the blame around, saying it's this DirectX API crap, but really, think about it: it takes years to make a game, and look at the quality difference between console and PC. The PC versions are usually smoother because of resolution and maybe a few more textures, but even if they fixed this API issue they point their finger at, what gaming company could afford to spend extra years making way better textures just for the PC version of a game? In my opinion, things won't change for a long time.

Also, now take a look at AMD, Intel, ATI, and NVIDIA. To make such a drastic change in performance, look at all the money they would have to spend to research and build the new so-called operating systems to reach such performance... I think it might be too big a leap for them to take on board, with the risk of it all going downhill if it doesn't work out.


This is old BS, or written by a single employee who clearly hates MS. ATI has long been experimenting with Fusion, enabling the CPU and GPU to pass calls. Now AMD is going to release the world's first combined CPU/GPU in one processor; how exactly does this post hold up against that?

first? intel already released

First? Intel already released their own. Just a little beyond that, however: they're still separate cores, so they'd have to transfer data between each other (though that would be faster than with a dedicated GPU, it'd still slow things down). But the speed-up you'd notice is minimal, since it'd be even worse than having an old-school onboard GPU. It's not something intended for playing games beyond shitty little casual games, and even then it'd likely struggle.


Both of you appear to be confused, mentioning things that are completely unrelated to what the article says.

You don't really seem to know what you're talking about so please go educate yourselves on the subject.


At first I thought..... nahhhh can't be true...

Then I thought of what the original Xbox had: 700 MHz and a GeForce3.

On a PC with 700 MHz and a GeForce3 Ti, I wasn't even capable of running Halo; even with the appropriate CPU upgrade, it wasn't close to the way it was on the Xbox.

Just FYI...

The Xbawks didn't have a GeForce3, as was/is so widely believed; it had its own GPU, the "NV2A", which was actually a GeForce4 in architecture. The Xbawks got the better GPU before the PC did. Point being, not really a fair comparison.

huh? AMD lets this guy talk?

So, uh, you want to talk directly to the hardware like back in the days when you needed a specific chipset to run a game? Fuck that.

Want tons of control? Go quasi-software rendering; build a game out of freaking spheres or whatever instead of triangles.

I get it: every Unreal Engine 3+ game looks basically the same, so do something about it. Create the environment, AMD.

Indeed they do

"So, uh, you want to talk directly to the hardware like back in the days when you needed a specific chipset to run a game?"

No, what the guy seems to be suggesting is that developers work five times harder in order to code natively for each major GPU out there.

"do something about it. create the environment AMD."

That's not something they can do. This issue is much bigger than any single company.

Nah, in order to translate

Nah, in order to translate those, the game engine would have to be recompiled, and there's no way in hell a company that invested millions in making one (or licensing one) would just send out the source code just so it can run faster.

The same guy who sell his

The same guy who sells his console is building your API. Go figure why it doesn't perform as well as it should.
The ultimate way would be an API created for gamers: you boot your computer into the Gaming OS and then black magic occurs. No more RAM for Windows running in the background. No matter if you're on Windows, Linux, or Mac, it works at full capability and unites gamers.

Problem is, no one has the same hardware, so the OS would need "presets" in which we choose the performance depending on the CPU/GPU/RAM clocking capability.

.. Maybe in 30 years? Stop buying stupid, useless cards at $550, and maybe NVIDIA/AMD will come out with that solution to sell high-end cards which would finally have a use.


"Go figure why it doesn't perform as well as it should"

Because PCs have literally hundreds of different GPUs while each console has one type of GPU. The API must translate the graphics instructions into whatever your GPU understands. With consoles, no translation is needed, as there is just one GPU (per console type: 360/PS3/Wii).
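That translation step can be pictured with a toy model. This is a purely illustrative Python sketch; the backend names and command tuples are invented for the example and have nothing to do with real drivers:

```python
# Toy model of an abstraction layer: every generic draw call is looked up
# and translated into a vendor-specific command before reaching the "GPU".
# Backend names and command formats here are invented for illustration.

BACKENDS = {
    "vendor_a": lambda verts: ("A_DRAW", len(verts)),
    "vendor_b": lambda verts: ("B_SUBMIT", len(verts)),
}

def api_draw(gpu, verts):
    """Portable path: translate for whichever GPU happens to be installed."""
    return BACKENDS[gpu](verts)

def console_draw(verts):
    """Console path: the GPU never changes, so the call maps straight through."""
    return ("A_DRAW", len(verts))
```

The portable path pays a lookup-and-translate cost on every single call, while the console path bakes the one known mapping directly into the game; that per-call difference is the overhead being argued about.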

"The ultimate way would be an API created for gamers"

Obviously you don't know what an API is. Only programmers use APIs.

"you boot your computer into the Gaming OS and then black magic occurs. No more RAM for Windows running in the background. No matter if you're on Windows, Linux, or Mac"

Really? You boot into the Gaming OS while running another OS like Windows/Linux/macOS? You don't even seem to know what an OS is.

"Problem is, no one has the same hardware, so the OS would need "presets""

Hello DirectX, how are you? Is that one of the things you do? Yes? Oh cool!

"Stop buying stupid, useless cards at $550, and maybe NVIDIA/AMD will come out with that solution to sell high-end cards which would finally have a use"

If they did, there would be no difference between NVIDIA and AMD cards besides clocks, memory size, and stream processor count. I'm sure neither likes the idea of losing any way to get an edge over the other.

All in all, you're just a kid who has no idea what he's talking about (I think the black magic part gave you away).

This is a faulty conclusion

The issue here has nothing to do with DirectX. It would be a problem with any API that tries to unify the way you write code across different hardware; the same is true for OpenGL or any other API that tries to hide the differences between hardware interfaces. The only way to "fix" this is either to have all devices use the same hardware interface, or to code against the hardware interface of each specific spec.

Maybe in the next Windows?

Maybe this will come in future Windows apps, but I'll be damned if I have to go buy new hardware. Why don't the software developers collaborate to develop their own OS for playing games? It would be nice to see what a doped-out PC could do. Alas, I doubt the payoff would be worth it.

I'm not sure I get it

I have a PC with two HD 5850s and an i7 920. The graphics on the PC are far beyond the 360 in DX11 mode; I can tell there's a world of difference. I can see how the extra step would limit things, but I disagree that "it's not that much better".


I agree. If they used a unified API that was identical for all GPUs regardless of manufacturer, this whole duplication effort would be a thing of the past.

Of course, a bigger problem is the crossover between console and PC games. Developers likely wouldn't want to make PC versions of their games that wouldn't run on consoles (more expensive, and it makes console gamers feel bad), meaning we would still face the same bottleneck on the PC. The exception to that, of course, being PC-exclusive titles.

Well that might be the case,

Well, that might be the case, but it's also possible that next-gen consoles (we're talking about the future, after all) would also run games created this way. In other words, what AMD says is that DirectX is a thing of the past and should be dropped. But just how big a performance gain are we talking about, anyway?

you're talking huge

The performance gain would be phenomenal. Most idiots in this forum (prior to this thread) don't understand that this isn't an ATI problem, a DirectX problem, or a Microsoft problem.
It's a lack of unity that forces PC developers to talk to both the CPU and GPU in their games. Think about it: unifying the PC and the major consoles of the future could REALLY crank out 'next gen' titles, instead of 'more textures/higher res' titles.


You and all the posters before you are just ignorant about the subject.

"develop a dedicated API that communicated between the Gpu and games"

That's what DirectX is for.

"If they used a unified API that was identical for all GPUs regardless of manufacturer, this whole duplication effort would be a thing of the past."

And there would be no difference between GPUs, thus destroying any possibility of competition between manufacturers, meaning the market would simply plunge into an abyss.

"In other words, what AMD says is that DirectX is a thing of the past and should be dropped"

Nope, working without DirectX is a thing of the past, even though it does yield better performance. It was dropped somewhere between one and two decades ago. No dev studio wants to code graphics for 10 or 20 different GPUs anymore; doing so would considerably increase development time between coding and debugging, not to mention the need for even more test machines.

"It's a lack of unity that forces PC developers to talk to both the CPU and GPU in their games"

Really? Because that doesn't happen on consoles? You, sir, are as uninformed as one can get. That's not a problem at all; on consoles, they talk to even more things, like specialized processors you don't even know about.

The whole problem is with PCs: the PC is a heterogeneous platform, and you can build it with components from many different manufacturers. The differences between these components create the need for a unifying API like DirectX. It makes dev work much easier; the drawback is a loss of performance. There is no easy way around that. But as horsepower increases on both the CPU and GPU, so does the maximum number of draw calls (do you even know what those are? Didn't think so).
