NVidia Acknowledges Tomb Raider Performance Issues On GeForce Cards

NVidia has acknowledged the performance issues of Tomb Raider on GeForce graphics cards and promised to release a new driver fixing them as soon as possible.

Tomb Raider is an AMD Gaming Evolved title, meaning it was developed in close cooperation with AMD. That alone shouldn't put NVidia hardware at a disadvantage, but the company politely noted that it didn't receive the final game code of Tomb Raider until this past weekend.

Either way, NVidia is now working closely with Crystal Dynamics to finish the fixes as soon as possible. Some of the fixes will arrive in NVidia's next driver update, while others will be available only through a game patch.

NVidia issued a statement regarding the matter. The statement reads as follows:

We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.

WTF difference does it make?

Are you dumb or somethin'? Why do you care if you can set a bunch of parameters to maximum that provide no visible difference except trashing your framerate? I haven't played Crysis 3 yet, but Far Cry 3 and Crysis 2 both run fine on a single Radeon 6850 with max settings. My res is only 1280x1024 since I have a 4:3 monitor, but no game makes any appreciable use of the power of a top-end card from either company. You do not notice a difference between 8x and 24x AA in a game in HD unless your monitor is a large-screen TV that you sit as close to as a computer monitor. I've never seen a difference between 4x and 16x AF. Just because a game can bog down a setup doesn't mean it looks any better while doing it. You're a dumb dumb. I bet you're one of those basement dwellers who goes around bragging all about his superior dorkware in his comp like it makes up for the pencil in his pants LOLZY LOLZY!!! :D :D :D

You run at 5:4 resolution on

You run at 5:4 resolution on a 4:3 monitor? You're not very clever, are you? Just because you can't see the difference between 4x AF and 16x AF on a normal PC monitor doesn't mean others can't. Also, to some people "maxed out" means running with high settings at 30 fps; to others it means 60 fps without ever dropping below that.

It matters because someone claims otherwise you douche

Yes, I'm dumb, but not dumb enough to see bullshit and not call it as such. Maxed means maxed; claim something is maxed when it's not and I'll call bullshit every time. Maybe if your comprehension skills were better you'd see the claim and call bullshit as well. Just because you can't see a difference doesn't make the lie true. If the OP had simply said "Crysis looks and runs great on my rig," there would be no reason for me to call bullshit, comprende douche-bag?


Word up, yo'. On HL2 it was an "ATI" game, and they all said it runs like a load of ladies ***** on a patch of ice covered in their own feces. It was supposedly especially stank on the FX series. I ran that ***** wide open on mah 5800, no probs.

Ahh, still remember a time in

Ahh, I still remember a time when you would get blown away by custom-built games for custom-built hardware. Great for innovation and competition, but it was hell at times. Going to your friend's house and leaving ****** at how good a game looked there and how badly it ran on your own system, then waiting to bring him or her another title you knew would do the same for your system. A time when consoles didn't make four-year-old graphics cards work like a charm. You had Crysis games coming out each year, knowing you were on the curve.

GTX 560

If what the guy with the GTX 260 says is true, then there are issues with the GeForce 500 series too. To run the game at a constant 60 fps on my GTX 560, I have to turn a lot of the graphics settings down to near medium, so yeah, there are some issues.

I loved my 260 GTX SO. It's

I loved my 260 GTX SO. It's the best card I've ever owned (not to mention the largest — the thing was the size of a house brick!). I had to upgrade only at the start of this year, not because the card's power was lacking but because the amount of VRAM newer games need is so high -.-' I probably would have been fine if I could have SLI'd another one; alas, I only have a single-PCIe motherboard.
