Nvidia Explains Why It Refused to Develop the PlayStation 4 GPU

According to Nvidia’s senior vice president of content and technology, Tony Tamasi, Sony approached Nvidia to develop and produce the PlayStation 4’s GPU, but the company declined after deciding the deal wasn’t worth it.

“I'm sure there was a negotiation that went on and we came to the conclusion that we didn't want to do the business at the price those guys were willing to pay,” he said.

“Having been through the original Xbox and PS3, we understand the economics of the development and the trade-offs.”

Of course, developing the PlayStation 4 GPU would’ve been profitable for any company, or else AMD wouldn’t have accepted the deal. In fact, Nvidia was more concerned with the opportunities it would miss if it directed its resources to that venture.

“If we, say, did a console, what other piece of our business would we put on hold to chase after that?” explained Tamasi. “In the end, you only have so many engineers and so much capability, and if you're going to go off and do chips for Sony or Microsoft, then that's probably a chip that you're not doing for some other portion of your business.”

The PlayStation 4 was announced last month. It runs on AMD’s 8-core 64-bit x86 Jaguar CPU and a customized Radeon GPU capable of churning out 1.84 TFLOPS.
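
For readers wondering where that 1.84 TFLOPS figure comes from, it lines up with the GPU’s widely reported configuration. A minimal back-of-the-envelope sketch, assuming 1152 shader ALUs at 800 MHz, each retiring one fused multiply-add (two floating-point operations) per clock:

    # Peak single-precision throughput of the PS4 GPU, assuming the
    # widely reported configuration: 1152 shader ALUs at 800 MHz,
    # each retiring one fused multiply-add (2 FLOPs) per cycle.
    shader_alus = 1152
    clock_hz = 800e6        # 800 MHz
    flops_per_cycle = 2     # one FMA = a multiply plus an add

    peak_tflops = shader_alus * clock_hz * flops_per_cycle / 1e12
    print(f"Peak throughput: {peak_tflops:.2f} TFLOPS")  # -> 1.84 TFLOPS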

Except for a couple of quarters at the beginning of 2010 and 2011, AMD has been accumulating losses for the last five years, with a $473 million loss in Q4 2012 alone. Clearly, the company hopes that a lucrative, long-term, multi-million-unit deal such as producing the next-gen consoles’ GPUs and CPUs will provide a safe, guaranteed revenue source for years to come.

Comments

Must be nice not to know platforms

I assume this is a typo listed above: "AMD's 8-core 64-bit x86 Jaguar CPU". 64-bit is not x86 but x64; most other news outlets list this as "AMD's 8-core 64-bit x86-x64 Jaguar CPU", which makes more sense than saying it's a 64-bit architecture for the x86 platform.

fail

Both guys below are correct and you, sir, are an idiot. Stop spreading your ignorance around the internet; it has enough of that without people like you adding to it.

Re: Must be nice not to know platforms

LOL - guy with the name "TechGuy123" makes a post implying someone doesn't know platforms, then mentions "x64". "x64" is a Microsoft marketing department name shoved into a few GUIs and brochures. It's not a proper name for anything. The correct names are "x86-64" or "AMD64". Even Windows itself uses the name "AMD64" on the disk, and refers to itself as "Windows 64-bit" in the properties pages.

x86 refers to the

x86 refers to the architecture of the processor, based on Intel's old 8086 (the chip at the heart of the original IBM PC). Any processor built with this architecture is labelled x86, whether it is 32-bit or 64-bit. Some examples of processors that are not x86 are the ARM and PowerPC chips.
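
You can even watch the naming mess play out from code. A minimal sketch using Python's standard library; the exact strings it returns vary by OS, so the values in the comments are typical, not guaranteed:

    import platform

    # Ask the OS what it calls this machine's architecture. On a
    # 64-bit x86 box this typically prints 'x86_64' on Linux/macOS
    # but 'AMD64' on Windows: two labels for the same architecture,
    # the 64-bit extension of x86.
    print(platform.machine())

    # 32-bit x86 shows up as 'i386'/'i686' (or 'x86' on Windows);
    # still the x86 family, just without the 64-bit extensions.
    # Non-x86 chips report names like 'aarch64' (ARM) or 'ppc64le'
    # (PowerPC) instead.
    print(platform.architecture()[0])  # '32bit' or '64bit'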

Hey Bro

Bro, when's your birthday and what's your address? Let me send you a dictionary. "Churning" does exist; it means to output... Google it and save yourself from looking like a dumbass.

derp

Don't blame people for using a language correctly. If you don't understand it, it's your fault. Also, there is no such thing as an "x64-bit" processor; all PCs use x86 processors, and x86 is the architecture. We have 32-bit x86 processors and 64-bit x86 processors.

So "churning" is a technical

So "churning" is a technical term now? The Internet really is infested with fucking moronic pieces of shit these days, isn't it? I'm going to just kill myself now because the pain of living in this world, full of idiots, has just leapfrogged my ability to cope with it. Goodbye.

Well said

As you said, they are making LOTS of money with Tesla; selling cheap GPUs to MS or Sony is not the same as selling the same or a better chip on the OEM market... It's also possible that they want AMD to make money, because a full monopoly on gaming GPUs wouldn't be good for them legally if AMD goes bankrupt, and let's face it, Intel GPUs are a joke as competition... Intel fanboys rage in 3... 2... 1...

I have a theory, I think it's

I have a theory: I think it's Autodesk that's keeping Nvidia afloat. If you're a product maker like Autodesk, dumping millions every year into selling visualization software of all kinds, it only makes sense that your partners, in this case Nvidia, design something that meets your customers' needs.

For a theory you need proof

I recommend you read about the scientific method; your "theory" is not like the theory of universal gravitation <--- THAT is a REAL theory. In real life your "theory" is only an idea, a conjecture... that needs solid proof...

Except for...

In my experience, Autodesk "workstation" products (AutoCAD, 3ds Max, Revit, etc.) work better with AMD hardware. Rendering, on the other hand, is where Nvidia is making its bigger numbers.

amd in trouble

I hope this deal makes AMD profitable again. I believe competition is a good thing for the consumer. If AMD dies, I feel the CPU and GPU markets will be even more of a monopoly than they already are; Intel and Nvidia will be able to charge more, and they will research less. This is, of course, my opinion. :-)

True, competition increases

True, competition increases progress a bit, lowers prices for sure, etc. But to be honest, desktop CPU and GPU progress is kind of stale due to mobile market growth and consoles. Why do we need awesome GPUs if all we get is DX9 console ports instead of real DirectX 11.1? From 2000 to 2007 I had to upgrade my GPU every two years or I was stuck playing new shit on low settings. I still have an 8800 Ultra from 2007 and it plays most games on high. So half a decade with barely a need to upgrade; that is how slowly games have advanced graphically. Why does hardware need to progress if software is stale?

This is true - I've only

This is true - I've only bought two GPUs in the last 6 years, and neither of them was particularly expensive or fancy (a 260 and now a 7850), yet they still kick(ed) ass. Game devs are also learning to do more with less powerful hardware because of the long last generation of consoles, and the optimisations really shine through.
