Next Unreal Engine Due In 2014, Will Support 20 Cores

The next version of Unreal Engine won't be out before 2014, but it will be worth the wait, Epic Games CEO Tim Sweeney claimed.

"I spend about 60 percent of my time every day doing research work that's aimed at our next generation engine and the next generation of consoles," he said.

"This is technology that won't see the light of day until probably around 2014, but focusing on that horizon enables me to do some really cool things that just aren't practical today, but soon will be."

Sweeney then explained his belief that the biggest challenge and opportunity facing graphics engines in the near future is dealing with many CPU cores.

"Once you have 20 cores, you can't easily say this one is going to be for animation and this one is going to be for details on the face of the character, because all these parameters change dynamically as different things come on screen and load as you shift from scene to scene," he explained.

"The big challenge will be redesigning our engine and our workload so that we scale more of these different computer tasks between CPU cores seamlessly in real-time and dynamically so that you're always getting the maximum computing power out with the engine, regardless of what sort of work you're doing."
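The dynamic scheduling Sweeney describes, handing whatever tasks come up to whatever cores are free rather than pinning each subsystem to a core, can be sketched with a simple task pool. This is a toy illustration, not Unreal Engine code; the task names are made up, and Python threads won't truly parallelize CPU-bound work because of the GIL, but the scheduling idea is the same.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-frame engine tasks; in a real engine each would be a
# fine-grained job, not a subsystem permanently tied to one core.
def simulate(task):
    name, cost = task
    return name, sum(i * i for i in range(cost))  # stand-in workload

tasks = [("animation", 10_000), ("physics", 20_000),
         ("facial-detail", 5_000), ("audio", 8_000)]

# The pool sizes itself to whatever the machine offers (2, 8, or 20 cores)
# and hands each task to the next idle worker, rather than dedicating
# core 0 to animation and core 1 to character faces.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    results = dict(pool.map(simulate, tasks))

print(sorted(results))  # ['animation', 'audio', 'facial-detail', 'physics']
```

The same tasks complete on a dual-core laptop or a 20-core workstation; only the degree of overlap changes, which is the "seamless scaling" Sweeney is after.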


Comments

Misinformation

There's a lot of it in this thread.
The 360 has 3 cores with 2 threads per core. That's a total of 6 threads, or essentially 6 virtual processors that can all do different things simultaneously.
Intel Xeon chips now have 8 cores with HT, which is also two threads per core, or 16 virtual processors. This is because most servers use VMware or similar virtualization technology to let a single Linux-based host run 20-30+ virtual copies of Windows or Linux: all separate, individual operating systems where you choose how many processors and how much RAM to allocate to each one.
As far as I have been led to believe by software developers I used to work with, if you make an application that can use multiple threads, it will use as many as you can throw at it. So if a game uses 4 cores or 8 CPU threads, it will also use 12 or 16+. In the same way, 64-bit applications will use as much RAM as the OS will give them while keeping themselves stable.
Just something to think about...

You seem to be confusing

You seem to be confusing threads and cores here. If you make an application that uses 2 threads, it will only ever run 2 threads, but those threads will be allocated to cores dynamically by the OS. While coding, you specify which pieces of code (or classes, if you're OOP savvy) can run on threads other than the main one, but you still specify the number of threads that will be running.

That's where it gets tricky: because threads execute independently of each other, it's hard to simply break your engine down into independent blocks that can run in any random order and still be faster than a single-threaded system, since threads need to check each other's state to make sure they're not too far ahead. As an example, what if the engine renders frame 350 before frame 250? Someone (by that I mean a piece of code) has to check the order of frames before displaying them, and that's extra work that isn't required in single-threaded engines that do things in sequence.

Efficient multicore programming remains one of the biggest challenges in modern programming. It's easier for some apps like compression or video/audio encoding (where the end result can be assembled in any order because it's a file) and harder for others like games, where everything is done and output in real time.
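The ordering problem this comment describes can be shown in a few lines: worker threads finish in arbitrary order, so some extra bookkeeping has to put the results back into display order. This is a toy sketch with a fake `render` function, not engine code.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def render(frame_number):
    # Simulate uneven per-frame cost: a later frame can finish
    # before an earlier one.
    time.sleep(random.uniform(0, 0.01))
    return frame_number

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(render, n) for n in range(8)]
    # Results arrive in completion order, which may not be frame order.
    completed = [f.result() for f in as_completed(futures)]

# The extra work the comment mentions: sorting back into display order,
# something a single-threaded loop gets for free.
display_order = sorted(completed)
print(display_order)  # [0, 1, 2, 3, 4, 5, 6, 7]
```

A single-threaded renderer never needs the final sort; that reordering (or the locks and queues that avoid it) is the cost of the parallelism.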

If you know anything about reality...

It's not 2000+ cores in one CPU, though; it's 2000+ cores spread across several CPUs with a few cores each. Quite different, and used for supercomputers only, not home PCs, thus irrelevant to the article. And forget nanotech, as it has nothing to do with CPUs yet (no matter how much manufacturers may say otherwise); it's still in its infancy and will take many years to be applied to market products.

Meh

Next the companies are going to realize that packing CPUs into sandwich layers isn't enough; they will eventually start folding the sandwiches and compressing the goo into tiny fragments, then press more folded sandwiches on top and stick a hairpin in it, because the average toothpick is too brittle.

Xbox

The year of the new Xbox.
Are these maybe the specs for the new Xbox?

The new Xbox will have at least 10 or 20 cores.

Is Epic working on the new Xbox? Microsoft will rock again.

I think it's a little

I think it's a little ridiculous to be talking about 20 cores when game engines haven't broken the 2-core threshold.

Also, it would be good to point out that we have had 4-core CPUs for the past 5 years. That really means the next 3 years will not produce 16 more cores out of nowhere.

The maximum we should be looking at for 2014 is 8-core systems running anywhere between 2.4GHz and 3GHz; anything extra would probably just be a waste of time and resources. We haven't gotten many programs taking advantage of 2 cores, let alone 4 or 20.

You're a little off

We have 12-core server processors now; next year they will be in everyday computers. Therefore, it's not unreasonable to think that 18-24 cores will be out in 2014.

nah

Actually, there is currently a lot of support for at least 4 cores in games on PC. Obviously the 360 only has 2 cores available and has been the standard for too long, but we are past that now.
I also know a lot of programs that support at least 8 of my cores; I haven't really seen it go past that yet, though.
And Intel has studied the point at which extra cores become wasted resources; they reckon it's around the 1000-core mark.
Also, the latest AMD Bulldozer 8-cores released last week are much closer to 4GHz, which you should find will be the norm by 2014, although GHz rating is not really a good way to judge a CPU; we have had 3GHz since the Pentium 4 days. Basically, the higher the GHz, the more cooling is needed, not to mention the ~10GHz cap that copper itself has, so don't expect much advancement there anytime soon.

If the difference between a

If the difference between a dual core and a quad core at a low resolution like 1280x1024 is only ~4fps, saying the CPU is a bottleneck in games is a nearly useless statement.

Honestly, cores don't matter much in today's games when it comes to CPUs; it's the architecture that matters. That's why a new shiny CPU makes your games run a little better.

Of course, the higher the resolution, the less the CPU affects framerate.

That said, there's absolutely no need for a high-end CPU if you play games at a somewhat high resolution (~1920x1080) and have a good GPU.

Go middle of the road with the CPU, and as high as you can with the GPU.

I may be wrong but I believe

I may be wrong, but I believe there are already 8-core CPUs on the market, not that there are many uses for them at the moment (unless you're running a server). Putting that aside, this will be the next game engine, so it should have a life of about 4-5 years; if it comes out in 2014, we could still be playing games on this engine in 2020. I do agree with you that game devs don't seem to be very good at coding games for more than 1 or 2 cores... but considering how CPUs are getting more and more complex in that direction, they are probably going to have to work with that in the future.

20 core CPU

Linux servers are running 12+ cores right now. Sure, consumer markets haven't really broken past 8 cores, but that doesn't mean they won't soon.

My friend has a 6-core CPU and mine is a 4-core. You're right about games not really using 4 cores yet, but that means a game can use two and background applications can use the other two.
