Can Intel's Iris graphics chips offer alternative to gamers?

For more than two decades, if you wanted to play the latest games with the best graphics, you needed a graphics card. That was true of Unreal Tournament in the '90s and it's true of Star Citizen today. But of course gaming is quite different now than it once was: we have smartphones more capable than the consoles of not too long ago, and game streaming that requires almost no hardware of your own.

But building a gaming PC still ultimately means shelling out for a dedicated GPU from either AMD or Nvidia, because that level of performance is necessary for anything but the simplest games. Or is it?

It's certainly the case that if you want to be a Citizen of the Stars you're going to need one, but what about less high-end gaming? If you prefer your games to come from the indie crowd, could you make do with Intel's onboard GPUs?

Plain HD Graphics might be a push: despite big, regular gains in performance, it still falls short of what most gamers need. But Iris and Iris Pro? Now that's something worth discussing.

Intel itself has been quoted as saying that Iris graphics are more than 30 times more powerful than what top-end HD Graphics were capable of five years ago. Part of that comes down to the way the chip giant builds its CPUs. Its transistors have shrunk repeatedly over the years – following Moore's Law – but instead of dumping all of those extra transistors into the CPU cores as you might expect, many of them have gone into improving the iGPUs instead.

Indeed, benchmarks of systems running some of the more recent Iris Pro graphics cores show them to be perfectly capable of playing newer games like BioShock Infinite at low settings, older games like Half-Life 2 without difficulty, and even some truly beautiful titles like GTA V with a few tweaks to drop the resolution down.

Of course, it helps if you're using a hefty CPU in the first place. The graphics chip will still be the bottleneck if you attempt high-end gaming, but making sure you're using an i7 for those extra cores – and a K-series chip if you want to overclock for a few extra MHz – wouldn't hurt.

But when it comes to Iris and Iris Pro graphics, your options are actually quite restricted. Unless you opt for a Broadwell chip, there are no desktop choices: Skylake parts with Iris and Iris Pro graphics are mobile CPUs only. That's a shame, as there's certainly something to be said for a desktop Iris-equipped chip, since the onboard eDRAM can help boost overall performance.

But that aside, in reality if you're buying a desktop PC you'll probably opt for a dedicated graphics card anyway, so it makes sense that Intel has focused on improving integrated graphics power in its mobile range.

And power they do have. Iris and Iris Pro can offer a level of performance comparable to many discrete GPUs, especially the really underpowered ones often included in mid-range laptops. AnandTech found Iris Pro capable of handling Alien Isolation and Total War: Attila at low settings.

Frame rates weren't great, and of course those games were running on lower settings, but less demanding titles like League of Legends, Hearthstone and other popular competitive games shouldn't struggle too much.

While it seems unlikely that Intel's Iris graphics will take the fight to either Nvidia or AMD any time soon, the interesting aspect may be that it spurs AMD to new heights with its next-gen Zen CPUs. AMD's APUs can already hold their own against Iris in some benchmarks, so with its next generation and DirectX 12's better support for multi-core CPUs, perhaps we'll see AMD mount quite an assault on its long-time rivals.

But then again, Pascal and Polaris aren't far away either, and both promise unprecedented levels of graphical performance. They could make Iris – and just about everything that's come before – look pedestrian in comparison.