The ongoing debates surrounding frame rate and resolution are always on the lookout for a new target, and most recently it’s been Watch Dogs. Of course, frame rate and resolution were only a concern of the PC master race up until now; it’s only since Sony and Microsoft effectively turned them into buzzwords with the new consoles that casual gamers and console gamers have suddenly joined the discussion. All of a sudden it’s a brave new violent world.

It’s arguable whether many of the bandwagon jumpers can actually spot the difference between 30 and 60 FPS, or between 720p and 1080p, but I will applaud the likes of Konami and Kojima Productions for at least attempting to show us with their extensively detailed Metal Gear Solid V: Ground Zeroes console comparison video. The battle has been fierce thus far, though.

In a recent opinion piece I argued that the current situation is a case of history repeating itself. I remember owning a PS3 in the early years of the last console generation and constantly being attacked by Xbox fans who bragged about their superior multi-platform ports. Remember how some PS3 multi-platform games suffered from texture pop-in and frame rate drops, supposedly because the “CELL architecture was difficult to develop for”, a phrase that itself became a buzzword? This is the very same thing, only there are now ten million billion more gamers than there were at the start of the last generation, so the debate is far more vocal.

Personally, I am both a PC and PlayStation gamer, so I know very well the differences that higher resolutions and frame rates make. Honestly, I would rather have 60 FPS than 1080p, because the former improves response time and makes for a much more fluid experience in shooters, fighting games and even cinematic action games.

With all that said, today’s question is: do frame rate and resolution actually matter to you as a console gamer?