2237: The Insufferable Frame-Rate Obsessives May Have a Point


I upgraded the processor on my PC yesterday. It was the last bit that needed upgrading to make it decently up-to-date, and I’d been meaning to do it for a while. It was also a good excuse to wipe everything, reinstall Windows and have a nice fresh, clean system that wasn’t clogged up with all manner of crap. For a little while, anyway.

PC gaming, for many people, is the relentless pursuit of ever more impressive frame rates, preferably at ever more impressive resolutions. I’ve never felt particularly strongly about either, given that my PC is hooked up to my TV and thus limited to a maximum of 60 frames per second at 1920×1080 resolution; in other words, anything above 60 simply wouldn’t benefit what was on screen at all, and in fact would often result in unsightly “screen tearing”, where different parts of the screen update at different times. Consequently, I habitually play everything with VSync on, which caps the frame rate at 60 and completely eliminates tearing. It’s a case of deliberately hobbling performance so the end result looks better.
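The idea behind that cap can be sketched as a simple frame-limiter loop — a toy illustration in Python of my own devising, not how a real display driver implements VSync — which paces frames so that no frame starts earlier than its 1/60th-of-a-second slot:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms per frame


def run_frames(n, render=lambda: None):
    """Render n frames, sleeping so no frame starts before its slot.

    Returns the measured average frames per second.
    """
    start = time.perf_counter()
    next_deadline = start
    for _ in range(n):
        render()  # stand-in for the real rendering work
        next_deadline += FRAME_TIME
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # wait for the next "refresh"
    elapsed = time.perf_counter() - start
    return n / elapsed


# Even if rendering is nearly instant, the limiter holds us to ~60fps.
print(round(run_frames(30)))
```

The deadlines advance by a fixed step rather than being measured from “now”, so if one frame runs long, the next sleeps less and the average stays pinned to the target — which is roughly why a capped game feels so consistent.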

That said, even with a theoretical maximum frame rate of 60, my old processor couldn’t quite keep up with some of the more modern games. I have a decent graphics card, so nothing was actually unplayable, but I knew that I could probably get more out of said graphics card with better base hardware. Final Fantasy XIV, for example, ran perfectly well at anywhere between about 30 and 60 frames per second depending on how much was going on at the time — it would be pretty damn smooth in the relative peace and quiet of instanced dungeons, while the frame rate would drop a fair bit in densely populated areas or busy battle scenes with lots of players. I’m not someone who is bothered a great deal by these frame rate disparities, but they were noticeable.

So with some degree of curiosity, after assembling the new bits and pieces and putting my computer back together, I fired up Final Fantasy XIV to investigate if the performance was any better. After a little fiddling with settings — previously, it ran better in “borderless windowed” mode, while now it runs better in dedicated full-screen mode — I was very pleased to discover that it was now running at an absolutely rock-solid 60 frames per second, constantly, regardless of what was happening on the screen at the time. It didn’t make a massive difference to the visual fidelity of the game, but it was nice.

Then I jumped into a dungeon, and the true nature of the improvements the new hardware brought became apparent. The graphics had never really struggled much in dungeons — except with the bizarre bug in the old DirectX 9 version of the game where facing certain directions would cause your frame rate to tank, presumably because the game was trying to render more “out of sight” stuff at once — but what really stood out on my new hardware was how much more responsive everything felt. On my old rig, you could occasionally see interface elements juddering a bit, particularly the damage numbers and status messages that scroll up and down the screen during combat, keeping you informed of what’s happening.

Now, those messages are just as smooth as the animations and effects. More importantly, the controls are significantly more responsive, because there aren’t any “dead frames”, for want of a better term, where the game fails to register a button input. It was a minor issue before; now it’s completely absent, which is lovely. I hadn’t anticipated quite how lovely it would be, but it really is; knowing that my performance can no longer be hampered by the complexity of the visuals on screen or how much is happening around me at the same time is a thoroughly pleasant feeling, and, surprisingly, makes the game more enjoyable.

So okay, I’ll admit it; frame rate does make a difference. Sometimes. I maintain that “cinematic”-style experiences such as adventure games and their ilk don’t particularly benefit from 60fps visuals — they can look nice, but if you’re going with realistic imagery, 30fps can sometimes look more “natural” as it’s closer to the frame rate of film and TV — but in games where precision and split-second timing are important — fighting games, shoot ’em ups, arcade games, MMOs such as Final Fantasy XIV — smoother hardware performance leads to smoother player performance. Which is kinda cool.

Oh, and no, I haven’t tried Crysis yet.

1577: Resolutiongate

The absolute most tedious thing about the new generation of games consoles is the endless parade of news stories with the headline “[game name] runs at [resolution] and [frame rate] on [platform]”.

This has happened before, of course, back when the PS3 and Xbox 360 first came out. Billed as “HD” consoles, people were quick to jump on any games that didn’t run at the full, promised 1080p resolution — usually for performance reasons. It was tedious and pointless fanboy baiting back then; it is tedious and pointless fanboy baiting now. And yet still it goes on and on and on, because, well, it’s fanboy baiting that attracts clicks and comments.

I do get the arguments for why resolution and frame rate are important. 1080p resolution is noticeably crisper on a large screen, and particularly useful for games where you need to perceive fine detail — first-person shooters where you do a lot of fighting from long distance, say, or strategy games where you need to parse a lot of information at once. Likewise, 60 frames per second looks lovely and slick — and beyond 60, the returns diminish sharply for most people on most displays — and is particularly suited to games in which precision timing is important like, say, manic shooters, music games and driving games.

These two things are not the most important or interesting things about games, though. There are any number of interesting things you could say about upcoming games for Xbox One and PlayStation 4, and yet it always comes down to this, with the most recent example being Ubisoft’s upcoming Watch Dogs — a game which, embarrassingly, Sony bragged about running at 1080p and 60fps “only on PlayStation 4”, only for Ubisoft to subsequently go “aaaaactually, it’s 900p and 30fps…” As the team behind The Witcher commented the other day, these numbers are little more than marketing figures, as Sony’s ill-advised use of them clearly demonstrates; I seriously doubt that Watch Dogs will be an inferior experience for running at a lower resolution and half the frame rate.
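For a sense of scale, the gap those marketing figures describe is easy to put into numbers — back-of-envelope arithmetic of my own, nothing from either company:

```python
# Pixels per frame at each resolution.
full_hd = 1920 * 1080        # 1080p: 2,073,600 pixels
nine_hundred_p = 1600 * 900  # 900p:  1,440,000 pixels

# 1080p pushes ~1.44x the pixels of 900p per frame.
per_frame_ratio = full_hd / nine_hundred_p
print(per_frame_ratio)

# Factor in frame rate: pixels shaded per second at 1080p60 vs 900p30.
per_second_ratio = (full_hd * 60) / (nine_hundred_p * 30)
print(per_second_ratio)  # ~2.88x the rendering work per second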

In fact, sometimes these things are almost irrelevant, or at least of significantly lesser importance than the universal “EVERYTHING MUST BE 1080p 60FPS” attitude that is starting to take hold these days. Take the visual novel Katawa Shoujo, for example, which I’ve been digging up screenshots for to post in articles over on MoeGamer. Katawa Shoujo runs at 800×600 — yes, the 4:3 resolution your old 386 used to run Windows 3.1 in — and looks beautiful thanks to its gorgeous art style. Likewise, from a frame-rate perspective, heavily cinema-inspired games such as Uncharted, Heavy Rain and Beyond: Two Souls actually benefit from frame rates closer to the 30 mark, because it makes them look more like — you guessed it — film, which has historically run at around 24fps. In these cases, bumping up to 60fps may look smooth and slick, but it also looks very artificial and unnatural.

Looking good, then, is less a matter of technical proficiency and more a matter of art style and direction — picking the appropriate means of presenting your title, in other words. Like so much else in gaming, this isn’t a “one size fits all” situation, and I look forward to the day when the industry collectively stops assigning such importance to arbitrary numbers and focuses on the important things.