
Naughty Dog Split On 60FPS vs. 30FPS Gameplay Issue?

When The Last Of Us: Remastered hits store shelves next month, it'll boast an upgrade to 60 frames per second.

One might assume this is 100 percent positive, right? Well, maybe not. Uncharted 4 lead game designer Kurt Margenau certainly loves the upgrade, as he said on Twitter:

"Played some #TLOU at 60 fps tonight for the first time (60fps cutscenes and all). Transformitive. I’m a believer. Don’t think I can go back."

Dynamics artist Neilan Naicker added that now he "can't play it any other way," referring to the game's multiplayer. But what about the single-player – and more cinematic – experience? 3D environment artist Anthony Vaccaro said he in fact prefers the lower frame rate:

"MP [multiplayer] I totally agree with 60 but single player I still prefer 30, feels more cinematic to me. Might be in the minority."

What do you think? Might the higher frame rate negatively impact the cinematic nature of the game? Is it possible that a higher FPS count isn't always better?

Related Game(s): The Last Of Us: Remastered

SaiyanSenpai
9 years ago

For fast-paced multiplayer where you are moving the camera quickly and constantly – yeah, that higher frame rate is a must.

But with single player, 30 fps works fantastically, and no one would know or care unless the frame rate drops at times.

Killzone Shadow Fall did exactly that: 60 fps for the MP and 30 fps for the SP. There is a reason for that, and it worked extremely well.

WorldEndsWithMe
9 years ago

He has a valid point; the same is true of movies shot at a high frame rate. They lose a little something… but I think it's worth sacrificing.

The same thing takes place with Tomb Raider, but it's impossible to say I'd rather have 30fps, because the smoothness and improvement of 60 is palpable.

Corvo
9 years ago

<3 ND.

Temjin001
9 years ago

From reading this article alone, I can't tell if the guy supporting 30FPS is suggesting that 30FPS is better because the graphic quality can go higher at 30FPS, thus making it closer to a cinematic experience, or if the smoothness of 60FPS itself hinders a cinematic feel.

My thought is this: third-person games with a free cam are anything but cinematic, so dialing down the frame rate on an already non-cinematic representation of gameplay is a logic error. There's virtually no cinematography involved in TPS gameplay.

I say the best implementation for a game like TLoU would be 24FPS cutscenes and 60FPS gameplay. This way, interaction with the game world is as good as it can be while the cutscenes still feel cinematic.


Last edited by Temjin001 on 6/20/2014 11:31:25 PM

LimitedVertigo
9 years ago

I'd say anything between 60 and 100 fps is the sweet spot for me. 30fps is icky.

SaiyanSenpai
9 years ago

Limited, the human eye can't see anything more than 60 fps, so the only reason to go beyond 60 is in 3rd or 1st person shooters where you are moving the camera around hella fast. If you move the camera faster than the frame rate (or even the refresh rate of your screen) can keep up with, you will see jumps and screen tearing, which your eye CAN see.
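
Rough numbers make the point (a quick Python sketch; the 360-degrees-per-second flick speed is just an assumption for illustration):

# Sketch: how far the view jumps between frames during a fast camera flick.
# Assumption: a 180-degree turn in half a second, i.e. 360 deg/s.
turn_speed = 360.0  # degrees per second
for fps in (30, 60, 100):
    frame_time_ms = 1000 / fps        # time between rendered frames
    deg_per_frame = turn_speed / fps  # rotation between frames
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms/frame, "
          f"{deg_per_frame:4.1f} deg jump per frame")
# 30 fps -> 12.0 deg jumps; 60 fps -> 6.0; 100 fps -> 3.6. The bigger the
# jump between frames, the choppier a fast pan looks.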

For games with fixed or semi-fixed camera angles, like God of War or something, 60-100 fps would be completely unnecessary.


Last edited by SaiyanSenpai on 6/21/2014 1:16:37 AM

Temjin001
9 years ago

I'm so sick of hearing all the myths about what the human eye can and can't see regarding frame rates. I remember someone trying to tell me that like 16 frames a second was the most a person could see. I heard this like 15 years ago.
Then it became anything more than 30. Now it's anything more than 60. I've chalked it up to a whole bunch of people speaking on something they know little about.

EDIT: btw I did try Quake 3 on a 90hz refresh rate CRT back in the day and yes it did feel smoother than 60fps.


Last edited by Temjin001 on 6/21/2014 1:30:34 AM

sawao_yamanaka
9 years ago

It is true that the eye doesn't notice. You can tell by the responsiveness, not your vision. The max the eye can see is 60.

DIsmael85
9 years ago

Uhm, from the research I've been gathering, your eye can see more than 60fps. Most folks who don't own a PC with a high-refresh-rate monitor probably won't notice, or are not used to seeing games move at rates higher than 60. Trust me, if you game on PC and your monitor has a high refresh rate, you can see well beyond 60fps. As stated, it also helps with responsiveness for the player. Getting games on consoles to 60fps should be standard. I'm OK with the occasional locked 30, but 60 has been around since the PS2 with games like Devil May Cry and Tales of the Abyss.


Last edited by DIsmael85 on 6/21/2014 2:07:09 AM

LimitedVertigo
9 years ago

Thank you Temjin and DIsmael for coming to my aid. I think it's ridiculous whenever someone suggests 60+fps is meaningless. They're either completely misinformed or don't own a rig capable of taking advantage of 60+fps.

I've been enjoying this for decades and I can tell the difference.

Temjin001
9 years ago

Why do they always have to say the HUMAN eye? As if just saying "your eyes" can't tell the difference isn't specific enough. They always have to specify the species in a nice, scholarly manner, as if we could be confused with a lion's eye or a bat's eye or whatever.


Last edited by Temjin001 on 6/21/2014 10:30:23 AM

Underdog15
9 years ago

"The max the eye can see is 60fps"

That's simply not true.

Temjin001
9 years ago

Oh come on, Underdog! God designed our eyes at the dawn of man knowing that early televisions would refresh at 60Hz. So basically he optimized us for this very era of mankind. That's the long and short of it!


Last edited by Temjin001 on 6/21/2014 11:08:43 PM

PHOENIXZERO
9 years ago

We don't see in frames per second, but the 60FPS limit is still an annoying myth that was debunked ages ago.

Akuma_
9 years ago

The day I watched a side by side comparison of Tomb Raider old gen/next gen I was 110% convinced that higher frame rates are ALWAYS better.

60fps just looked sooooooooo much smoother than 30.

There are many people that say "30fps is fine!!!" Well, yeah, sure, any frame rate that doesn't ruin gameplay is fine, but you can't deny that higher frame rates are always better.

ethird1
9 years ago

Naughty Dog should stfu and make a 60 fps Crash Bandicoot for ps4.

DIsmael85
9 years ago

The game running at 60fps will not take away the cinematic feel at all. If anything, the game will run and play so much more smoothly.

Gamer46
9 years ago

To start, because I know it's going to be brought up: I realize the Xbox One isn't up to par hardware-wise, and I don't have one, nor do I plan on buying one any time in the immediate future. Maybe if Sunset Overdrive is really good I'll get one in December. But right now I'm not much interested in the X1, so save the 'you favor Xbox' nonsense; you have no other argument to counter the facts about the weak hardware inside your precious piece of plastic, the PS4 (a console I also own), from wonderful and perfect corporate mother Sony.

I am extremely bothered that people keep talking about frames per second in regards to the supposedly most powerful console of this gen. This whole '30 fps feels more cinematic' nonsense is the biggest bunch of bull. In 2014 every game should be running at 60 fps, period, but the bottom line is that the PS4 is simply not a capable console. Now, developers can't come out and just say that, though I'd have more respect for them if they did. Instead they bring up this garbage about being 'cinematic' and the fanboys eat it up.

The fact is both Sony and MS had to make their new consoles affordable, and unfortunately the hardware suffered massively because of it. I was hoping we'd be seeing the vast majority of games going 1080p, 60 fps, but that simply will not be the case, and the sooner the devs stop talking about it the sooner everybody can move on. It sucks to have to 'deal with it' but that's the way it is. Maybe the PS5 and the Xbox Two will provide powerful, up-to-date hardware at an affordable price when they release; unfortunately, for now we're stuck with underpowered machines for at least 5 years, probably longer.


Last edited by Gamer46 on 6/21/2014 4:16:31 AM

DIsmael85
9 years ago

Can't argue with you there, but it's a new console; let's give this another year or so before we jump on the 'it just can't do it' bandwagon. The hardware on the inside isn't just old crap that can't handle it; devs need to figure out how to get there. I know they should already know, blah blah, but this isn't like developing for a PC, this is a "console". So while I do understand and mostly agree with you, let's see how things turn out in 2015.

LimitedVertigo
9 years ago

The problem is they're chasing the dragon. Each new console allows for greater detail which results in a tradeoff with fps. I don't see this trend ending on the console side.

Lord carlos
9 years ago

The 1st year of a console's lifespan tends to be lame compared to year 4!
Plus devs are pressured to get games out ASAP in the launch year.

Underdog15
9 years ago

I don't think the issue is so much about getting 60fps as the fact that making that commitment means you need to have the resources in place to achieve it. Sometimes sacrificing 10-15 fps means a tremendous amount of detail and other calculations can take place in its stead. That rule will apply to everything in the future, until hitting 60fps is so incredibly minor in terms of calculations that it's a moot point.
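
Put differently, the frame rate target is just a per-frame time budget. A tiny Python sketch of the arithmetic (the fps targets are arbitrary examples):

# Sketch: per-frame compute budget at different target frame rates.
# Everything the engine does each frame has to fit inside this budget.
for fps in (60, 45, 30):
    budget_ms = 1000 / fps
    print(f"{fps} fps target -> {budget_ms:.1f} ms of work per frame")
# 60 fps leaves 16.7 ms per frame; 30 fps leaves 33.3 ms -- double the
# time for detail, AI, physics, and other calculations.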

SaiyanSenpai
9 years ago

Whether it's a console or video card, you can get games to run at crazy high resolutions with crazy high frame rates, it's just a matter of how much money you are willing to spend on the hardware to make it happen.

No one is going to spend $2,000 on a console capable of such things. Nor a PC, if they have any sense.

And personally, I'm done with the pointless crusade of chasing the high frame rates on my PC and happy to have a $400 console that doesn't seem to age as harshly as my PC does, all while providing me with great gaming experiences.

There are really three variables to deal with – "expense," "FPS," and "fidelity" – and each has an impact on the others. Keep arguing about which is better all you want, but it's a tradeoff that won't go away, not in 2014, not in 2024. As the hardware keeps improving, so will developers' ambition to do more and push those limits.

LimitedVertigo
9 years ago

"Whether it's a console or video card, you can get games to run at crazy high resolutions with crazy high frame rates, it's just a matter of how much money you are willing to spend on the hardware to make it happen."

The difference being that new video cards come out monthly that increase the visuals, while consoles are fixed during their life cycle, other than maybe a few minor boosts in performance. Your comparison is flawed.

"And personally, I'm done with the pointless crusade of chasing the high frame rates on my PC and happy to have a $400 console that doesn't seem to age as harshly as my PC does, all while providing me with great gaming experiences."

I've been PC gaming since the mid 90s and I've never had to "adjust" my hardware more than every 4 years to keep up with 60+fps in max settings.

I think your misinformation does a disservice to this community by misleading them on the cost/time ratio in regards to PC gaming. Owning both PC and console is the best of both worlds. I'm currently enjoying Ni No Kuni on my PS3 and later tonight I'm going to enjoy some BF4 on PC with some friends.

Boom

SaiyanSenpai
9 years ago

@Limited

"Misinformation?" hahaha! Experience my friend.

You think that more powerful video cards coming out every month is an advantage over a slower console release; however, I see that as a flaw.

Much like myself at one point, you seem to be under the misapprehension that everything just boils down to what has more powerful hardware, but the operating environment has everything to do with that as well. Console developers can develop a lot "closer to the metal" as they say, enabling them to squeeze everything they can out of the hardware. It doesn't change on a monthly basis, so the hardware and all the data bottlenecks are known quantities and they can really work to that. AMD's Mantle is a big deal for PC gamers because it will slim down that operating layer that has been hindering PC efficiency.

Just look at The Last of Us – it looked and played great on 8 year old hardware that is nothing by today's standards.

Boom! Misinformation, psshh!

I agree that every 4 years is a good time for a video card upgrade, and a $200 one should suffice for most games. But every 8 years it's time for a whole new rig, which is the milestone I'm at right now (7 years for me to be exact). If you've been PC gaming since the 90's, then you should be well aware.

And I also agree that comboing PC and console gaming is the best of both worlds. I really hope my dying PC can hold out a little longer until the Broadwell chips come out, but even then, I don't think I will be doing much gaming on my next PC. Unless those steam controllers do something to revolutionize my couch PC experience, but I don't see that happening.


Last edited by SaiyanSenpai on 6/21/2014 4:54:14 PM

LimitedVertigo
9 years ago

""Misinformation?" hahaha! Experience my friend."

It's misinformation when your "experience" is based on flawed logic and clearly wrong.

"You think that more powerful video cards coming out every month is an advantage over a slower console release; however, I see that as a flaw."

How could more variety and improved technology ever be a flaw?

"Much like myself at one point, you seem to be under the misguidance that everything just boils down to what has more powerful hardware,"

No, I most certainly do not. I'm just pointing out how flawed your logic is within the context of your original statements.

"Console developers can develop a lot "closer to the metal" as they say, enabling them to squeeze everything they can out of the hardware."

I'm not disputing this but I fail to see the point it makes. Are you suggesting that console developers make better looking/performing products because they are able to "squeeze" everything out of the particular console they're developing for? If that's the case you're wrong.

"AMD's Mantle is big deal for PC gamers because it will slim down that operating layer that has been hindering PC efficiency."

An issue I've yet to encounter in my years of PC gaming 🙂

"Just look at The Last of Us – it looked and played great on 8 year old hardware that is nothing by today's standards."

…and still looked similar to what I was playing on my PC years ago. Kudos to ND for maximizing the potential of the PS3 but it's ludicrous to suggest maxing out the PS3 is equal to what can be done PC side.

"Boom! Misinformation, psshh!

You didn't actually counter anything I said with valid points. You just made even more incorrect statements based on either false information or a narrow view of reality.

"I agree that every 4 years is a good time for a video card upgrade, and a $200 one should suffice for most games. But every 8 years it's time for a whole new rig, which is the milestone I'm at right now (7 years for me to be exact)."

Which just happens to be longer than the average gamer changes consoles. So basically you've just pointed out that I require a PC "upgrade" less frequently than a new console.

"I don't think I will be doing much gaming on my next PC. Unless those steam controllers do something to revolutionize my couch PC experience, but I don't see that happening."

Or you could just do what most PC gamers do in the living room, use a dualshock or a 360 controller.

SaiyanSenpai
9 years ago

Why on Earth would I use a gamepad with an FPS on PC against people that would most likely be using mouse-and-keyboard? And you accuse me of flawed logic…

And let me get this straight, having to update my PC twice within 7 years (once to upgrade video card and again a few years later for another video card along with EVERYTHING ELSE), how is that less frequent than a console? By my calculations, that averages to an upgrade every 3.5 years for me, with the second upgrade costing me over a grand. And that's not including when I had to replace the power supply, AND the fact that my B-slot RAM died three weeks ago.

Your inability to comprehend rudimentary math has made conversing with you very unappealing.

And I get the impression that you have taken all that I've said as some kind of personal attack. Let me assure you that has not been the case – well, maybe with the exception of the above "rudimentary math" part. My bad… Now, that might not be the case but your seeming need to counter everything with *something* no matter how ridiculous, makes it feel that way to me.

So sorry if you felt like I was attacking the very core of your being, but what I am most sorry about, is that you probably missed this opportunity to actually learn anything.


Last edited by SaiyanSenpai on 6/21/2014 8:02:49 PM

LimitedVertigo
9 years ago

"Why on Earth would I use a gamepad with an FPS on PC against people that would most likely be using mouse-and-keyboard? And you accuse me of flawed logic…"

When did I ever suggest you use a gamepad for a FPS? I don't do that nor do I know any PC gamers that use them. I was referring to your "PC couch experience", you are aware that there are plenty of game genres on PC other than FPS, right?

"And let me get this straight, having to update my PC twice within 7 years (once to upgrade video card and again a few years later for another video card along with EVERYTHING ELSE), how is that less frequent than a console?"

Once again you're attempting to do the whole "apples vs oranges" thing. If you bought a video card that ran max settings, it will still run games with similar detail 7 years later, just like a PS3 will play similar looking games 7 years into its cycle. If you choose to upgrade your video card twice in that span in order to get the best graphics possible, then more power to you, but that isn't something you're able to do with consoles, so it isn't at all fair to lump that in with your example.

"By my calculations, that averages to an upgrade every 3.5 years for me"

Thankfully I don't buy/build to the level you do.

"with the second upgrade costing me over a grand"

Sucks for you 🙂

"And that's not including when I had to replace the power supply, AND the fact that my B-slot RAM died three weeks ago."

Your having faulty hardware or inferior hardware doesn't add validity to your point. I know people that have had PS3s die. I don't boast "Omg if you're a console gamer it costs you $1200 in the span of 3 years".

"Your inability to comprehend rudimentary math has made conversing with you very unappealing."

Are you referring to my inability to see the future where you change the formula and the numbers? Sorry, I wasn't aware there was a new type of math called "Future Math".

"And I get the impression that you have taken all that I've said as some kind of personal attack."

Then you're new around these parts. I don't take anything said on here as personal. It's a videogame site…

"well, maybe with the exception of the above "rudimentary math" part."

I'm glad you wrote it. It made you look stupid.

"your seeming need to counter everything with *something* no matter how ridiculous, makes it feel that way to me."

Is that not the common flow to a conversation/debate/argument? Are you used to talking to a wall? I'd love for you to point out one ridiculous thing I've said.

"but what I am most sorry about, is that you probably missed this opportunity to actually learn anything."

The irony in this comment of yours is overflowing.

Gamer46
9 years ago

Six thumbs down? Really? Yeah, I'm just 'trolling'; nothing I posted is true. Damn fanboys.

Underdog15
9 years ago

Maybe people just disagree with you. Graphical technological advances are slowing as silicon starts to plateau.

PC_Max
9 years ago

The increased frame rate will affect the overall look and feel of the game. It comes down to what people are willing to accept, and if it's first-time players buying the new version, they will not know any different, but they should still get the same overall experience most of us did playing it on the PS3.

Keep playing!

Ninja_WafflesXD
9 years ago

What is with this mentality?! First Ready At Dawn with their comments about striving for 30 fps because of a "cinematic feel"? And now Naughty Dog?!

First, and foremost, you're designing and making a game. Not a movie. The term "cinematic" should not come into the equation, ESPECIALLY concerning the gameplay aspect.
Yes, 30 fps is more ACCEPTABLE for certain genres, but every developer should strive for 60. It plays better, it feels better. There should be no valid reason for settling for 30 fps unless whatever ancient piece of hardware you're designing on can't handle it.

I understand consoles aren't as powerful as PCs, so compromises are going to be made during the development process.
HOWEVER, I don't ever feel it is justified to sacrifice framerate for the game to look that "little bit better or shinier" in the graphics department. Gameplay over graphics; I don't believe it should be any other way.

Absolutely moronic. "Cinematic." Pfft… whatever.

Gamer46
9 years ago

The cinematic thing is the excuse we will hear from this point until the end of the generation, because developers aren't allowed to just admit the truth about the PS4 (and X1) that we can all see plain as day. It's funny: Nintendo takes a lot of heat for being cost-effective and not making the Wii U a high-powered console. Sony and MS do the exact same thing and nobody says a damn thing, especially the Sony fanboys. Oh, they'll make fun of the X1, but mention that the PS4 isn't where it needs to be and you get slammed over the head with numbers and quotes from Sony execs that are supposed to prove you wrong. It's hilarious. All 3 home consoles are trash as far as the hardware is concerned. Absolute trash. These companies need to just get the good games out and leave it at that. No talk of resolution and frames; we know the hardware is weak, and no reason we need to be reminded of it. We will be when we see the game running on our televisions.

Underdog15
9 years ago

@Gamer
There are many more factors to consider in regards to performance, and I think you know that.

ransomink
9 years ago

I think 60fps is great for multiplayer, where it is critical and much needed. A heavy factor is the type of game as well; shooters and action/adventure games benefit from the higher frame rate, while, imo, RPGs and strategy-type games wouldn't have much use for it.

30fps fits single player. It really does affect the game and gives it a more cinematic presence. Plus, you can't argue with more power for special fx and processing when using half the frame rate.

Some movies shot at high frame rates look off-putting and fake, so more isn't necessarily always better…

richfiles
9 years ago

As a person with a hobby in neural network based robotics, and a fascination with biological neural structures (to the point that I occasionally drop in and contribute to EyeWire to map out retinal neurons), I suspect that the myths about how the eye "can't" see over 30 FPS, or over 60 FPS are a load of manure.

First off, neural impulses are typically asynchronous, and often in massively parallel groupings of axons. There tends to be a high degree of interconnectedness between neurons, with a combination of both excitatory and inhibitory synaptic connections.

In plain English… Lots of neurons fire in groups, and while they are slow individually, they don't always fire at the same time. Groups of neurons firing at different times can simulate faster data rates.

Let's use a simple example. Let's just say a neuron by itself can only pick out 30 frames per second, give or take. I believe that puts the neuron's response time at around 33.3 milliseconds (ms). Let's also say you have a group of 5 nearby neurons carrying similar data, but firing at different times, independently of one another, but with some slight delays as signals propagate. The brain may receive 5 separate signals that mean almost the same thing, spaced out over time. Each cell fires in response to the light at the moment of its firing, combined with those excitatory or inhibitory signals from adjacent cells.

Let's say a wave of neural firing is passing over a region with those 5 cells. The wave fires the cells at different times, but very closely to one another…

The timing scale uses ms x10; in other words, the 3 equals 30 ms.

[ms x10]
0 1 2 3 4 5 6
_|_____|_____ Neuron A
__|_____|____ Neuron B
___|_____|___ Neuron C
____|_____|__ Neuron D
_____|_____|_ Neuron E

Notice that Neuron A takes 33.3 ms to fire a second time? That correlates to about 30 frames per second (33.3 ms x 30 firings is about one second). Now, see how there is a timing difference between every one of the 5 neurons? They are out of phase with each other. Individually, none of them can detect any rate faster than one event per 33.3 ms. Now, instead of looking at their individual limitations, look at their rate of fire as a GROUP…

In 33.3 ms, you have 5 pulses fired, relating to light detected at 5 intervals within that 33.3 ms window by different cells. That comes to about 6.7 ms between pulses… How many frames a second would those 5 cells be able to decode? There are 1000 ms in one second… So, 1000 divided by 6.7 equals… drum roll…

150 frames per second!
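
If you want to sanity-check that arithmetic, here's a quick Python sketch (the 33.3 ms period and the evenly staggered five-neuron group are just this example's assumptions):

# Sketch: effective sampling rate of a staggered group of neurons.
period_ms = 1000 / 30             # ~33.3 ms between firings of one neuron
neurons = 5
stagger_ms = period_ms / neurons  # ~6.7 ms between successive group firings
print(f"single neuron: {1000 / period_ms:.0f} samples/sec")       # ~30
print(f"group of {neurons}: {1000 / stagger_ms:.0f} samples/sec") # ~150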

Now that is just an example. I am not saying every photoreceptor and every neuron in your eye and brain will always work exactly like that, but the configuration is certainly feasible. The optic nerve has a proposed bandwidth similar to 10 Mbit Ethernet, or USB 1.1. It's quite slow. The eye does a LOT of tricks to encode visual information as efficiently as possible, and in fact, your eye has neural tissues… not simply nerves. There is a difference. Your eyes actually process information, not just transmit it.

That said, between reduction of tearing at higher frame rates (an effect ENTIRELY related to the screen refresh rate, and nothing to do with the visual system of the eye and brain), and population rate coding in retinal neurons, yes, I believe science has proven the eye can detect far higher frame rates than we ever believed possible in the past!

LimitedVertigo
9 years ago

Wow…this was a really great read. Thank you!
