
To Console Gamers: Graphics or Framerate?


Luna the Great of all the Russias

Graphics vs Framerate  

33 users have voted

  1. If you're a console gamer, do you prefer better graphics or higher framerate?

    • 60FPS. Developers should focus more on smooth gameplay; games look good enough already. (24 votes)
    • 45FPS. Developers should balance smooth gameplay and graphics; both could use some improvement. (5 votes)
    • 30FPS. Developers should put more focus on game graphics; I don't mind the lower framerate for better visuals. (4 votes)


Recommended Posts

Don't really need to type an essay on this. Choosing one over the other: frame rate. Though, honestly, I don't really care much about either option in the vast majority of games. Essentially, the average game hits "good enough" or better in both graphics and frame rate for me.



I don't really think about these things. Game concept, gameplay, story, music and the implementation of all of the above are far more important.

Edited by Discordian


Frame rate>>>>Graphics.

 

I just wish the devs on current-gen consoles (PS4/XB1) cared more about frame rate than "hur durr pretty graphix!"

 

1080p is great and all (I play PC at times, so I see how nice it is), but honestly I'd take 720p or 900p if I can get a fairly consistent 60fps.

 

Sure, Watch_Dogs and Infamous: Second Son are pretty and all, but jeez, I'd rather have had them run at a steady 60fps.

 

Of course, to each their own, and in some cases I can see someone not caring when it comes to certain genres.

 

 

Oh yeah, if you prefer 30FPS over 60FPS because of the "cinematic experience" it supposedly provides, I implore you to read this.

 

[attached image]

 

Also THIS. If I recall, the developers of The Order: 1886 (RaD and SCE) are using this lame-duck "cinematic experience" excuse to make the game as pretty as possible, just so they can play into the "omg best graphics" craze that's going on right now with the consoles. I'm sure their game will hit 1080p, and I'm sure it'll give Sony fanboys another reason to dis the XB1 in terms of graphics, but I doubt I'll enjoy playing a shooter at 30fps, so I won't even bother buying it.  -_-

 

If a dev can get 60fps, regardless of whether it's 720p or 900p or whatever other resolution, that's a game I can live with.

 

I don't know why console devs think we console gamers aren't willing to make the compromise for a better frame rate... Stupid graphics hyperbole...

Edited by Pinkamena-Pills

Quoting Pinkamena-Pills: "Frame rate>>>>Graphics. [...] I'd take 720p or 900p if I can get a fairly consistent 60fps. [...] I doubt I'll enjoy playing a shooter at 30fps, so I won't even bother buying it."

 

It took them this long to get games running at 1080p. I'm just surprised the first two games I bought for the system are actually 1080p. Granted, I prefer 60fps as much as the next PC gamer, but if they can't muster 60fps without the framerate fluctuating wildly, then I'd take 30fps for single player.

 

Given the choice, I'd get the PC version, but there's no telling whether the port is total crap either.


 

 

Quoting Pinkamena-Pills: "[...] I'm sure their game will hit 1080p [...] but I doubt I'll enjoy playing a shooter at 30fps, so I won't even bother buying it."

You want to know some more details of The Order: 1886?

The game runs in 1920x800 because the PS4 cannot handle those graphics in a full 1080p frame without the frame rate dropping to, I'd guess, about 5-10FPS. The game was then locked to 30FPS because the PS4 still can't handle those graphics at 800p/60FPS. And to justify the letterboxed aspect ratio (compared to a standard 16:9 frame) and low frame rate, the devs claimed they were going for a "filmic" look, as if the letterboxing and frame rate were design choices.

 

 

They're not. They're limitations, and frankly, I'd rather Ready at Dawn compromise the graphics for a better frame rate and aspect ratio and drop the "filmic look" bullshit. Games are not movies; you're meant to interact with them, and if you want to go for a "cinematic" experience, then make a fucking movie, not a game.
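For what it's worth, here's a quick back-of-the-envelope check (a rough sketch; real rendering cost doesn't scale exactly with pixel count) of how much per-frame work that letterbox actually saves the GPU:

    # Rough pixel-count comparison: 1920x800 letterbox vs. full 1080p
    full_hd = 1920 * 1080     # 2,073,600 pixels
    letterbox = 1920 * 800    # 1,536,000 pixels
    saving = 1 - letterbox / full_hd
    print(f"Letterbox shades {letterbox:,} of {full_hd:,} pixels "
          f"(~{saving:.0%} fewer per frame)")
    # -> Letterbox shades 1,536,000 of 2,073,600 pixels (~26% fewer per frame)

So the black bars cut the pixel load by roughly a quarter, and it still wasn't enough to buy 60FPS.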


Quoting the post above: "[...] if they can't muster 60fps without the framerate fluctuating wildly, then I'd take 30fps for single player."

 

I'm sure they could give us a steady 60fps in certain games, but they won't, mainly because apparently every PS4/XB1 user expects $1,000-GPU graphics quality from a $400 system.

 

But yes, you're right: for certain games 30fps is fine, especially single-player RPGs and the like, but for shooters and racers 60fps should be the standard, because it really does enhance the experience  :(


Framerate? What even is that? I hear about it, but I never notice it. Apparently Mario Kart 8 runs at 59FPS, but it's a gorgeous game. I don't care about pointless specs like that.

Frames... at a rate. It's in the word.
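Joking aside, that really is all it is: frames drawn per second. The inverse is how long each frame sits on screen, which is the part you can actually feel (a quick illustration in Python):

    # Frame time is the inverse of frame rate
    for fps in (30, 45, 60):
        frame_time_ms = 1000 / fps
        print(f"{fps}fps -> each frame lingers for {frame_time_ms:.1f}ms")
    # 30fps -> 33.3ms, 45fps -> 22.2ms, 60fps -> 16.7ms

Halving that per-frame time is why 60fps feels so much more responsive than 30fps.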


Quoting the post above: "The game runs in 1920x800 because the PS4 cannot handle those graphics in a full 1080p frame [...] They're not [design choices]. They're limitations."

 

 

lol, so much for the almighty The Order: 1886 being the graphical flagship of the PS4. Perhaps Uncharted 4, lol?

 

And honestly, I think devs are SCARED to just voice the limitations of the consoles, especially the first-party devs. Again, I believe console gamers are expecting TOO much TOO soon... I mean, look at the reaction to ANY game that ISN'T 1080p/60fps on consoles...


Quoting the post above: "[...] so much for the almighty The Order: 1886 being the graphical flagship of the PS4 [...] console gamers are expecting TOO much TOO soon..."

I gotta say, even as a PC gamer, this is the most disappointing console generation ever. Nintendo's not doing so hot, and Sony and Microsoft both put out underpowered machines that keep trying to bite off more than they can chew. I miss the days of the GameCube, PS2 and Xbox...



Quoting the post above: "[...] this is the most disappointing console generation ever. [...] I miss the days of the GameCube, PS2 and Xbox..."

Agreed... I miss when a game was defined by gameplay, not graphics.

 

But alas, here we are in the 8th gen, where if a game isn't 1080p/60fps it's deemed "unworthy" of the console. But perhaps it's because you're right: they did release underpowered consoles (especially the Wii U, but Nintendo does what it wants :lol:).

 

It is a VERY disappointing generation so far indeed...

 

You know, I wonder how long the 8th gen will last... I'd guess 5 years before another PlayStation or Xbox. I highly doubt it'll last as long as the 7th gen  :huh:

Edited by Pinkamena-Pills

The current-generation consoles are still very early in their lifecycle, so I don't think it's completely fair to dismiss them practically right out of the gate. While they're a bit down on CPU power, the emphasis appears to be shifting towards compute tasks on the GPUs, in addition to greater parallelization. For $400, the specs really aren't half bad; my GPU alone cost that much. In my opinion, a very good design choice was to unify graphics and system memory into a single large pool.

 

This was a sore spot with the previous generation of consoles, so this, combined with better graphics power and matured development, will only mean better games down the road. I do agree that the emphasis right now appears to be on eye candy and 1080p'ing everything at the cost of framerate, but that seems to me very much a development decision. There's enough power to shift that balance towards smoother gameplay, and I'm certain many developers are aware of this.

 

That being said, the Wii U coming out with only slightly more power than what are now 8-year-old consoles... not the best decision, imo, but Nintendo was never about the best graphics anyway, and they seem to be doing alright even with a complete absence of third-party support. I can't really say that about Microsoft or Sony, though.


 

 

Quoting the post above: "In my opinion, a very good design choice was to unify graphics and system memory into a single large pool."

 

Just to refute that, Crytek (perhaps best known for Crysis) has come out and said that the 8GB of unified RAM in the PS4 and Xbox One will be a limiting factor in the consoles' lifespans. This is why most PCs keep separate pools for system and graphics RAM, so one can't eat into the other's memory. Also, keep in mind the PS4 reserves 3.5GB of that 8GB and two CPU cores for system tasks, while the Xbox One reserves 3GB of RAM for itself (although all eight cores are available for games).
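Taking those reservation figures at face value (they're the numbers cited above, not official specs), here's what's actually left over for games:

    # Game-usable RAM after OS reservations (figures as cited above)
    total_gb = 8
    reserved_gb = {"PS4": 3.5, "Xbox One": 3.0}
    for console, os_gb in reserved_gb.items():
        game_gb = total_gb - os_gb
        print(f"{console}: {game_gb}GB of {total_gb}GB usable by games "
              f"({game_gb / total_gb:.0%})")
    # PS4: 4.5GB (56%), Xbox One: 5.0GB (62%)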

 

As for 1080p/60FPS, I can see where gamers are coming from, as both have been standard on PC for around 7 years now. The fact that the PS4 and Xbox One can barely reach that resolution and maintain that frame rate is disappointing, at least from a PC gamer's perspective. Then again, console devs are more concerned with having "the best graphics" than with frame rates and resolutions, so...



Quoting the post above: "Frames... at a rate. It's in the word."

 

Well, yes, I'm sure that means something to people really into tech, but those words don't really mean anything to me.

 

And from what I've seen, it's just another area of needless nitpicking.

 

This image I saw on Twitter pretty much perfectly captures how much some people exaggerate the tiniest of things:

 

[attached image]

Edited by Envy


As a major console and PC gamer, I think developers should focus more on whether a game is fun and interesting. If it's 720p and 30fps I couldn't care less; just make sure it doesn't look like shit and keep it at 30fps or above.

Edited by Scoutmaster91

Quoting the post above: "Just to refute that, Crytek [...] said that the 8GB of unified RAM in the PS4 and Xbox One will be a limiting factor in the consoles' lifespans. [...]"

 

I definitely get where you're coming from. While there are downsides to a unified memory architecture, moving to a significantly larger quantity of memory, and to x86, compared to the paltry 256/256MB (PS3) or 512MB (360) and those disparate Power-architecture CPUs, is a significant improvement. The advantage of consoles is that since there's only one spec out there, devs can optimize heavily for it and extract the best possible performance. Not saying they all do this, but when it's done properly (e.g., Metal Gear Solid) the results can be impressive.

 

As for Crytek, I recall a point of discussion back when Crysis 2 was released. A DX11 "upgrade" pack was released in response to criticism that Crytek had "gone consolized," and the engine was basically tessellating the hell out of everything in sight. It didn't really improve the general appearance of the environment, but it absolutely killed framerate. It was even rendering full-resolution tessellated water below levels that didn't have any water to begin with. There's a difference between pushing the envelope and simply wasting power.
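To see why indiscriminate tessellation is so brutal, consider a simplified uniform-subdivision model where each tessellation level splits every triangle into four (real tessellators are adaptive, but the growth rate is the point):

    # Uniform subdivision: each level quadruples the triangle count
    base_triangles = 1000
    for level in range(6):
        tris = base_triangles * 4 ** level
        print(f"Level {level}: {tris:,} triangles")
    # Level 5 turns a 1,000-triangle mesh into 1,024,000 triangles,
    # visible or not -- exactly the "wasting power" problem.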

 

I do agree though, the current emphasis is on eye-candy over playability and that is a shame. I'm hoping more devs will take a more balanced approach in the future. In the meantime, I will continue to buy my $5 AAA titles on Steam. It's a great time to be a PC gamer.


For console gaming? I've never noticed any problems unless it's something like AC3, where serious lag tends to happen.

 

As for PC gaming? As long as it's not looking like total shit (e.g., Fallout 3 cranked down to low on everything), I'll gladly take a hit in the graphics department to get 48FPS+. I could run ePSXe at 1080p, for example, but I run it at 720p instead because it plays smoother.

 

Same with Fallout 3: I could play at 720p/30FPS, but I prefer playing at 800x600/50FPS.

 

Yeah, I need a new PC. 


Quoting the post above: "[...] when it's done properly (e.g., Metal Gear Solid) the results can be impressive. [...] There's a difference between pushing the envelope and simply wasting power."

The first point is absolutely correct. 8GB (or 4.5-5GB) is much better than 512MB or 256/256MB, and the fact that the PC, Xbox and PlayStation are all based on x86 should mean better ports across those three platforms. I would still have preferred to see the consoles use an 8GB DDR3 pool for system tasks with a separate 2GB GDDR5 pool for graphics, but the decision to use a unified pool was probably down to cost savings compared to two pools using two different RAM types.

 

Crytek made up for that with Crysis 3, and have you seen some of the things CryEngine 4 is powering, like Star Citizen and Kingdom Come: Deliverance? Granted, neither of those is being developed by Crytek, but the engine is theirs, and damn do those games deliver amazing graphics. If I must, though, I'll still tone down graphics settings for better performance. I had to in Battlefield 3 to get it running at 1080p on my machine (a laptop, haha), and I will in other games if I must.

 

It definitely is a great time to be a PC gamer, especially with the Steam Summer Sale soon.


Whats framer rate. I only know what graphics are. and I love to see an environment. Especially when I'm playing games like Grand Theft Auto. I want to see detailed graffiti in the hoods and Beautifully done sports cars in the 'snob zone'. Thats what I call the place with expensive looking houses

Link to comment
Share on other sites

Quoting the post above: "[...] have you seen some of the things CryEngine 4 is powering, like Star Citizen and Kingdom Come: Deliverance? [...] It definitely is a great time to be a PC gamer."

 

Not gonna lie, CRYENGINE looks absolutely amazing these days. Holding out till Star Citizen for an upgrade.

 

Since we're all on the topic of framerates in this thread, I wanted to point out that Black Ops II on the PS3 is a complete joke. I tried 4-player split screen with some friends (I know, no one plays CoD split screen, but STILL...).

 

I've seen higher framerates during PowerPoint presentations.

 

I still don't understand how/why QA would green light that.

 

Yes I do.

 

Activision. :eww:


To be completely honest: get the game to 30fps, keep the graphics from looking like utter crap, and focus more on fixing bugs and better writing. These two things are honestly among the smallest influences on an actually good game, to the point that focusing too much on either is somewhat pointless. There are many things that make a game, and there are much better things to focus on than graphics OR framerate.

 

That being said, I'd totally support a game running at 60 and looking worse over one running at 30 and looking better. But again, the impact on the game as a whole is fairly low in both cases.


[attached image]

 

The only people I hear going on about frame rate and graphics are Xbox fanboys.  Must be because they're butthurt about paying $500+ for a console that is behind in both departments.


Quoting Envy: "Well, yes, I'm sure that means something to people really into tech, but those words don't really mean anything to me. [...] it's just another area of needless nitpicking."

That particular image doesn't work because framerate ≠ graphics; it would have made more sense if it were 1080p vs. 1079p.

 

Mentioning The Order: 1886 and what they said about the "filmic" look reminded me of what Ubisoft said about the resolution and framerate of Watch_Dogs: "Resolution is a number, just like framerate is a number," a statement many people ridiculed.

 

Speaking of 8th-gen consoles already seemingly hitting their limits this early, many people believe this will be the last hardware-based console generation; the next one, many expect, will be a streaming service, which may mean console gamers can finally play games at 1080p/60FPS (or 4K, if supporting that resolution becomes more affordable by the time gen 9 arrives). To do that, though, ISPs will need to make some major improvements.
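Some napkin math on that last point (the compressed bitrate below is my own rough assumption, not any service's published spec):

    # Bandwidth needed to stream a 1080p60 game feed
    width, height, fps = 1920, 1080, 60
    bits_per_pixel = 24  # 8 bits per RGB channel, uncompressed
    raw_mbps = width * height * fps * bits_per_pixel / 1e6
    assumed_stream_mbps = 15  # rough H.264-class bitrate for a clean 1080p60 feed
    print(f"Uncompressed: ~{raw_mbps:,.0f} Mbps")
    print(f"Assumed compressed stream: ~{assumed_stream_mbps} Mbps "
          f"({raw_mbps / assumed_stream_mbps:,.0f}x reduction)")
    # Uncompressed: ~2,986 Mbps -- hence heavy compression, and still a
    # sustained double-digit-Mbps connection per player.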


Quoting the post above: "That particular image doesn't work because framerate ≠ graphics; it would have made more sense if it were 1080p vs. 1079p."

 

The point of that image went completely over your head.

