The Ponyville Critic 304 Posted November 25, 2013
I can run Dead Space 3 at 280 FPS, and mine is mid-range: Intel Core i3-2120 @ 3.3 GHz, MSI Twin Frozr III GTX 660 2GB GDDR5, 8GB 1333MHz RAM.
Grumpy Enchantress 584 Posted November 25, 2013
I don't play a lot of games, but I'm so confused when I hear people go on about high fps when the standard in films is still 24.
Radiance64 7,046 Posted November 25, 2013
Eh, I like seeing 60 fps in games, but if a game is 30 or has frame drops and whatnot, as long as the game itself is still fun I'll be fine with it, unless the frame drops are really bad or something.
Golurk 467 Posted November 25, 2013
...um... I feel really out of place here. I couldn't even tell the difference between standard def and HD until a few months ago... Regardless, I honestly don't tend to notice fluidity and graphics when gaming. I'm more concerned about the gameplay and storyline than anything else. As long as those are good, I'm happy. ...That said, if we can get to the 10 trillionth digit of pi, I'm pretty sure we can tack on a few more frames.
SPLinux 374 Posted November 26, 2013 (edited)
In my opinion it's not that 30 fps is bad, it's just outdated. There are plenty of 30 fps games that feel fine as long as the frame rate doesn't drop too far below that. With how supposedly powerful these "next-gen" consoles are, it's pretty disappointing that 60 fps isn't being met as often as it should be. It's also kind of funny that Nintendo's Wii U is hitting the 60 fps standard just fine with its recent releases, some at 1080p as well, when the PS4 and Xbox One are supposed to be so much more powerful. Some people argue that it's early in the generation and it will take time to optimize for the new hardware, but aren't the PS4 and XB1 using x86 architecture? x86 is the same PC architecture that's been around for over 20 years, so as long as a developer has shipped a decently running PC game before, these consoles shouldn't be difficult to work with at all. That tells me the hardware in these systems isn't as impressive as they make it sound, and it's going to be outdated incredibly fast by PCs at the same price point. Many PC builders would say that's already the case, and it will only become more so over the years. Edited November 26, 2013 by ~Harmonic SPLinux~
Hell Patrol 41 Posted November 27, 2013 Author
I don't play a lot of games, but I'm so confused when I hear people go on about high fps when the standard in films is still 24.
Because you're not interacting with films. You're not actively influencing what goes on in a film. Higher fps means a shorter delay between when you tell the game to do something and when you see the result on screen. It also means things are more fluid; a high fps is ABSOLUTELY VITAL for, say, a rhythm game, where you need to see your input quickly and you need to see what's going on extremely well to react. 24 fps with motion blur is fine for a film (in my opinion it isn't, but people are stupid) in that it simulates motion well enough to not appear particularly choppy. But in a game, 24 fps is a nightmare; 30 is the bare minimum, and even then, if you play a game at 30 and then the same game at 60, you will notice the differences. It's not just visual fluidity, it's how the game responds to your input. It's something you feel.
Grumpy Enchantress 584 Posted November 27, 2013
Because you're not interacting with films. You're not actively influencing what goes on in a film. Higher fps means a shorter delay between when you tell the game to do something and when you see the result on screen. It also means things are more fluid; a high fps is ABSOLUTELY VITAL for, say, a rhythm game, where you need to see your input quickly and you need to see what's going on extremely well to react. 24 fps with motion blur is fine for a film (in my opinion it isn't, but people are stupid) in that it simulates motion well enough to not appear particularly choppy. But in a game, 24 fps is a nightmare; 30 is the bare minimum, and even then, if you play a game at 30 and then the same game at 60, you will notice the differences. It's not just visual fluidity, it's how the game responds to your input. It's something you feel.
I think you're making a big deal out of nothing.
Hell Patrol 41 Posted November 27, 2013 Author (edited)
I think you're making a big deal out of nothing.
I'm not. If the game is taking a tenth of a second to display your input, you NOTICE it. It hurts the competitive potential of any game, and it hurts the capability of the player overall. Edited November 27, 2013 by Hell Patrol
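To put rough numbers on that "tenth of a second": a quick back-of-the-envelope sketch (my own illustration, not from the thread) of how long each frame lasts at common frame rates. The frame interval is the smallest granularity at which a game can show you anything, so it's a floor on input-to-display delay; real pipelines add controller, engine, and display lag on top.

```python
def frame_time_ms(fps):
    """Milliseconds between successive frames at a given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 24 fps -> 41.7 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms
```

At 30 fps a single frame already eats a third of that tenth of a second; doubling to 60 fps halves it.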
Zygen 6,062 Posted November 27, 2013
Well, I dunno. Sometimes I can tell the difference, sometimes I can't, but frankly I don't play as much as some people, so I don't know. I do think it's nice to have a smooth 60 FPS, though. But if you told me to pick out which games run at 30 FPS on a console and which don't, honestly, I wouldn't be able to tell you.
Grumpy Enchantress 584 Posted November 27, 2013
I'm not, if the game is taking a tenth of a second to display your input, you NOTICE it.
I don't, personally. It seems like a minor issue, in my opinion.
Lateon 588 Posted November 27, 2013
I agree, to a degree (oh, me). I used to play WoW on a laptop that, on the best of days, got 20 fps. Sometimes it would "fps out," as I named it, and drop to around 1 frame every 1.5 seconds in a 25-man raid. That, as I'm sure you're aware, is unplayable. 20 fps is, these days, horrifyingly bad. But I dealt with it, played through it, and sucked. Then I switched to an old beater of a computer that got a steady 15 fps and began to destroy the meters again. However, that was only possible because WoW has a function where you can queue your actions; lower fps matters less because of it. A couple of years ago, I switched to a higher-end laptop that got ~80 fps in 10-man dungeons. Naturally, making myself look at 15 fps again makes my eyes bleed. Even 30 fps annoys me these days, and therein lies the point of this entire spiel (personally, I'm comfortable at around 50 fps). Not only is 30 fps harder to look at (I'm looking at you, Halo 4... I'd rather not, but I am), but I can indeed rationalize why playing a first-person shooter would be harder at 30 fps than at 60 fps. Since queueing your actions isn't a thing in shooters, it takes twice as long to do fucking anything. What if my input landed in between frame refreshes, and thus isn't registered? Guess what - I'm dead!
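That "input landed in between frame refreshes" worry can be modeled with a toy simulation (a sketch with made-up assumptions: input is only sampled at each frame boundary, and the result of an input appears one frame after it is sampled). On average an input waits half a frame just to be seen, then one more frame to show up on screen:

```python
import random

def avg_input_to_display_ms(fps, trials=100_000, seed=1):
    """Average input-to-display delay under the toy model, in ms."""
    frame = 1000.0 / fps
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        arrival = rng.uniform(0, frame)  # input arrives mid-frame
        wait = frame - arrival           # wait until the next input sample
        total += wait + frame            # plus one frame to draw the result
    return total / trials

print(f"30 fps: ~{avg_input_to_display_ms(30):.0f} ms")
print(f"60 fps: ~{avg_input_to_display_ms(60):.0f} ms")
```

Under these assumptions the average works out to 1.5 frame intervals: about 50 ms at 30 fps versus about 25 ms at 60 fps, which matches the "twice as long to do anything" complaint above.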
GrimCW 656 Posted November 27, 2013 (edited)
Meh, I still find anything above 35 just fine personally (45+ is perfect). Most games are still fully playable and look just fine at 30+, though generally those would need to be non-shooter games. Spunkgargleweewee games need the extra speed at times just to see enemies coming into view and get the right bead on them; the extra clarity when moving helps a lot. While I still maintain 60 is excessive, a STEADY framerate is necessary, and most games can't even pull that off (hence they lock down to 30). Another argument I've seen for 30 fps is the ghost/fade look it has when moving: it simulates the motion blur that many games are trying to put in these days, and since real motion blur costs massive resources, faking it is easier. That one side effect, though, is what screws with the game when it's fast-paced. Edited November 27, 2013 by GrimCW
The Pulse 210 Posted November 27, 2013
Doesn't 60fps mean that videos will be twice as large, because it's gonna have double the frames? Because there's no point in recording a 60fps game with a 30fps recorder.
No, you're thinking of what happens if they made it into a movie. A video file has a set number of frames, say 1000. If you play it back at 30 fps, it runs for about 33 seconds. Play the same 1000 frames at 60 fps and it lasts about 16 seconds and looks sped up. So to accommodate 60 fps, they double the frame count to 2000, keeping the same running time and content but with smoother motion. Gaming is different: fps there means how many frames the computer/console renders per second, and the frame count isn't set, because the machine renders and outputs frames of what you're seeing as fast as it can. The faster your computer can render and output frames, the higher the fps you see.
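The playback arithmetic in that explanation can be checked in a few lines (a sketch; the 1000- and 2000-frame figures are just the post's example numbers):

```python
def playback_seconds(total_frames, fps):
    """Running time of a fixed-frame-count video at a given playback rate."""
    return total_frames / fps

print(playback_seconds(1000, 30))  # ~33.3 s at 30 fps
print(playback_seconds(1000, 60))  # ~16.7 s: same frames, looks sped up
print(playback_seconds(2000, 60))  # ~33.3 s: doubled frames restore the length
```

Doubling both the frame count and the playback rate keeps the running time unchanged, which is why a true 60 fps recording of the same footage is roughly twice the data.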