Originally Posted by
morsdraconis
You've clearly never played Minecraft with only 512 MB or 1 GB of RAM allocated. Frames drop whenever the game has to stream new data from disk instead of pulling it from RAM. Hell, Minecraft still has the occasional frame dip even with the 10 GB of RAM I have allocated to it. More RAM is a HUGE deal. Loading new objects means loading textures, animation frames, etc. into RAM so that the CPU and video card can process the movement, physics, and other changes to the objects being loaded. Without enough RAM, you get frame hiccups while the processor loads those new items from scratch instead of pulling them from memory like normal. So without the proper amount of RAM (or optimization, in the sense of cutting down graphical quality in one area or another), you get fewer frames per second on the screen and a chunkier-looking experience.
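(For reference, the "allocation" here is just the Java heap cap you hand to the JVM in the launcher's arguments. A rough sketch of what that looks like, with the jar name and sizes as placeholders since the real launcher builds a much longer command:

java -Xmx10G -Xms2G -jar minecraft.jar

-Xmx caps the maximum heap and -Xms sets the starting size. With a small cap, the game can keep fewer chunks and assets cached, so more of them get reloaded from disk mid-play.)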
It's amazing to me that developers were able to do what they've done on this generation of consoles with only 256 MB of system RAM (PS3, plus 256 MB of dedicated video memory) or 512 MB of unified RAM (360). The workarounds and the visual depth some games had to sacrifice just to hold 30 FPS are amazing to look at (watch a video comparing GTA IV on a high-end PC to the 360 version, for instance, to see the STARK differences between the two). Sadly, most people won't even notice what developers are doing with the new hardware because, graphically, we've just about hit the limit of what the human eye can distinguish. This generation is going to be about the number of things on the screen at once, since they've hit the wall on how graphically impressive they can make things look.
Also, eww to playing an FPS at only 40 FPS. I feel bad for ya.