Originally Posted by Avonosac
You're using terribly optimized games on mid-to-high-end PCs to justify your argument, and your examples are still barely using 2GB as the process in Windows. The problem is that if you can't quickly do whatever you want within 2GB, you aren't going to be ABLE to do it with more on six low-clock APU cores, I don't care HOW optimized they make the code. Sure, people who love overkill were like "z0mg, we have 8 gigs in the new PlayStation", but the architecture can't capitalize on all of it the way people were thinking it could. The reason the reserve cache is so big is so that the background processes won't ever be paged to disk, to get maximum benefit from the two dedicated background cores (minimize I/O busy-wait).
Engines are not being optimized to the same level anymore, and this is a problem because the industry is getting lazy and just throwing more memory at things that could be done MUCH faster if they took the time to tweak the engine. You believe just adding more memory will adequately solve the issue of bad code; I hate to be so direct, but you are wrong. Good code, even next-gen, crazy-good engine code, should be able to fit within < 2GB, and if it can't they are simply doing it wrong.
You're using games made today to try to gauge the performance of games four years from now...I've provided plenty of examples of why that just doesn't work.
Really? You've used Jaguar and know we'll hit a performance wall before you can make use of more than 2GB of RAM? By that logic, how much RAM would you say a 1.9GHz single-core Athlon XP can use? I'd say one Jaguar core is probably roughly as fast (going by Atom being roughly equal to a P4 in IPC, Bobcat being ~10-15% above that, and Jaguar being 10-15% or so above that), and the PS4 gets to have six of them while I have one...Hint: your 2GB guess is very, very low, and you're severely underestimating what a Jaguar core can likely do...Remember, if you don't use the SPEs, or you need the kinds of maths they're not good at, the PS3 effectively has a single core versus the 360's three. And besides, the typical CPU usage in games is vastly overrated.
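(If you want the back-of-envelope version of that, here's a rough sketch in Python; the IPC multipliers are just the guesses above and 1.6GHz is the commonly reported PS4 clock, so treat the output as an illustration of the argument, not a benchmark.)
[code]
# Back-of-envelope sketch of the core-count argument above.
# IPC multipliers are the rough guesses from this post (Atom ~ P4 per clock,
# Bobcat ~10-15% above, Jaguar ~10-15% above that), not benchmark data.

P4_IPC = 1.00                     # baseline: work done per clock
BOBCAT_IPC = P4_IPC * 1.125       # middle of the ~10-15% range
JAGUAR_IPC = BOBCAT_IPC * 1.125   # middle of the ~10-15% range again

JAGUAR_CLOCK_GHZ = 1.6            # commonly reported PS4 CPU clock
GAME_CORES = 6                    # cores left for games after the OS reservation

per_core = JAGUAR_IPC * JAGUAR_CLOCK_GHZ   # one core, in "baseline GHz"
all_cores = per_core * GAME_CORES          # six cores for the game

print(f"one Jaguar core : ~{per_core:.2f} baseline-GHz")
print(f"six game cores  : ~{all_cores:.2f} baseline-GHz")
[/code]
Crude, obviously, but it's why counting cores matters: even if one Jaguar core only roughly matches that old Athlon XP, six of them can keep a lot more RAM busy than one ever could.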
You can always find ways to improve the user's experience by adding more RAM, nearly regardless of CPU speed. Take Skyrim on the consoles, for example: its loading times are abysmal, and quite often having too much user data causes it to start having FPS issues and stuttering...Any idea why? Here's a hint: it's random, and they only have 512MB of it.
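(To make that concrete, here's a toy Python sketch of what a too-small memory budget does; every name and number is made up purely for illustration, but the evict-and-reload pattern is the point.)
[code]
from collections import OrderedDict

# Toy model of an asset cache under a tight memory budget. Real engines are
# far more sophisticated, but the evict-and-reload pattern is the same idea.

DISK_READ_MS = 20.0    # assumed cost of pulling an asset back off the disc
RAM_HIT_MS = 0.01      # assumed cost of touching an asset already in memory

class AssetCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.assets = OrderedDict()   # asset name -> size in MB, in LRU order
        self.wait_ms = 0.0            # total time spent waiting on the disc

    def request(self, name, size_mb):
        if name in self.assets:            # resident: effectively free
            self.assets.move_to_end(name)
            self.wait_ms += RAM_HIT_MS
            return
        self.wait_ms += DISK_READ_MS       # not resident: go back to the disc
        self.assets[name] = size_mb
        while sum(self.assets.values()) > self.budget_mb:
            self.assets.popitem(last=False)    # evict least recently used

# Same access pattern (40 assets of 16MB each, revisited over and over),
# two different memory budgets:
pattern = [f"texture_{i % 40}" for i in range(400)]
for budget in (256, 1024):
    cache = AssetCache(budget)
    for name in pattern:
        cache.request(name, 16)
    print(f"{budget}MB budget -> ~{cache.wait_ms:.0f}ms spent waiting on the disc")
[/code]
Run it and the small budget spends roughly ten times as long blocked on the disc for the exact same access pattern; that's your loading screens and your stutter, and no amount of CPU speed fixes it.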
And yes, that is true...It's also going to happen regardless of the specifications; the more hardware we give developers, the more they'll use it, both with optimizations and without. (i.e. sure, we'd get a bunch of games using 7GB unnecessarily if Sony made the PS4 OS use only 1GB, but by the time 5.5GB started to become a limitation you'd still be seeing improvements and benefits from the extra RAM...And that comes back to what I was saying before: do you want developers to spend more time getting the game to actually work, or more time on the gameplay and graphics to make them better?)
Originally Posted by mushroomboy
The thing is, we are talking about a closed system where optimization can be taken even further. If I were to recompile the Windows kernel strictly for AMD hardware, how much smaller would it get? If we were to do the same with a game, how much smaller would the memory footprint be? That's my point: generic PC games rarely use 2GB of system resources, and even then, as you point out, they suffer from poor coding.
Have you ever seen what a custom-built Gentoo/Slackware system can do? Or better yet, LFS? When you can pass optimizations that aren't normally passed in generic binary packages, your memory footprints become crazy small.
Anyways, this "wall" won't get hit that fast and I doubt it's going to limit PC games severely. We've got new tricks now to keep memory footprints down. Plenty to tweak and work on so "large" worlds can still happen with enormous amounts of detail.
Last bit: I DID say vanilla games. VANILLA. Let me re-phrase that: vanilla, meaning no mods. Should I make that large too? To go further into that, most mods aren't done in code but in scripts. They tend to take up more space due to the inability to compile them (that defeats the purpose of a script, now doesn't it?).
My point exactly. I am saying we have a lot to do before we actually hit a RAM limit. My point in using AAA games was just that: they are already barely using 2GB vanilla (there it is again for those who didn't catch it; not Avon, the others). If we optimized those games, many would be using 500MB to 1GB of system resources. To go even further, if we had them strictly optimized for a specific graphics solution you would see VRAM usage go even lower. Anyways....
You're still only looking at today's games and saying "it's enough". Once again: back in 2005, or even 2008 at a pinch, 512MB was enough for the consoles; not so much today. We'd still be at 30fps 720p because their GPUs aren't really fast enough to push the pixels for 1080p at an acceptable framerate (apart from a few games that manage it, such as GT5), but stuff like texture size, loading times, AI, etc., as well as plenty of small details, could be improved if the current-gen consoles had just come with 768MB or 1GB of RAM instead of 512MB...That's why I mentioned Oblivion with mods. (Note that it's a game from 2006 using up 2GB of RAM.)
Oblivion was already using well over 1.3GB completely vanilla for me too, and that's from a time when 1GB of total system RAM was common. I went from 1GB to 2GB while I was playing through it in 2007 and noticed a big difference in performance and loading times, simply because it could preload more textures and data into RAM. You won't see that so much right now, but if Bethesda ended up doing two TES games on the next-gen consoles, you can bet the second one (like Skyrim) would run into bugs and limitations on the console versions, mods or no mods, because of the RAM count. (Where do you think that glitch that made PS3 saves unusable once they got too large came from? There was a clear bug, obviously, but if the PS3 had more RAM it would have taken longer for anyone to notice the effect.)
Your logic doesn't work. GTA San Andreas runs happily on systems with 512MB of system RAM and ran on the PS2 with 32MB, yet even though we got a 16x increase in memory capacity we're still at a RAM limitation; can you really say for sure we won't hit one again? Can you really say developers couldn't make use of it simply because they don't have a Core i5 or something inside these consoles?
Then again, why am I even bothering to argue...After all it's been conclusively proven that 640kB is enough for anyone by now, right?
(In fact, in response to that quote Bill Gates actually said "In fact, every couple of years the amount of memory address space needed to run whatever software is mainstream at the time just about doubles. This is well-known.", which just further proves my argument...And yes, you could hammer the games down to fit in the RAM size, but my point is that time is often better spent elsewhere. And besides, what GrizzleBoy said is right: people got annoyed over the Xbox One only having 5GB for developers, yet the PS4 gets praised for having 5.5GB?)
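(And since the thread likes numbers, here's what that doubling claim looks like when you run it forward in Python; the start date and size are just the 512MB consoles we've been talking about, so it's a rough projection, not data.)
[code]
# Quick sanity check of the "doubles every couple of years" claim, starting
# from the 512MB consoles of 2005. Dates and the starting size are just the
# examples from this thread, nothing rigorous.

start_year, start_mb = 2005, 512
doubling_period_years = 2

for year in range(start_year, 2014):
    projected_mb = start_mb * 2 ** ((year - start_year) / doubling_period_years)
    print(f"{year}: ~{projected_mb:,.0f}MB")

# By 2013 that lands around 8GB, which is right in the neighbourhood of the
# 5-5.5GB these new consoles actually hand to games.
[/code]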