Originally Posted by cdoublejj
Assuming the same RPM HDDs: at heart they are both PCs and load files from their similar storage devices at similar speeds. They both have SATA HDDs and similar setups as far as handling the HDD goes; neither is slower or faster, both are bottlenecked by the HDD.
5400rpm for both as far as I can tell. (I'm not taking apart my 360's HDD to check this)
And yes, I know that... Yet the 360 still loads games from its HDD vastly slower than my PC does.
Originally Posted by mushroomboy
And you notice a decrease in loading times when you install it to the hard drive. Sounds about right. I guess, looking back, one would have to explain the inner workings to you. A console doesn't have the same amount of speed to populate certain aspects of a game. When you load a game, certain variables have to be calculated. So when you compare a 1.6GHz core CPU (it downclocked when HT was enabled - talking Xbox 360 here) to a 3.6GHz i7, well... you should know which one loads faster. If you can't figure out why, please leave this forum.
Shaders aren't generally larger; you use more complex shaders (it's an algorithm thing, buddy, not a space thing), or you use more of them. This all puts stress on the GPU itself and not the memory it has access to *gasp*. Larger textures might have been nice, I'll agree to that one.
RAM rarely limits the game experience on PC. As for your Crysis 3 vRAM "guess": I ran that game at 50FPS with two GTX 460s that each had 1GB of vRAM. They kept on par with the HD7950 I have now. What was that, at 1080p too? No way, that vRAM should have been saying "whoa, omg, too much". - Note: SLI doesn't add the two vRAM pools together, so I had a total of 1GB of vRAM to use.
I get the feeling you don't understand the flexibility of RAM when it has bandwidth. This was on a pair of cards where one was gimped on an x4 bus, so I'm questioning your knowledge here. RAM is rarely a limiting factor, especially on a console that's mainly going to be designed for 1080p - I really question it.
Except loading times in most games were one of the only things that didn't change from my Core 2 Duo E6700 to my Phenom II X3/X4 to my FX-4170 to my current i5 3570K... There are calculations done, and they might have made a difference years ago, but these days not so much.
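To show why upgrading the CPU barely moved my loading times, here's a rough back-of-envelope model. All the numbers below are made up purely for illustration (they're not benchmarks from either machine): same 5400rpm-class disk on both sides, and only the CPU-side throughput changes.

```python
# Toy serial model of game loading: read assets from disk, then do CPU work
# (decompression, level setup). Whichever term is bigger dominates.
def load_time(asset_mb, disk_mbps, cpu_work_mb, cpu_mbps):
    """Seconds to load: disk read time plus CPU processing time."""
    return asset_mb / disk_mbps + cpu_work_mb / cpu_mbps

# Hypothetical numbers: 1500MB of assets off a ~60MB/s 5400rpm drive.
slow_cpu = load_time(1500, 60, 1500, 100)   # weaker CPU: 25s disk + 15s CPU
fast_cpu = load_time(1500, 60, 1500, 400)   # 4x faster CPU: 25s disk + 3.75s CPU
print(slow_cpu, fast_cpu)                   # 40.0 28.75
```

Even with a CPU four times faster, the disk term doesn't shrink at all, which is why loading times track the drive far more than the processor once the CPU is "fast enough".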
I said exactly that about shaders; generally they (or the amount of RAM) are the limitation on the 360 and PS3, which is why we have games at 720p with low-resolution textures. Increase the RAM size and watch texture sizes increase (to a point) and loading times decrease, for a better experience overall - and that's before counting the extra features that come from developers not having to spend their time hammering a game down to fit into small amounts of RAM.
On PC, yes, where we're currently being held back in graphical quality by consoles. (Hence why, not too long ago, you could buy a GPU and actually have to upgrade ~2 years later, whereas now you could run an HD5870 and still get usable FPS in modern games - a minimum of 31fps at 1680x1050 on Ultra settings with FXAA. That's effectively a 4-year-old card maxing out a modern game.) But on console you are limited by RAM this generation, and next generation? It will happen again, mark my words.
As for my vRAM usage for Crysis 3 being a "guess", there's more than one source that shows it.
I also played Skyrim on my 1.2GB GTX 470, yet it manages to use more than 1.2GB on my HD7950 or HD7850, and guess what? As a result it's a lot smoother, even on the HD7850, which is usually barely any faster than the GTX 470 - simply because the GTX 470 has to constantly load more textures, etc. from the SSD/HDD than either of the other cards. (For reference, the GTX 470 would get decent FPS but stutter, especially if I was moving around quickly, which usually coincided with the HDD access light flashing and Resource Monitor showing activity on the HDD/SSD Skyrim was installed on.)
RAM is a limiting factor for the current-generation consoles, especially on the PS3, which tends to run lower AA and resolution than 360 titles, or be less smooth in general. They are PCs, but they haven't been upgraded since 2005/2006. If you ran a PC from 2006 (Core 2 Duo, 512MB to 1GB of RAM unless you're talking a then-high-end 2GB kit, 7900GTX/x1950Pro or something), guess which two things you'd get the most noticeable improvement from upgrading: the RAM and the GPU, which coincidentally are the limitations of the current-generation consoles. Do you really think it'll be different this time?
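That Skyrim stutter is exactly what you'd expect from texture streaming: once the set of textures a scene touches outgrows vRAM, the card evicts and re-fetches from disk constantly, and once it fits, fetches stop after warm-up. Here's a toy LRU cache model of that - sizes and texture counts are invented for illustration, not measured from Skyrim:

```python
from collections import OrderedDict

def disk_fetches(frames, working_set_mb, texture_mb, vram_mb):
    """Count disk fetches when each frame touches the whole texture set,
    with vRAM managed as a simple LRU cache of fixed-size textures."""
    cache = OrderedDict()                    # texture id -> present, LRU order
    capacity = vram_mb // texture_mb         # how many textures fit in vRAM
    textures = list(range(working_set_mb // texture_mb))
    fetches = 0
    for _ in range(frames):
        for tex in textures:                 # cyclic access, worst case for LRU
            if tex in cache:
                cache.move_to_end(tex)       # hit: just refresh recency
            else:
                fetches += 1                 # miss: stream from HDD/SSD (stutter)
                if len(cache) >= capacity:
                    cache.popitem(last=False)
                cache[tex] = True
    return fetches

print(disk_fetches(100, 1600, 16, 1280))    # set > vRAM: misses every frame -> 10000
print(disk_fetches(100, 1600, 16, 2048))    # set fits: warm-up misses only -> 100
```

A 1600MB working set against 1280MB of vRAM misses on every single texture every frame, while the same set against 2048MB misses only once per texture - the hard cliff between "smooth" and "HDD light flashing".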
Originally Posted by Avonosac
I'm giving up on you, Brutuz; you're so convinced you're right that you aren't listening to any logic. You are constantly confusing RAM and vRAM, and you don't have the programming knowledge that would qualify you to discuss any kind of optimization.
You're ignoring the fact that it's shared in a console, and you're going "It runs fine today, so it'll run fine in 5+ years!". Remember, consoles aren't just a 2-3 year build like PCs tend to be; the current-generation consoles are 8 years old at this point. Try not upgrading your PC for 8 years and see how well 5.5GB of total vRAM plus system RAM does for you. Same thing as last generation: a PC from when the 360 and PS3 launched would not run modern software as well as a machine with much more RAM. I fail to see how you don't get this.
Originally Posted by lacrossewacker
However, people must keep in mind that we're talking about an AMD 7790-7850 AT MOST, and a CPU that will be competing against Intel's mobile CPUs, Bay Trail!!! Those GPUs aren't going to be pushing super-high-resolution textures. They sure as hell won't be pushing 4x MSAA. They'll do everything they can to stick to FXAA, to preserve their precious resources for heavier motion blur and brighter sun glare...
The PS4 sits between the HD7850 and HD7870 in hardware, with 1152 shaders; it also has more GFLOP/s than the HD7850. And you're severely underestimating the HD7850 as a GPU - it's actually quite fast, especially for its power usage and price, and it maxes out a lot of modern games with 4x AA, and that's on PC. I'd wager the PS4's GPU could take on my HD7950 without being embarrassed, after optimization. (And remember, consoles tend to be laggier experiences in general anyway... I'd wager fewer console gamers care about a 25-30fps average than PC gamers do.)
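The "between HD7850 and HD7870" claim checks out on paper if you use GCN's usual peak-throughput formula, shaders x clock x 2 (one fused multiply-add per shader per cycle). The clocks below are the commonly cited reference specs - treat them as assumptions, and remember paper FLOPs ignore memory bandwidth and optimization:

```python
# Peak single-precision throughput for a GCN-style GPU:
# shaders * clock(MHz) * 2 ops per FMA, converted to GFLOP/s.
def peak_gflops(shaders, mhz):
    return shaders * mhz * 2 / 1000

ps4    = peak_gflops(1152, 800)    # assumed 800MHz  -> 1843.2 GFLOP/s
hd7850 = peak_gflops(1024, 860)    # assumed 860MHz  -> 1761.28 GFLOP/s
hd7870 = peak_gflops(1280, 1000)   # assumed 1000MHz -> 2560.0 GFLOP/s
print(ps4, hd7850, hd7870)
```

So on these numbers the PS4 lands just above the HD7850 and well below the HD7870, which matches the point above.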
As for CPUs... the games that actually show a difference between CPUs on PC tend to be one of three things: MMORPGs, RTSes, or unoptimized. Guess which of those the consoles get? None, bar Diablo 3, which runs fine on a Core 2 Duo... If you think that 6 Jaguar cores are worse than 2 Conroe cores then I'm just not even going to bother.
Originally Posted by CynicalUnicorn
I'll say it again: the PS4 has ten times the RAM of the PS3 and will only be doing 1080p tops. Guess what? You don't need that much RAM for that! My 2GB 7850 can handle that just fine and has 80-100% overhead too. System memory? I only see 2-3GB used, usually under 4GB total (including OS and background programs), when gaming at the highest smooth settings I can get, including 4x or 8x MSAA (depending on the game and whether it makes anything look different at all) at 60Hz - and TVs are stuck at 30, I think. I'm not sure how higher-bandwidth, higher-latency RAM will affect system memory though, and that's what will be the important thing to see (same with the XB1's low-bandwidth, low-latency RAM for graphics).
The PS3 has 16x the RAM of the PS2 and is still limited by its RAM; 10x isn't really that much when you're talking 8 years. For reference, 8 years ago I had 1GB of RAM and that was considered a decent amount at the time; most OEM machines were coming with 512MB or 256MB. (In fact, the machine I bought had 256MB of RAM until I upgraded it.)
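For the "16x" point, here's the quick arithmetic - memory totals are the commonly cited hardware figures (the "10x" above presumably comes from comparing the PS4's ~5.5GB game-usable portion rather than the full 8GB), and the framebuffer math shows why raw 1080p output was never the part that eats the RAM:

```python
# Commonly cited console main-memory totals (treat as assumptions):
ps2_mb, ps3_mb, ps4_mb = 32, 512, 8192   # PS3 = 256MB XDR + 256MB GDDR3
print(ps3_mb / ps2_mb, ps4_mb / ps3_mb)  # 16.0 16.0 - same jump both generations

# A single 1080p framebuffer at 32-bit color is tiny next to those totals:
fb_mb = 1920 * 1080 * 4 / 2**20
print(round(fb_mb, 1))                   # ~7.9 MB
```

So the resolution target barely touches the budget; it's textures, geometry, and game state that grow to fill whatever RAM a generation ships with.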
Originally Posted by BinaryDemon
Seriously, consoles aren't designed to compete with PCs. These next-gen consoles are designed for 1080p @ 60fps, and 5.5GB shared between CPU and GPU is adequate for that task. Any visual compromises will be far less obvious than the ones current-gen hardware already has to make.
You're going off of today's games... not what will happen in the future.
Edited by Brutuz - 7/30/13 at 3:16pm