Originally Posted by mushroomboy
And it ran with as many fps as my Phenom II X4 could manage; go watch YouTube vids. People get 50 fps with SLI GTX 460s. Makes you look silly.
And guess what, I never said current-gen consoles had enough RAM. When released they had equal or less than the average mid-range desktop. I always thought the current gen was lacking. You should read; I've not argued with you on that.
However, this time they are above the average desktop; even high-end gaming rigs don't need more than 8 GB of RAM. 4 GB serves us well, very well.
And to further the point, why do you care about console graphics? This is a PC forum; so what if consoles look worse? The important thing this gen is the similar hardware.
So I don't see your point.
I didn't say that you said that; I was pointing out that nearly every console generation has ended up limited by the amount of RAM in at least a few games. (Open-world ones especially... remember the texture pop-in in some later PS2 games?)
I care about console graphics because, at the end of the day, console graphics dictate PC graphics for most games (apart from slightly higher texture resolution, more AA/AF, and higher rendering resolution in general) until you start reaching the twilight years of a console's lifespan. My point is that if the consoles had more RAM, we'd end up seeing better games across the board.
Originally Posted by CynicalUnicorn
At what resolution and settings are you using your 7850? And your 7950? From the sounds of it, the PS4 and XB1 have graphics processing capabilities roughly between a 7850 and a 7870, but the APU allows lower latency, so that bumps performance a bit. If you're expecting 1440p, 3x 1080p, or 4K on either console, you're crazy. 1080p is the maximum, and my 2 GB card handles it fine. I could probably get away with 1 GB, but that restricts some ability for mods, not normal performance.
1080p, 4x or 2x AA usually. Skyrim runs at 4x AA on my HD 7950 but only 2x on my HD 7850. Once again, you're using today's games to compare: a 256 MB card was fine in 2005 but isn't now. Texture sizes increase, and while it might not happen overnight, they will in the future.
Besides, if everyone actually read my point: the amount of RAM will work and will probably be fine, even if a bit of a bottleneck in some areas, much like this generation. However, having more RAM means we'd get better games in general for a cheap upgrade.
Originally Posted by Carniflex
Let me give an arbitrary example, just sucked out of my pen, to make a point about increased RAM being a good thing. Let's say you have a moderate fps game. Your graphics processing takes, for example, about 1.5 GB of memory with pretty large textures and a level or two of AA, and your game engine is happy as a hippo doing its AI and asset juggling within 2 GB of RAM. That leaves you 2 GB to spare, assuming you don't half-ass your optimizations and just get sloppy. As you have some spare resource, you utilize it, for example, by adding a pressure simulation into the game world for two assets, "Liquid A" and "Gas B". You do it in single precision and throw the calculation at the GPU; it's not CPU/GPU heavy as long as not a lot is happening, but it is somewhat memory intensive since you have to keep track of it, so say keeping track of pressure for those two things consumes the 2 GB you had to spare. Now let's say you have 1 GB more: all of a sudden you can add one more item to the list, say "Liquid B", and do something nice with it. Have even more RAM? No problem, let's keep track of temperature too; even more, even greater: flash point, pollutants, you name it. And this is just with simple scalar fields. You can optimize physics to take less memory, but it usually gets computationally heavier and more complex to set up; for example, instead of a fixed grid you use an adaptive grid, or even some mesh-free method like, say, finite volumes in the correct formulation.
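Carniflex's back-of-the-envelope numbers are easy to sanity-check: on a fixed grid, each extra scalar field (pressure, temperature, flash point...) costs the same fixed chunk of memory. A minimal sketch, where the grid size is purely an illustrative assumption and not a figure from the post:

```python
def field_bytes(nx, ny, nz, bytes_per_value=4):
    """Memory for one scalar field stored at single precision
    (4 bytes per value) on a fixed nx * ny * nz grid."""
    return nx * ny * nz * bytes_per_value

# Hypothetical 512^3 grid: every additional tracked quantity
# adds the same 0.5 GiB again.
per_field = field_bytes(512, 512, 512)
print(per_field / 2**30)  # 0.5 (GiB per scalar field)
```

This is why spare RAM translates so directly into richer simulation: adding "Liquid B" or a temperature field is just one more grid-sized allocation, with no extra algorithmic complexity.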
Want a great example of it? Skyrim on a GTX 470 with texture mods.
It fills up the 1.2 GB of vRAM quickly, and if you turn around fast you have to wait for new textures to load. Not so bad on an SSD (there's a small pause and stutter), but on an HDD it virtually makes the game unplayable IMO.
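It doesn't take many high-resolution mod textures to exhaust a 1.2 GB buffer. A rough sketch of the arithmetic, assuming uncompressed RGBA textures with a full mip chain (the 4096x4096 size is a hypothetical mod texture, and real games often use block compression, which shrinks this considerably):

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmapped=True):
    """Approximate VRAM for one uncompressed RGBA texture.
    A full mip chain adds roughly one third on top of the base level."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmapped else base

# One hypothetical 4096x4096 mod texture:
print(texture_bytes(4096, 4096) / 2**20)  # ~85.3 MiB each
```

At ~85 MiB apiece, a little over a dozen such textures in view would fill a GTX 470's 1.2 GB, which matches the turn-around-and-stutter behaviour described above.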
Originally Posted by Avonosac
I have explained five times why you are wrong: RAM != VRAM. And adding more will only EVER help when you have the computing power to utilize it. The graphics core of the APU is not strong enough to use much more than 3 GB of VRAM. The APU cores, even IF the IPC claims from AMD are 100% true, will not be able to utilize much more than 2 GB of memory, even if the engine is poorly written.
You continue to confuse VRAM and RAM requirements, and which portion of the hardware would utilize them respectively. Please stop posting until you have educated yourself, because you are simply wrong.
When did I say RAM = vRAM? You're severely underestimating how much computing power you need to fill up 3 GB of vRAM or 2 GB of system RAM. My Core 2 Duo E6700 can easily use over 4 GB of RAM and not be fully loaded, and it's also slower than a 6-core Jaguar APU. (Cinebench 11.5 for a 4-core Jaguar @ 1.5 is 1.5, and Cinebench 11.5 for the E6750, which has faster memory access than my E6700, is 1.44.)
Do you have any idea how computers even work? My server, despite being slower than the CPU in the PS4 and Xbox One, is sitting at 99.7% idle with 1.5 GB of RAM used... yet the faster CPU in those consoles supposedly can't use "much more than 2 GB, even if the engine is poorly written". (For reference, the main RAM consumer is actually a Minecraft server, something that runs behind every single-player Minecraft game and is poorly written.)
The best bit is that's not even close to the full usage that chip has seen: when I used it in my main PC with 8 GB of RAM, it easily went over 2 GB while gaming, and if I was doing work (Chrome, Photoshop, and Word among other programs open at once, usually) it'd easily go over 4 GB. For reference, each additional player tends to add about 10% CPU usage (20% in Linux terms, where a single maxed-out core is 100% and two cores are 200%) and a lot more RAM usage; Minecraft is a highly inefficient game, too. So, do you still think that 6 Jaguar APU cores can't use over 2 GB without lagging? (Never mind the fact that on PC you're hitting a GPU bottleneck most of the time, let alone on console, where you can optimize heavily for the CPU.)
Now, for vRAM? You're thinking of those 4 GB GT 640s and the like; those cards would never be able to reach their full capacity short of a CUDA app specifically designed to use it all. But a HD 7850? It can sit at 50-60 fps at 1080p and fill its 2 GB buffer when overclocked. (And it would still likely be slower than the PS4's GPU, considering the extra latency, fewer shaders, etc.)
Considering console games tend to start hitting 30 fps fairly quickly in their lifespan, and given the other reasons the PS4's GPU would be faster, I'd wager 3 GB is a bit of a low guess, especially as open-world games age. You're still going off what games today use; if I'd done that at the start of the 360's lifetime and looked at the PC versions of San Andreas, Prince of Persia, Half-Life 2, etc., I'd have concluded that 512 MB was enough, when it clearly wasn't even by the time the consoles reached the four-year mark, let alone the eight years they're at now. Extra RAM only becomes more important as console lifespans increase; how long do you think this generation will last? Five years like most did? Eight like the last generation? Ten? More? It's very likely that by the time they're 5+ years old, a lot of games will be 64-bit and eat up 4 GB of system RAM, and TESVI will eat up 6 GB+ of vRAM. Five years is an extremely long time in computing, and you're trying to say what works now will work for that long. Five years ago, GPUs came with 512 MB to 1 GB of vRAM and systems tended to come with 2-4 GB of RAM; now gamers are using 8-16 GB. (I know 16 GB isn't needed for games yet, but who knows what the future holds?)
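The trend in that last paragraph is roughly exponential. A naive projection sketch, where both the doubling period and the starting figure are illustrative assumptions rather than measured data:

```python
def projected_ram_gb(start_gb, years, doubling_period_years=3.0):
    """Naive projection: typical gaming-PC RAM doubles
    every `doubling_period_years` (an assumed rate)."""
    return start_gb * 2 ** (years / doubling_period_years)

# Starting from ~4 GB five years ago:
print(round(projected_ram_gb(4, 5), 1))  # ~12.7, in line with today's 8-16 GB rigs
```

Even this crude model shows why sizing a console's RAM to today's games is risky over a 5-10 year lifespan.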
Originally Posted by hajile
As to the ability to use all the RAM, Carmack said in his talk that not only do they have plans to use it all, but that even more would still be desirable, so that large amounts of uncompressed textures would be an option. As he said, it may not be necessary, but it certainly makes life a whole lot easier. The majority of the RAM will be allocated for GPU use -- not because the GPU can use all of it at one time (even a Titan can't actually touch more than a few KB at a time), but because when it needs a resource, having it in RAM is faster than loading it from disk.
This is exactly my point: having more RAM helps in so many ways people don't seem to think of. You gain development time for more important things by not having to squeeze your game into RAM and swap to the HDD in ways that don't make performance drop off a cliff, and you get higher-resolution textures (especially when the shaders, not the ROPs/TMUs, are the bottleneck, as appears to be the case with the 360 and PS3; at that point you're mostly just reducing the time those areas of the GPU spend idle), among other things. Yet people still seem to think that more RAM won't be better.