
1 - 13 of 13 Posts

Registered · 193 Posts · Discussion Starter #1
I'm a little confused about how the Xbox 360 can run games at 1080i or 1080p (1920 × 1080) and still run at 30fps, while PC gamers with rigs better than the 360 have trouble running games like GRAW and other GPU-intensive games above 1280x960 or 1600x1200. Why is this?
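For scale, here's the raw pixel math behind those resolutions (a back-of-envelope sketch; it ignores that many 360 titles actually render below full 1080 internally and scale up):

```python
# Raw pixel counts at the resolutions mentioned above
resolutions = {
    "1280x960":  1280 * 960,
    "1600x1200": 1600 * 1200,
    "1920x1080": 1920 * 1080,
}

for name, px in resolutions.items():
    # Pixels the GPU must shade per second at a 30fps target
    print(f"{name}: {px:,} px/frame, {px * 30:,} px/s at 30fps")
```

1920x1080 is only about 8% more pixels than 1600x1200, so raw resolution alone doesn't explain the gap; the rest comes down to the points raised in the replies.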
 

Premium Member · 5,047 Posts
The game console is designed just for that: games. Don't try and wonder why it is, because that's just how it is. Your PC is doing a lot of other stuff besides running the game, and PC games are coded to run under many different scenarios. With the 360 and PS3, it's meant to do one thing and one thing only: HD gaming.
 

Retired · 2,445 Posts
Well, first, you don't have the overhead of an OS taking up system resources. Second, you don't really have RAM limitations on a console, as the game is read dynamically from a disc. There are other things to consider too; for example, Xbox 360 games only run at 2-4x AA, whereas PC games can run upwards of 16xQ AA and 16x AF. Also, the drivers for PCs are always changing and are made so that a wide variety of systems can support them, whereas the console has the advantage of only needing "drivers" for a single setup; as such, they can be optimized much more than PC drivers. Lastly, the 360 graphics chip is no slouch: it is actually based on R600 and utilizes the unified shader architecture... I think it has 64 unified shaders.
 

Premium Member · 7,397 Posts
Because a console is a 100% known piece of hardware, and that hardware is 100% dedicated to one thing: gaming.

Contrast that with a PC, which is NOT designed for gaming and which has literally infinite hardware combinations that a game developer must contend with. Even with the APIs that are supposed to abstract that stuff to a uniform development environment, obviously different hardware messes with that (which is why you see graphics drivers released to address specific issues in specific games, for example).

Give it a year, the DX10 games on Vista on then-modern PCs (i.e., 2nd half 2007 PCs) will begin to surpass the X360 and PS3. It happens every console cycle: consoles look better for 12 to 18 months, then the PCs catch up and move beyond.
 

Premium Member · 65,162 Posts
Here's a benchmark of a 7950GT playing GRAW at 1680x1050 and high settings. It gets around 47FPS average, with 29FPS as the low. It will certainly be less at 1920x1080, but still not too bad. Turn down the settings a bit and overclock, and it should be able to run around 30FPS fine.

http://techgage.com/print/asus_en7950gt_htdp_512mb

There was an Oblivion comparison a while back... the PC had a hard time running it, but it looked better on the PC with a top-end card.
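That 1920x1080 estimate can be roughed out with a simple fill-rate scaling assumption (a sketch only: it assumes the game is purely pixel-bound, which real games rarely are):

```python
# If a game is pixel-bound, FPS scales roughly inversely with pixel count.
measured_fps = 47              # average at 1680x1050 (from the benchmark above)
measured_px = 1680 * 1050
target_px = 1920 * 1080

estimated_fps = measured_fps * measured_px / target_px
print(f"Estimated average at 1920x1080: ~{estimated_fps:.0f} FPS")
```

That comes out to roughly 40FPS average, which lines up with "less than 47, but still not too bad."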
 

Premium Member · 7,397 Posts
Quote (Originally Posted by tubnotub1):

Second, you don't really have RAM limitations on a console, as the game is read dynamically from a disc.

Actually, that's a big limitation for the consoles. Less RAM is always bad, and consoles are always constrained in their RAM. Streaming off a disc is possible on PCs too, but it's not done because it sucks; physical I/O is the worst bottleneck in a system, and optical I/O is waaaaaaaayyyy worse than hard disk I/O. Console game designers often have to come up with complex swapping algorithms to keep the right assets in memory at the right time, which, while it demonstrates their l33t coding skillz admirably, is something they would likely trade in a minute for the ability to just "load the whole level" into RAM and forget about it.
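The kind of swapping described above can be sketched as a fixed-budget cache with least-recently-used eviction (a toy model, not any console's actual streaming system; the names and sizes are made up):

```python
from collections import OrderedDict

class AssetCache:
    """Toy model of console-style asset streaming: a fixed RAM budget,
    with the least-recently-used assets evicted to make room for new
    ones "streamed" in off the slow disc."""

    def __init__(self, budget):
        self.budget = budget
        self.used = 0
        self.assets = OrderedDict()   # name -> size, kept in LRU order

    def request(self, name, size):
        if name in self.assets:
            self.assets.move_to_end(name)   # cache hit: mark recently used
            return "hit"
        # Cache miss: evict LRU assets until the new one fits,
        # then pay the (expensive) disc read.
        while self.used + size > self.budget and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used -= evicted_size
        self.assets[name] = size
        self.used += size
        return "miss (streamed from disc)"

cache = AssetCache(budget=500)          # e.g. 500 units of RAM
print(cache.request("level_geometry", 300))   # miss: first load
print(cache.request("textures", 150))         # miss: still fits
print(cache.request("level_geometry", 300))   # hit: already resident
print(cache.request("audio_bank", 200))       # miss: evicts "textures"
```

Every "miss" is a disc stall the designer has to hide behind gameplay, which is exactly the hassle that "just load the whole level into RAM" would avoid.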
 

Registered · 3,499 Posts
As was stated above, with a console you have one video card, one CPU, and a set amount of RAM, so you can create your game and optimize it for that specific setup. With a PC you can't do that; you have to create a generic setup that goes through DirectX or OpenGL, which then interfaces with the driver, and that driver itself is kind of generic, as it can support multiple video cards. Sure, stuff can be optimized a little bit, but it would be hard to optimize for every single video card out there, every CPU, every amount of RAM, every sound card, and every hard drive. Instead they use generic setups which offer decent performance with better compatibility.
 

Retired · 2,445 Posts
Good point Vulcan, that's something I never thought about; thanks for correcting me. I would give you a rep, but you're a Mod.
 

Banned · 5,890 Posts
Quote (Originally Posted by tubnotub1):

Well, first, you don't have the overhead of an OS taking up system resources. Second, you don't really have RAM limitations on a console, as the game is read dynamically from a disc. There are other things to consider too; for example, Xbox 360 games only run at 2-4x AA, whereas PC games can run upwards of 16xQ AA and 16x AF. Also, the drivers for PCs are always changing and are made so that a wide variety of systems can support them, whereas the console has the advantage of only needing "drivers" for a single setup; as such, they can be optimized much more than PC drivers. Lastly, the 360 graphics chip is no slouch: it is actually based on R600 and utilizes the unified shader architecture... I think it has 64 unified shaders.

Um Xbox 360 GPU < 7800GTX. Not an ATI chip
 

Registered · 3,473 Posts
Quote (Originally Posted by VulcanDragon):

Actually, that's a big limitation for the consoles. Less RAM is always bad, and consoles are always constrained in their RAM. Streaming off a disc is possible on PCs too, but it's not done because it sucks; physical I/O is the worst bottleneck in a system, and optical I/O is waaaaaaaayyyy worse than hard disk I/O. Console game designers often have to come up with complex swapping algorithms to keep the right assets in memory at the right time, which, while it demonstrates their l33t coding skillz admirably, is something they would likely trade in a minute for the ability to just "load the whole level" into RAM and forget about it.

That's why the PS3, with its 256 megs of RAM, has a hard time at 1080 resolutions. If the 360 had a gig of RAM, that would really solve a lot of hassle for them.
EDIT: the eDRAM is what wins the money too: HDR + 4xAA with maybe a 5% performance loss. I also believe it does CSAA, as the R600 is going to utilize CSAA.
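To see why 10MB of eDRAM is tight, here's a rough framebuffer-size calculation (a sketch assuming 4 bytes of color plus 4 bytes of depth/stencil per sample; actual buffer formats vary):

```python
# Footprint of a 720p framebuffer with 4x multisampling,
# assuming 4 bytes of color + 4 bytes of depth/stencil per sample
width, height, samples = 1280, 720, 4
bytes_per_sample = 4 + 4

fb_bytes = width * height * samples * bytes_per_sample
print(f"4xAA 720p framebuffer: {fb_bytes / 2**20:.1f} MiB")  # well over 10MB
```

Under those assumptions the multisampled buffer is around 28 MiB, far more than the 10MB of eDRAM, which is one reason the chip renders the frame in tiles rather than keeping the whole buffer resident at once.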

Quote (Originally Posted by reberto):

Um Xbox 360 GPU < 7800GTX. Not an ATI chip

Wrong, the 360's GPU is an ATI Xenos, which is pre-R600. It has 10MB of eDRAM for separate processing of AA and AF, which is why the 360's GPU is so powerful.

Sorry for the numerous double posts, but I was wrong about the 64 shaders. It's actually 16 shaders on die per SIMD unit, with three SIMD units: 3x16 = 48, and they do 4 ops per cycle. So that's 48x4, which is 192 shader ops per cycle.
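The arithmetic from that correction, spelled out (the ops-per-cycle figure is the post's number, not an official spec):

```python
# Xenos shader-count arithmetic from the post above
simd_units = 3
alus_per_unit = 16
ops_per_alu = 4            # ops per ALU per cycle, per the post

unified_shaders = simd_units * alus_per_unit    # 3 x 16 = 48
ops_per_cycle = unified_shaders * ops_per_alu   # 48 x 4 = 192
print(unified_shaders, ops_per_cycle)
```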
 

Retired · 2,445 Posts
Quote (Originally Posted by reberto):

Um Xbox 360 GPU < 7800GTX. Not an ATI chip

From what I understand, and what has been stated, the 360 chip uses the unified shader architecture and is indeed made by ATI. With 48 unified shaders it is most definitely better than the 7800 GTX.
 

Registered · 3,473 Posts
Before the 360's release, a lot of rumors were popping up comparing the 360's graphics to PC. I think that is where you got misled. But the 360 definitely uses an ATI Xenos.
EDIT: instead of me just saying all of these things, I can give you a very in-depth article. If you have any questions understanding any of this, come back and ask. Article here.
 