1 - 3 of 61 Posts

Premium Member · 9,684 Posts
Quote:

Originally Posted by Kuntz View Post
Ehhhh Duckie, I don't think you are following why they chose to do it this way. You claim they could buy a 4850 for $85, this is incorrect, since this card needs to be plugged into something that is also running Windows Vista. Suddenly your $85 is now $1,000 per machine.

The reason the Xbox 360 is better is because it's a complete computer for $199, which includes an Operating System and Development Library built into the cost. If you read the article, you'd note that a big part of this research was the visual output of the math being done to a Monitor/TV screen.

The guy used to work for Rare/Microsoft as a software engineer and already knows full well how to program the Xbox 360 to do this stuff, so it makes perfect sense that he chose to program for a $199 platform rather than a more expensive $1,000 PC platform.

You can buy 5 Xbox 360's for the price of a single PC.
What is the point of buying 5 extremely unreliable systems for $1,000 when you can have a vastly more powerful, and massively more reliable, computer for that price?

Let's give a budget of $2,000; that's what, 10 Xboxes? For that much you could easily use 2 high-end Nvidia or ATI GPUs that would be hugely superior. Heck, a 5870 is theoretically 27+ times faster (and that's not taking shader-frequency advantages into consideration).
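A back-of-the-envelope version of that comparison, for anyone curious. This is a minimal sketch: the peak single-precision figures and prices below are my own assumptions (commonly cited spec-sheet numbers and rough street prices), not figures from the thread.

```python
# Rough price/performance comparison: Xbox 360's Xenos GPU vs. a Radeon HD 5870.
# All numbers are assumptions: commonly cited single-precision peaks and
# approximate street prices at the time.
systems = {
    "Xbox 360 (Xenos GPU)": (240e9, 199.0),   # 48 ALUs * 500 MHz * ~10 FLOPs/cycle
    "Radeon HD 5870":       (2720e9, 379.0),  # 1600 SPs * 850 MHz * 2 FLOPs/cycle
}

for name, (peak_flops, price) in systems.items():
    print(f"{name}: {peak_flops / 1e9:.0f} GFLOPS, "
          f"{peak_flops / 1e9 / price:.2f} GFLOPS per dollar")
```

Even granting the console its whole $199 sticker, the discrete card comes out several times ahead on theoretical GFLOPS per dollar, which is the point being made above.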

Quote:

Originally Posted by Coma View Post
Probably not... CUDA is awful. Try picking it up and you'll run screaming into the night.

Given its incredible performance in all sorts of specialized applications, CUDA clearly delivers. And given that many colleges teach courses on it at the 4th-year level, I wouldn't say it is impossible to pick up, though it's certainly not a "Saturday project".

Though if you thought CUDA was bad, just imagine the fun Stanford had with the ATI Folding@home client.
 

Premium Member · 9,684 Posts
Quote:

Originally Posted by DuckieHo View Post
I was running a P4 2.8GHz.... it took 10-15 mins to run my code.

However, I finally got a C2D 2.8GHz so I have to go back to work now....
'bout time the developers get some love!

Though it is odd: at my work, the developers are always given way overpowered equipment (like 3.0GHz C2Qs w/ 4GB of RAM and 7600GTs to replace 3.2GHz Pentium Ds w/ 4GB of RAM and 7600GTs).
 

Premium Member · 9,684 Posts
Quote:

Originally Posted by r4zr View Post
OK, why does everyone always assume that he could have even made use of the extra power of a nVidia card?

I have two options: one system that costs $200 (a) and one that is 40x as powerful but costs $1000(b).

The job will take 20 minutes on a.

Why would I buy b?
Why would I rent b?

This is what this article is saying. If you don't need the power, there is a cheaper way.

There is an easier way, and it doesn't involve spending hundreds of thousands of dollars to write for an obsolete platform. Heck, for $40 you can get a card for his workstation that will beat the heck out of that Xbox.

Quote:

Originally Posted by BizzareRide View Post

48 x 500MHz x 34,000,000 = 816,000,000,000 floating point operations per second

Developing a GPU client for distribution over Xbox Live would be cheaper and more productive than buying a thousand, 216 core nVidia GPUs.

Where is that 34,000,000 coming from?
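For what it's worth, here's a quick sanity check on that formula. Assuming the documented Xenos layout (48 unified shader ALUs at 500 MHz), the quoted 816 GFLOPS total only works out if the third factor is 34 ops per ALU per cycle, not 34 million:

```python
# Sanity-checking the quoted Xbox 360 GPU (Xenos) FLOPS estimate.
# Assumption: 48 unified shader ALUs at 500 MHz, per the documented specs.
alus = 48
clock_hz = 500e6

# Taking the quoted "34,000,000" literally gives a nonsensical figure:
literal = alus * clock_hz * 34_000_000
print(f"{literal:.2e} FLOPS")  # ~8.16e17, far beyond any hardware of the era

# The 816 GFLOPS total only falls out if the third factor is 34
# ops per ALU per cycle, not 34 million:
per_cycle = 816e9 / (alus * clock_hz)
print(per_cycle)  # 34.0

# For comparison, the commonly cited Xenos peak assumes ~10 FLOPs per
# ALU per cycle (vec4 + scalar, with multiply-add counted as 2 ops):
cited_peak = alus * clock_hz * 10
print(f"{cited_peak / 1e9:.0f} GFLOPS")  # 240 GFLOPS
```

So either way the "34,000,000" can't be right as written, and even the corrected 816 GFLOPS is well above the usual 240 GFLOPS peak quoted for Xenos.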
 