1 - 4 of 61 Posts

· Banned · 2,476 Posts
Quote:

Originally Posted by DuckieHo View Post
Using CUDA probably would have been easier and faster to develop.... and much much much more powerful.

48 ATI shaders at 500MHz... HA HA...

Quote:

Originally Posted by DuckieHo View Post
Do you have a better parallel optimized API that I could use? Is OpenCL finalized yet?

Please provide your many times of proof before insinuating ass talk.

You missed a part too.... Why bother when there are parallel programming APIs like CUDA and Stream available? Xenos is a 500MHz, 48-shader ATI GPU. He could have used Stream and just programmed on a better ATI GPU. An $85 HD 4850 has 18 times the processing power. You know what's an even easier platform to develop for? A PC.
Ehhhh Duckie, I don't think you're following why they chose to do it this way. You claim they could buy a 4850 for $85, but that's incorrect: the card needs to be plugged into a machine that is also running Windows Vista, so suddenly your $85 is $1,000 per machine.

The reason the Xbox 360 is better is that it's a complete computer for $199, with an operating system and development library included in the cost. If you read the article, you'd note that a big part of this research was getting visual output of the math onto a monitor/TV screen.

The guy used to work for Rare/Microsoft as a software engineer and already knows full well how to program the Xbox 360 to do this stuff, so it makes perfect sense that he chose to program for a $199 platform rather than a more expensive $1,000 PC.

Quote:
A new study by a University of Warwick researcher has demonstrated that researchers trying to model a range of processes could use the power and capabilities of a particular XBox chip as a much cheaper alternative to other forms of parallel processing hardware.
You can buy five Xbox 360s for the price of a single PC.

Quote:
He was convinced that this chip could, for a few hundred pounds, be employed to conduct much the same scientific modelling as several thousand pounds of parallel network PCs.
There is the jackpot quote.

Quote:
The good news is that his hunch was right and the XBox 360 GPU can indeed be used by researchers in exactly the money saving way he envisaged.
 

· Banned · 2,476 Posts
Quote:


Originally Posted by trueg50 View Post
What is the point of buying 5 extremely unreliable systems for $1,000 when you have vastly more powerful, and massively more reliable computers for that price?

I can't answer that question, but if a university professor and engineer says he found a cheaper way to do something, and his article was published in a peer-reviewed journal, he's probably right.

Quote:


Let's give a budget of $2,000; that's what, 10 Xboxes? For that much you could easily use 2 high-end Nvidia GPUs, or ATI GPUs, that would be hugely superior. Heck, a 5870 is theoretically 27+ times faster (and that's not taking shader frequency advantages into consideration).

That may be true, but this guy already knows how to program for the Xbox 360 and, from the sounds of it, has some libraries left over from his Rare/Microsoft days.

These guys have limited budgets and time when it comes to research, so the shortest and easiest path will always win out. I'm sure with infinite money and time, they'd have some elaborate GTX 285 setup going on.
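For what it's worth, the multipliers being thrown around ("18 times", "27+ times") depend heavily on how you count. Here's a back-of-envelope sketch using public launch specs; the Xenos FLOP count (48 unified shaders, each treated as a 5-wide ALU doing a multiply-add) is an assumption about counting, not an official figure, and the posters' numbers likely compare raw stream-processor counts rather than FLOPS:

```python
# Rough theoretical single-precision throughput comparison.
# Launch-spec ALU counts and clocks; Xenos ALU count is an assumed
# interpretation (48 shaders x vec4+scalar = 240 ALUs).
gpus = {
    "Xenos (Xbox 360)": dict(alus=48 * 5, clock_mhz=500),
    "HD 4850":          dict(alus=800,    clock_mhz=625),
    "HD 5870":          dict(alus=1600,   clock_mhz=850),
}

def gflops(alus, clock_mhz, flops_per_alu=2):
    # flops_per_alu=2 assumes one multiply-add (MADD) per ALU per clock
    return alus * clock_mhz * flops_per_alu / 1000.0

base = gflops(**gpus["Xenos (Xbox 360)"])
for name, spec in gpus.items():
    g = gflops(**spec)
    print(f"{name}: {g:7.1f} GFLOPS  ({g / base:4.1f}x Xenos)")
```

By this (crude) measure the gap is smaller than the shader-count ratio suggests, which is exactly why "X times faster" claims need a stated counting method.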
 

· Banned · 2,476 Posts
Quote:

Originally Posted by DuckieHo View Post
You are assuming that a professor at a University does not have an existing PC.... I would bet $1000 that he does have access to a PC. Even if it did cost $1000, the PC would be 20-40 times more powerful for 4-5 times the cost.
You are incorrect. I will now quote all the parts of the article which led me to believe you are incorrect:

Quote:
A new study by a University of Warwick researcher has demonstrated that researchers trying to model a range of processes could use the power and capabilities of a particular XBox chip as a much cheaper alternative to other forms of parallel processing hardware.

Dr Simon Scarle, a researcher in the University of Warwick’s WMG Digital Laboratory, wished to model how electrical excitations in the heart moved around damaged cardiac cells in order to investigate or even predict cardiac arrhythmias (abnormal electrical activity in the heart which can lead to a heart attack). To conduct these simulations using traditional CPU based processing one would normally need to book time on a dedicated parallel processing computer or spend thousands on a parallel network of PCs.

Dr Scarle however also had a background in the computer games industry, as he had been a Software Engineer at the Warwickshire firm Rare Ltd, part of Microsoft Games Studios. His time there made him very aware of the parallel processing power of the Graphical Processing Unit (GPU) of the XBox 360, the popular computer games console played in many homes. He was convinced that this chip could, for a few hundred pounds, be employed to conduct much the same scientific modelling as several thousand pounds of parallel network PCs.

The results of his work have just been published in the journal Computational Biology and Chemistry under the title of “Implications of the Turing completeness of reaction-diffusion models, informed by GPGPU simulations on an XBox 360: Cardiac arrhythmias, re-entry and the Halting problem”. The good news is that his hunch was right and the XBox 360 GPU can indeed be used by researchers in exactly the money saving way he envisaged. Simon Scarle said:

“This is a highly effective way of carrying out high end parallel computing on “domestic” hardware for cardiac simulations. Although major reworking of any previous code framework is required, the Xbox 360 is a very easy platform to develop for and this cost can easily be outweighed by the benefits in gained computational power and speed, as well as the relative ease of visualization of the system.” However his research does have some bad news for a particular set of cardiac researchers in that his study demonstrates that it is impossible to predict the rise of certain dangerous arrhythmias, as he has shown that cardiac cell models are affected by a specific limitation of computational systems known as the Halting problem.
Since this argument is ridiculous, I will leave my final argument at this: if a peer-reviewed journal featured an article written by a university professor, researcher, and engineer, I am going to believe said article is accurate and factual.
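For anyone wondering what "reaction-diffusion" cardiac modelling actually looks like, here's a minimal sketch using the Barkley model, a standard textbook excitable-media system, as a stand-in. This is purely illustrative: it is not Dr Scarle's model or code, and the parameters are common textbook values, not ones from his paper:

```python
import numpy as np

# Barkley excitable-media model: a textbook stand-in for the
# reaction-diffusion cardiac simulations described in the article.
# u = fast excitation variable, v = slow recovery variable.
a, b, eps = 0.75, 0.02, 0.02     # standard Barkley parameters (assumed)
N, dt, dx = 64, 0.01, 1.0        # grid size, time step, grid spacing

u = np.zeros((N, N))
v = np.zeros((N, N))
u[:, :4] = 1.0                   # stimulate one edge to launch a plane wave

def laplacian(f):
    # 5-point stencil with no-flux (reflecting) boundaries
    p = np.pad(f, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * f) / dx**2

for _ in range(500):
    uth = (v + b) / a            # local excitation threshold
    u = u + dt * (u * (1 - u) * (u - uth) / eps + laplacian(u))
    v = v + dt * (u - v)         # recovery tracks excitation

print("mean excitation after 500 steps:", round(float(u.mean()), 3))
```

On a GPU (Xenos included), the per-cell update and the stencil are exactly the kind of embarrassingly parallel work each shader can do independently, which is why a games console GPU suits this problem.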
 