[GameSpot] PS4 not worth the cost, says Nvidia - Page 8 - Overclock.net

post #71 of 271 Old 03-14-2013, 11:38 AM
4.0ghz
 
Join Date: Jan 2006
Location: Phoenix, AZ
Posts: 1,906
You know what, I'm glad Nvidia didn't get into these consoles.

ATI did the graphics in the old GameCube. Metroid Prime looked AWESOME in my opinion.

BE-2400 3.105GHz 8 hours Orthos stable
BE-2400 3.484GHz on AIR PROOF!
7750 BE 3.654GHz suicide validation

Nenkitsune is offline  
post #72 of 271 Old 03-14-2013, 11:41 AM
New to Overclock.net
 
Join Date: Feb 2008
Posts: 29,121
Quote:
Originally Posted by xoleras View Post

Again, there are two implementations of PhysX. The most common implementation is CPU-only and does not involve GPU-accelerated effects; GPU-accelerated effects such as those found in Batman: AC will always require CUDA hardware. That type of GPU-accelerated PhysX cannot be implemented on the CPU or on non-CUDA hardware.

To the best of my knowledge, all PhysX requires CUDA, but CUDA itself can be run on the CPU.

Strictly speaking, I do not believe there are any GPU only PhysX effects, just effects that would run too slowly on the CPU if allowed.
Quote:
Originally Posted by DrBrogbo View Post

PhysX (the proprietary Nvidia technology, not physics in general) in its current state requires CUDA. Hence, PhysX on consoles would also require CUDA. Hence, it will not be the typical GPU-accelerated effects that you see when running certain games with Nvidia hardware (Borderlands 2, Mafia II, Batman: AC, Hawken, Planetside 2, etc).

PhysX includes CUDA libraries.

I am able to force level 2 PhysX in Hawken (by setting PhysXLevel=2 in the HawkenEngine.ini) and see all the supposedly NVIDIA-only effects with no NVIDIA hardware in my system. However, performance is horrible.
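
For anyone who wants to try it, the whole tweak is one line. A sketch of the relevant chunk of HawkenEngine.ini; only the PhysXLevel=2 value is confirmed above, the [Engine.Engine] section name is my assumption from other UE3 games:

; HawkenEngine.ini (UE3 config); the section name below is an assumption
[Engine.Engine]
; 0 = PhysX effects off, 1 = medium, 2 = high (the GPU-accelerated effects)
PhysXLevel=2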

I haven't tried forcing it in these other games, but I suspect I could. There are no GPU-only PhysX effects, to the best of my knowledge.

Current and future consoles have, or will have, some PhysX titles, and the degree of PhysX effects is likely limited only by their available CPU power.
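
To illustrate the point: in the PhysX 3.x SDK, the simulation runs on whatever task dispatcher the game hands it, and a plain CPU thread-pool dispatcher is the standard path. A minimal sketch from memory of the 3.x API (not any game's actual code, and entry points may differ between SDK versions):

#include <PxPhysicsAPI.h> // NVIDIA PhysX 3.x SDK headers
using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Core SDK objects; nothing here touches a GPU.
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // The scene takes a dispatcher; here it is a CPU thread pool with two
    // worker threads. GPU acceleration is a separate, optional CUDA-based
    // dispatcher the game can plug in instead.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation exactly as a game loop would, entirely on the CPU.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true); // block until the step completes

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}

Same engine, same effects; the dispatcher is the only thing deciding where the work runs, which is why forcing the effects on without NVIDIA hardware merely makes them slow.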

...rightful liberty is unobstructed action according to our will within limits drawn around us by the equal rights of others. I do not add 'within the limits of the law,' because law is often but the tyrant's will, and always so when it violates the right of an individual. -- Thomas Jefferson
Blameless is online now  
post #73 of 271 Old 03-14-2013, 11:42 AM
WaterCooler
 
Join Date: Nov 2005
Location: Rochester, MN
Posts: 11,169
Quote:
Originally Posted by GunSkillet View Post

This is a good thing; imagine if all console ports of this generation started to use PhysX...

The PhysX usage in Borderlands 2 is quite nice. I consider it the only true GPU-accelerated PhysX game, since in all the rest you could accidentally disable PhysX and not even notice for the entire game. Though I've seen some videos of Hawken, and it seems to use the effects quite well, I haven't played the game save for some early beta access a while back.
Quote:
Originally Posted by Blameless View Post

To the best of my knowledge, all PhysX requires CUDA, but CUDA itself can be run on the CPU.

Strictly speaking, I do not believe there are any GPU only PhysX effects, just effects that would run too slowly on the CPU if allowed.
PhysX includes CUDA libraries.

I am able to force level 2 PhysX in Hawken (by setting PhysXLevel=2 in the HawkenEngine.ini) and see all the supposedly NVIDIA-only effects with no NVIDIA hardware in my system. However, performance is horrible.

I haven't tried forcing it in these other games, but I suspect I could. There are no GPU-only PhysX effects, to the best of my knowledge.

Current and future consoles have, or will have, some PhysX titles, and the degree of PhysX effects is likely limited only by their available CPU power.

Well, CUDA is sort of a programming API, in a manner of speaking. It is code that gets compiled with a special CUDA compiler so that it can be run in parallel. You don't really have CUDA libraries; you can have CUDA C, CUDA C++, CUDA Fortran, CUDA whatever. On the CPU it is really just normal code run via SSE and AVX, but executed in a parallel fashion, and I believe that can only be done if there is more than one thread available on the CPU, so an old single-core CPU without HT would be out of luck. Never tested that...
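
For reference, this is roughly what the GPU-side version of "normal code executed in a parallel fashion" looks like: a trivial CUDA C kernel compiled with nvcc. Just an illustration of the programming model, nothing to do with PhysX's actual internals:

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements; a serial CPU build would
// replace the kernel launch below with a plain for loop over i.
__global__ void addVectors(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);
    static float hostA[n], hostB[n], hostC[n];
    for (int i = 0; i < n; ++i) { hostA[i] = float(i); hostB[i] = 2.0f * i; }

    // Copy inputs to the GPU, launch enough 256-thread blocks to cover
    // all n elements, then copy the result back.
    float *devA, *devB, *devC;
    cudaMalloc(&devA, bytes);
    cudaMalloc(&devB, bytes);
    cudaMalloc(&devC, bytes);
    cudaMemcpy(devA, hostA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(devB, hostB, bytes, cudaMemcpyHostToDevice);
    addVectors<<<(n + 255) / 256, 256>>>(devA, devB, devC, n);
    cudaMemcpy(hostC, devC, bytes, cudaMemcpyDeviceToHost);

    printf("c[10] = %.1f\n", hostC[10]); // expect 30.0
    cudaFree(devA); cudaFree(devB); cudaFree(devC);
    return 0;
}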

MINNESOTA OVERCLOCKERS | The Climate Phenomenon
If it ain't broke...MAKE IT GO FASTER!!!
Vagrant Storm is offline  
post #74 of 271 Old 03-14-2013, 11:45 AM
Overclocker
 
Join Date: Apr 2012
Location: Northern Virginia
Posts: 11,388
Quote:
Originally Posted by Nenkitsune View Post

You know what, I'm glad Nvidia didn't get into these consoles.

ATI did the graphics in the old GameCube. Metroid Prime looked AWESOME in my opinion.

And an Nvidia card wouldn't be able to produce the same visuals?
lacrossewacker is offline  
post #75 of 271 Old 03-14-2013, 11:50 AM
 
Join Date: Aug 2010
Location: Connecticut
Posts: 7,160
Quote:
Originally Posted by ZealotKi11er View Post

Here is the thing. When the Xbox 360 and PS3 came out, both consoles used GPUs, or better said GPU architectures, that were just being replaced in the PC. Now it's different. The PS4 is going to be using GCN, which AMD plans to keep for at least 2 more generations. Even if they don't make much money from the PS4, it's all about the PC market. Games will just run better on PC when you have a GCN GPU. That alone is enough in my opinion. Nvidia is butthurt right now.

Ummmm, no. /facepalm-level no.

When the Xbox 360 and PS3 came out, our micro-architecture wasn't on the same level as desktops. Now it is. In fact, the PS3/Xbox 360 were arguably 50% behind desktops at launch...

Now, Maxwell samples are already out there on mobile devices, and next launch you will see mobile devices almost equal desktops in performance... Not price/performance... Performance, period.

PS4 is using GCN because AMD wanted to lower the cost of the product. They already have samples out and wanted to undercut Nvidia with a cheaper total package...And they did.

Nvidia was offering Maxwell technology practically at cost, which was 30-40% higher than AMD's undercut, with a marginal mark-up.

At the end of the day, AMD's tech does get the job done, but it's a cheaper GPU/CPU than Nvidia was offering altogether. It's a major step DOWN.

This isn't Nvidia being butthurt, this is Nvidia laughing all the way to the bank, making it KNOWN that AMD is using cheaper parts and giving Sony the finger.

If anything, this opens Nvidia to bidding in other markets where their lines will do much better.

Look at Project Shield ~ 2-3 years in development, and orders are FLYING in...

When you are dead, you do not know that you are dead. It is difficult only for others. It is the same when you are stupid. - Ricky Gervais.
Masked is offline  
post #76 of 271 Old 03-14-2013, 11:51 AM
PC Gamer
 
Join Date: May 2010
Posts: 1,866
Quote:
Originally Posted by Vagrant Storm View Post

The PhysX usage in Borderlands 2 is quite nice. I consider it the only true GPU-accelerated PhysX game, since in all the rest you could accidentally disable PhysX and not even notice for the entire game.

While I will grant that the PhysX effects in BL2 look nice, they became a pain real quick. Get in a firefight when you and three buddies are tanking down a mob, and watch all the elemental weapons make it IMPOSSIBLE for anyone to accurately aim or snipe. I mean, it looks amazing, but in the end I found the game much more playable with it turned off.

GAME ON!!!!!
Mopar63 is offline  
post #77 of 271 Old 03-14-2013, 12:04 PM
PC Gamer
 
Join Date: May 2010
Posts: 1,866
Quote:
Originally Posted by Masked View Post

Ummmm, no. /facepalm-level no.

When the Xbox 360 and PS3 came out, our micro-architecture wasn't on the same level as desktops. Now it is. In fact, the PS3/Xbox 360 were arguably 50% behind desktops at launch...

This argument gets used a lot but is actually completely false. As enthusiasts we tend to judge this kind of stuff by what we use, but the truth is MOST people do not use the horsepower level of the people on this forum. The Xbox 360 and PS3 at launch were actually pretty close to the average PC used for gaming. Even today they are really not far off the mark when you realize how many people are gaming on Intel-based graphics.
Quote:
Originally Posted by Masked View Post

Now, Maxwell samples are already out there on mobile devices and next launch, you will see mobile devices almost equal desktops in performance...Not price/performance...Performance, period.

Interesting how you mixed your argument points, because if we use the same argument applied against the Xbox and PS and compare by the same standard, Maxwell does not come close. However, if you use a realistic standard, as I suggested, it does. What you presented is pure marketing double-talk.

As for why the decision was reached, I do not doubt that AMD was tapped because it was able to deliver a better overall price and reach the performance level Sony wanted. You need to understand that these companies do not care about the latest, greatest tech, just as 99% of consumers don't. They set a performance standard they wanted met in their design, and then they looked for the best price they could get that hit the performance envelope they were looking for.

You act like this is a bad thing, but we as enthusiasts do it every day. Otherwise everyone would buy $1000 video cards. As for Shield, I think it is a great-looking product, and the only flaw I saw was that I wish it did not have to be wired to the TV. The last thing I want is a handheld that is tethered part of the time. Will it sell well? I am sure it will; the concept is cool and looks well executed. However, at the end of the day I think it will be a niche market item.

In the end, however, this has nothing to do with AMD's design win. Does it hurt Nvidia? Likely not, but they are not untouched by it. Losing a multi-million dollar bid is still losing the bid.

GAME ON!!!!!
Mopar63 is offline  
post #78 of 271 Old 03-14-2013, 12:11 PM
 
Join Date: Aug 2010
Location: Connecticut
Posts: 7,160
Quote:
Originally Posted by Mopar63 View Post

This argument gets used a lot but is actually completely false. As enthusiasts we tend to judge this kind of stuff by what we use, but the truth is MOST people do not use the horsepower level of the people on this forum. The Xbox 360 and PS3 at launch were actually pretty close to the average PC used for gaming. Even today they are really not far off the mark when you realize how many people are gaming on Intel-based graphics.
Interesting how you mixed your argument points, because if we use the same argument applied against the Xbox and PS and compare by the same standard, Maxwell does not come close. However, if you use a realistic standard, as I suggested, it does. What you presented is pure marketing double-talk.

As for why the decision was reached, I do not doubt that AMD was tapped because it was able to deliver a better overall price and reach the performance level Sony wanted. You need to understand that these companies do not care about the latest, greatest tech, just as 99% of consumers don't. They set a performance standard they wanted met in their design, and then they looked for the best price they could get that hit the performance envelope they were looking for.

You act like this is a bad thing, but we as enthusiasts do it every day. Otherwise everyone would buy $1000 video cards. As for Shield, I think it is a great-looking product, and the only flaw I saw was that I wish it did not have to be wired to the TV. The last thing I want is a handheld that is tethered part of the time. Will it sell well? I am sure it will; the concept is cool and looks well executed. However, at the end of the day I think it will be a niche market item.

In the end however this has nothing to do with AMD's design win. Does it hurt nVidia, likely not but they are not untouched by it. Losing a multi-million dollar bid is still losing the bid.

It's actually not false...We had samples available that far exceeded both systems.

You're discussing two entirely different launch platforms. There are the public releases and the private ones. Businesses and developers are on a schedule 4-5 months earlier than Joe Public is. When many vendors/businesses (like myself) discuss this, that's the timeline we're discussing.

I didn't present any double-talk.

AMD did not match the same level of technology that Nvidia did, period. Nvidia was undercut by a cheaper option.

Maxwell samples have been available to us for some time...As they were equally available to Sony.

As I said earlier, I've been doing business with Nvidia directly for almost a decade... Maxwell was pushed as an option for the PS4... It would've been a tremendous marketing tool, but at the end of the day, Sony went with the "older" product.

You can call it whatever you want, but Sony chose an existing product; AMD's offering to Sony is actually available on the commercial market right now, and Nvidia's won't be for another year.

Shield is due out very soon and is an entirely new platform. Mobile samples are already in existence.

I assure you, Nvidia "losing" the bid is of no consequence, because there are currently 3-4 projects for which AMD does not have the production facilities to match Nvidia.

While it may appear they "lost", they really didn't.

When you are dead, you do not know that you are dead. It is difficult only for others. It is the same when you are stupid. - Ricky Gervais.
Masked is offline  
post #79 of 271 Old 03-14-2013, 12:15 PM
PC Gamer
 
Join Date: Jan 2011
Posts: 5,733
Aparition is offline  
post #80 of 271 Old 03-14-2013, 12:15 PM
Silent but Deadly
 
Join Date: May 2008
Location: California
Posts: 10,808
Quote:
Originally Posted by Billy O View Post

That headline is a bit misleading. That it is not worth the opportunity cost to Nvidia, I agree with. The headline reads as if they are saying it's not worth the cost Sony is charging consumers.

I think a lot of people don't know what opportunity cost means.
Quote:
Originally Posted by Masked View Post

I assure you, Nvidia "losing" the bid is of no consequence, because there are currently 3-4 projects for which AMD does not have the production facilities to match Nvidia.

While it may appear they "lost", they really didn't.

Which production facilities are you talking about? I was not aware Nvidia had any in-house fabbing capability.

Audio setup: Xonar Essence ST into Niles SI-275 amplifier driving CSW Tower II speakers paired with M&K MX-70 subwoofer. Headphones: DT880 Pro 250ohm
Mygaffer is offline  