Overclock.net › Forums › Industry News › Hardware News › [Fud] ATI Probing PhysX on GPU

[Fud] ATI Probing PhysX on GPU - Page 2

post #11 of 20
Quote:
Originally Posted by darkcloud89 View Post
They don't ignore it, but it doesn't really affect their strategy. Think about it this way: if CUDA takes off, then for certain applications where a fast CPU pays off right now, the GPU becomes more important. Intel profits a lot more from faster CPUs than from slower ones. CUDA could potentially mean fewer people buying powerful CPUs and more buying the less expensive ones. For Intel, that means a loss of money. To prevent this from happening, Intel needs to cut into Nvidia's marketshare. Because they don't currently have a competing product themselves, that job is left to ATI. So it makes sense for them to be willing to help ATI make their product more compelling. It's basically the same story with SLI and Nehalem: it's not really a mistake, Intel just doesn't want SLI support.

Yes, AMD does have the potential to make money from this, and they are Intel's competitor. However, consider the circumstances. The most competitive market between AMD and Intel right now is servers. Intel's biggest problem there is scalability, and their answer to it is Nehalem. Nehalem isn't meant to be a revolutionary desktop processor; in fact, for single-threaded applications I think you will have to look hard to find a big improvement. What it really is is Intel's answer to Opteron's scalability, and the real improvement with Nehalem is in multithreaded applications and scaling. So, even if ATI gains marketshare in consumer graphics, Intel is planning to cut in on AMD's server marketshare, so it will balance out in the end.

Edit: Wow, post number 200.
That's what I think, too.
post #12 of 20
Quote:
Originally Posted by stargate125645 View Post
Why can ATI support Havok but Nvidia not?
nVidia is in a war of sorts with Intel and wants to make sure the industry doesn't adopt Intel's system. Additionally, nVidia just bought AGEIA, and some games already support AGEIA's system, so why would they turn around and ditch AGEIA?
post #13 of 20
Quote:
Originally Posted by Licht View Post
nVidia is in a war of sorts with Intel and wants to make sure the industry doesn't adopt Intel's system. Additionally, nVidia just bought AGEIA, and some games already support AGEIA's system, so why would they turn around and ditch AGEIA?
Where did I imply Nvidia would ditch Ageia?
post #14 of 20
Quote:
Originally Posted by Crazy9000 View Post
It's because AMD is a third party in this. Since Nvidia owns PhysX, it wouldn't make sense for them to use the competing technology, Havok.

AMD gets the opportunity to support both technologies and let the best one win, while Nvidia and Intel duke it out to see which one takes dominance.
Isn't it already a foregone conclusion that Havok is two steps ahead of PhysX, in the sense that Havok is the basis for physics in most games? E.g. the Source engine (Valve) and Unreal Engine (like every game released on consoles); the list is really long, so I'll leave it there. Whereas PhysX is used in only 28 titles.

So in that sense, isn't Nvidia just running on the hope that marketing will fool stupid people into thinking the whole "The Way It's Meant to Be Played" thing will prevail?
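For context on why both Havok and PhysX want onto the GPU at all: rigid-body and particle physics is highly data-parallel, since each body's integration step depends only on its own state. A toy sketch in plain Python (no real engine API, purely illustrative) of one Euler integration step over many particles shows the shape of the loop that a physics engine can hand to hundreds of shader cores at once:

```python
# Toy illustration: one Euler integration step for N independent particles.
# Each particle updates from only its own (height, velocity) state, which is
# why this loop maps naturally onto a GPU's many parallel cores.

GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # one 60 Hz frame

def step(particles):
    """Advance every (y, vy) pair by one frame; no cross-particle reads."""
    out = []
    for y, vy in particles:
        vy += GRAVITY * DT           # accelerate under gravity
        y += vy * DT                 # move
        if y < 0.0:                  # bounce off the floor, losing energy
            y, vy = 0.0, -vy * 0.5
        out.append((y, vy))
    return out

# 1000 particles dropped from 10 m; every iteration is independent work.
particles = step([(10.0, 0.0)] * 1000)
```

The independence of each iteration is the whole point: nothing here needs a fast single core, only lots of slow ones, which is the workload shift the CPU-versus-GPU argument in this thread is about.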
post #15 of 20
I call BS. It was already leaked that the "game physics support" is based on a modified F@H API. And to support PhysX, they would also need to support CUDA. Ain't gonna happen. Sorry for the disappointment.
post #16 of 20
I wouldn't figure Intel would help ATI, since they are part of AMD now. That's kinda weird. Hmm, must be some other dispute between Nvidia and Intel that I don't know about, and therefore don't care about.
post #17 of 20
Quote:
Originally Posted by 003 View Post
I call BS. It was already leaked that the "game physics support" is based on a modified F@H API. And to support PhysX, they would also need to support CUDA. Ain't gonna happen. Sorry for the disappointment.
Technically, it is possible for ATi cards to run CUDA!
But this would require a kind of runtime environment to translate it to the CTM API, making CUDA on Radeon way slower than on GeForce.

The same would go for PhysX: it would bring PhysX a wide market share, while it would still run better on NV cards.

What I find really strange about this: if ATi implements CUDA or PhysX, nVidia's software developers would get very detailed internal information on the R600 (shader) architecture!

A very clever move by nVidia.
Therefore, we will have to wait for an official statement from AMD/ATi.
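The runtime-layer overhead described in the post above can be sketched abstractly. The following is a hypothetical Python illustration, not a real CUDA or CTM interface, and every name in it is invented: a shim that accepts calls in one API and rewrites each of them for another backend necessarily does translation work on every call, which is the claimed reason CUDA-on-Radeon would trail CUDA-on-GeForce.

```python
# Hypothetical sketch of an API translation shim. NativeBackend stands in
# for a driver speaking its own native API; TranslationShim stands in for
# a layer that accepts foreign-style calls and rewrites them before
# dispatch. None of these names correspond to real CUDA or CTM interfaces.

class NativeBackend:
    """Pretend native API: executes submitted work units directly."""
    def run(self, work_units):
        return sum(work_units)  # stand-in for real GPU work

class TranslationShim:
    """Accepts 'foreign' calls and translates each one before dispatch."""
    def __init__(self, backend):
        self.backend = backend
        self.translated_calls = 0  # count the extra hops taken

    def foreign_run(self, work_units):
        # Per-call translation: remap arguments, validate, re-encode.
        # All of this is pure overhead relative to calling run() natively.
        remapped = list(work_units)
        self.translated_calls += 1
        return self.backend.run(remapped)

shim = TranslationShim(NativeBackend())
result = shim.foreign_run([1, 2, 3])  # same answer, one extra layer crossed
```

The shim produces identical results to the native path; it just pays a translation cost on every call, which compounds in call-heavy workloads like per-frame physics dispatch.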
post #18 of 20
Quote:
Originally Posted by horror View Post
I wouldn't figure Intel would help ATI, since they are part of AMD now. That's kinda weird. Hmm, must be some other dispute between Nvidia and Intel that I don't know about, and therefore don't care about.
Here is some possible, viable reasoning for why such a thing could happen.

Quote:
Originally Posted by darkcloud89 View Post
They don't ignore it, but it doesn't really affect their strategy. Think about it this way: if CUDA takes off, then for certain applications where a fast CPU pays off right now, the GPU becomes more important. Intel profits a lot more from faster CPUs than from slower ones. CUDA could potentially mean fewer people buying powerful CPUs and more buying the less expensive ones. For Intel, that means a loss of money. To prevent this from happening, Intel needs to cut into Nvidia's marketshare. Because they don't currently have a competing product themselves, that job is left to ATI. So it makes sense for them to be willing to help ATI make their product more compelling. It's basically the same story with SLI and Nehalem: it's not really a mistake, Intel just doesn't want SLI support.

Yes, AMD does have the potential to make money from this, and they are Intel's competitor. However, consider the circumstances. The most competitive market between AMD and Intel right now is servers. Intel's biggest problem there is scalability, and their answer to it is Nehalem. Nehalem isn't meant to be a revolutionary desktop processor; in fact, for single-threaded applications I think you will have to look hard to find a big improvement. What it really is is Intel's answer to Opteron's scalability, and the real improvement with Nehalem is in multithreaded applications and scaling. So, even if ATI gains marketshare in consumer graphics, Intel is planning to cut in on AMD's server marketshare, so it will balance out in the end.

Edit: Wow, post number 200.
What you say here makes a lot of sense. Why make the better product when you can simply find a way to knock the competition down a little? Forward progress FTL, I guess.
post #19 of 20
Quote:
Originally Posted by darkcloud89 View Post
So, even if ATI gains marketshare in consumer graphics, Intel is planning to cut in on AMD's server marketshare, so it will balance out in the end.
And once Intel succeeds in crushing Nvidia (if that ever happens), they will start restricting the **** out of ATI/AMD and crush it as well, leaving the consumer in a really, really bad place.

Nvidia made PhysX public, so anyone can use it, ATI or Larrabee alike. I think that was a smart move by Nvidia, for both them and the consumer. It will allow gaming to advance MUCH faster than if Intel created a split where Larrabee and ATI users were using Havok while Nvidia users were using PhysX.

I'm definitely rooting for Nvidia and booing Intel in this battle. If Intel wins, not only will the adoption of advanced physics and faster software processing be significantly reduced in the near future, it will slow to a crawl in the long run.
post #20 of 20
Quote:
Originally Posted by darkcloud89 View Post
They don't ignore it, but it doesn't really effect their strategy. Think about it this way: If CUDA takes off, for certain applications where a fast CPU pays off right now, the GPU begins to be more important. Intel profits a lot more from faster CPUs than they do slower ones. CUDA could potentially mean more people buying less powerful CPUs and more of the less expensive ones. For Intel, this is going to mean a loss of money. In order to prevent this from happening, Intel needs to cut back Nvidia's marketshare. Because they don't currently have a competing product themselves, that job is left to ATI. So, it makes sense for them to be willing to help ATI make their product more compelling. It's basically the same story with SLI+Nehalem, it's not really a mistake. Intel just doesn't want SLI support.

Yes, AMD does have the potential to make money because of this, and they are Intel's competitor. However, consider the circumstances. The most competitive market between AMD and Intel right now is servers. Intel's biggest problem here is scalability, and their answer to this problem is Nehalem. Nehalem isn't meant to be a revolutionary desktop processor, and in fact for single threaded applications I think you will have to look hard to find a big improvement. What it really is, is Intel's answer to Opteron scalability and the real improvement with Nehalem is with multithreaded applications and scaling. So, even if ATI gains marketshare in consumer graphics, Intel is planning to cut in on AMD's server marketshare so it will balance out in the end.

Edit: Wow, post number 200.
Very good reasoning. +rep