Overclock.net › Forums › Industry News › Software News › [AnandTech] GeForce+Radeon: Previewing DirectX 12 Multi Adapter with Ashes of the Singularity

[AnandTech] GeForce+Radeon: Previewing DirectX 12 Multi Adapter with Ashes of the Singularity - Page 22

post #211 of 326
Quote:
Originally Posted by looniam View Post

NOPE.
http://www.ngohq.com/graphic-cards/16223-nvidia-disables-physx-when-ati-card-is-present.html
nvidia does not support their physX when another card is rendering graphics. they cannot guarantee a user having a "great physX experience" using a non-CUDA card, so they disable it.

problem?

i find it soooooo funny that folks are making a stink about physX here, but post about physX features or games with physX in a thread asking between a red or green card and you'll find a lot of "just garbage middleware" hate going on.

so which is it AMD users? you want/like physX or not?

Regardless of how good the experience is, the point is that it should be up to the user. No support is one thing; disabling is a bit ridiculous. If someone has an NVidia GPU, it's not like they are getting GPU PhysX capability for free.

It's pretty obvious that it is a marketing move. Power to them, it is smart. They don't want people buying the competitor's mid-to-top-end cards without having to sacrifice PhysX capability.

Ultimately, the reason people say that it is not needed is because NVidia made it a black-and-white choice. It would be nice to have, but in my opinion it shouldn't be a reason to sway wallets.
Sorry AMD (13 items)
CPU: i5-8400 · Motherboard: ROG STRIX Z370-I GAMING · Graphics: ZOTAC GTX 1080 Ti Mini · RAM: 16GB G.Skill Ripjaw V 3600MHz CL16
Hard Drive: 256GB M.2 Crucial · Cooling: Noctua NH-L9i · OS: Windows 10 Pro · Monitor: AOC G2460PF
Keyboard: Mechanical Eagle Z-77 (cheap) · Power: Corsair SF600 · Case: Fractal Design Node 202 · Mouse: Logitech G400
Audio: Sennheiser PC 350 (hero mod)

CPU: AMD Ryzen™ 1700 · Motherboard: ASRock Fatal1ty X370 Gaming K4 · Graphics: Zotac Mini GTX 1080 Ti · RAM: TridentZ 16GB (2x8GB) DDR4
Hard Drive: 256GB M.2 SSD · Hard Drive: 1TB HDD · OS: Windows 10 Pro · Monitor: Acer XG270HU
Keyboard: Logitech G15 · Power: 750W Thermaltake Toughpower Gold Rated · Case: Thermaltake View 27 w/ riser · Mouse: Logitech G40

CPU: FX-8350 @ 5GHz · Motherboard: 990FXA-UD3 rev 1.1 · Graphics: PowerColor 290X @ 1150/1500MHz · RAM: G.Skill 8GB 1600MHz (2x4GB) @ 2033MHz
Hard Drive: 1TB WD Caviar Black · Cooling: RS360 Raystorm kit · OS: Windows 8 Pro · Power: 700W Corsair
Case: HAF 912 · Mouse: Logitech G400
post #212 of 326
As much as the concept tickles my fancy (ever since first hearing about it a while ago, and the more I read about it through this thread and what the Oxide dev has to say, the more it piques my interest), I cannot foresee developers putting in the added effort of coding it in (other than a few obscure titles).

With the way games are so poorly coded/ported and optimized as of late, and how publishers give devs such unrealistic time frames to have a solid working title out the door, it seems like this tech won't be commonplace for a long time yet (although I sure wish it would, and I would love to be proven wrong on this matter).

I mean Lucid Virtu barely even worked, and most of the time it caused major instability and issues, and nothing has come along to improve upon it to this day... iGPUs still remain unused in a system that has a dGPU, when they should be able to tackle PhysX tasks handily in my opinion.

I'm bringing this up as an example of how something that seems like a logical progression never ends up occurring... AFR, multi-adapter, and a shared memory pool all seem like pretty logical next steps, but it's taken a long time to even reach the tech's current infancy, and I can see it taking a long time again for it to be properly utilized, thanks to software coding holding hardware back.
post #213 of 326
Quote:
Originally Posted by Ultracarpet View Post

Regardless of how good the experience is, the point is that it should be up to the user. No support is one thing; disabling is a bit ridiculous. If someone has an NVidia GPU, it's not like they are getting GPU PhysX capability for free.

It's pretty obvious that it is a marketing move. Power to them, it is smart. They don't want people buying the competitor's mid-to-top-end cards without having to sacrifice PhysX capability.

Ultimately, the reason people say that it is not needed is because NVidia made it a black-and-white choice. It would be nice to have, but in my opinion it shouldn't be a reason to sway wallets.
i don't agree that it ought to be the user's choice; it's nvidia's technology that they researched and developed at their own cost and that gives them the right to disable it.

i'd even lend some credibility to their concern about having a non-CUDA card for rendering; prior to them disabling it w/driver 180.xx, i saw quite a few complaints from folks getting an inexpensive physX card w/ amd and complaining it was a worse experience than not having one at all, adding to the dislike for physX. ie "i spent $150 for a (blank) and it was a total waste of money - thanks nvidia!" (yes, that was some time ago - BEFORE they disabled it).

seriously, how many threads are posted in forums w/folks ranting about hard/software problems and it's really PEBKAC? so nvidia decided to avoid it; now that is the marketing move, avoiding a condition where your technology looks less than desirable.


again it's funny how suddenly now people care about physX being disabled. oh, but that misinformation is just an excuse to spread FUD about them disabling dx12 features.


amiright?!? tongue.gif
Edited by looniam - 10/28/15 at 11:31pm
loon 3.2 (18 items)
CPU: i7-3770K · Motherboard: Asus P8Z77-V Pro · Graphics: EVGA 980 Ti SC+ · RAM: 16GB PNY DDR3 1866
Hard Drive: PNY 1311 240GB · Hard Drive: 1TB Seagate · Hard Drive: 3TB WD Blue · Optical Drive: DVD DVDRW+/-
Cooling: EKWB P280 kit · Cooling: EK-VGA Supremacy · OS: Win X · Monitor: LG 24MC57HQ-P
Keyboard: Ducky Zero [blues] · Power: EVGA SuperNova 750 G2 · Case: Stryker M [hammered and drilled] · Mouse: Corsair M65
Audio: SB Recon3D · Audio: Klipsch ProMedia 2.1
post #214 of 326
Quote:
Originally Posted by looniam View Post

i don't agree that it ought to be the user's choice; it's nvidia's technology that they researched and developed at their own cost and that gives them the right to disable it.

i'd even lend some credibility to their concern about having a non-CUDA card for rendering; prior to them disabling it w/driver 180.xx, i saw quite a few complaints from folks getting an inexpensive physX card w/ amd and complaining it was a worse experience than not having one at all, adding to the dislike for physX. ie "i spent $150 for a (blank) and it was a total waste of money - thanks nvidia!" (yes, that was some time ago - BEFORE they disabled it).

seriously, how many threads are posted in forums w/folks ranting about hard/software problems and it's really PEBKAC? so nvidia decided to avoid it; now that is the marketing move, avoiding a condition where your technology looks less than desirable.

and FYI, you don't sacrifice physX capability but you will take a big hit in performance. wink.gif

again it's funny how suddenly now people care about physX being disabled. oh, but that misinformation is just an excuse to spread FUD about them disabling dx12 features.


amiright?!? tongue.gif

Well again I would say that if a person bought an NVidia card, they bought the hardware to access that tech.

I didn't have much experience with dedicated PhysX cards in the past, though, nor did I look into it very far, so I don't know how bad it was. Regardless, I doubt NVidia had to deal with everyone complaining, as they just wouldn't support the setup. Disabling it just eliminated the possibility for people to even consider AMD if they wanted PhysX (to perform well). I also don't expect them to try to make PhysX work at all for other brands; in fact, the ability for it to work via CPU is pretty much charity, and I think they deserve kudos for even doing it.

Where I will take a turn in this discussion is that I don't see the advantage for NVidia to disable this multi-adapter capability. It's not some technology they have that AMD doesn't or vice-versa. It doesn't hinder their ability to move higher end products, and it will likely be extremely niche.
post #215 of 326
Quote:
Originally Posted by looniam View Post

i don't agree that it ought to be the user's choice; it's nvidia's technology that they researched and developed at their own cost and that gives them the right to disable it.
Man you don't understand basic things.

So I own an Nvidia card, let's say a GTX 750 Ti. If that's the only card in my system, I can use PhysX no problem in any game that supports it; but if I also put an AMD card in my system, Nvidia will disable PhysX capabilities via drivers in case I want to use the 750 Ti as a PhysX-only card.
So I don't see how your opinions are valid.
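The gating behavior described in the post above can be sketched as a simple policy check. This is hypothetical pseudologic only, not NVIDIA's actual driver code; the `gpu_physx_enabled` helper and the adapter tuples are invented for illustration:

```python
# Hypothetical sketch of the policy described above: GPU PhysX is only
# enabled when an NVIDIA card is present AND every rendering adapter in
# the system is NVIDIA, regardless of which card would run PhysX.
def gpu_physx_enabled(adapters):
    """adapters: list of (vendor, role) tuples, e.g. ("NVIDIA", "physx")."""
    has_nvidia = any(vendor == "NVIDIA" for vendor, _ in adapters)
    only_nvidia_rendering = all(
        vendor == "NVIDIA" for vendor, role in adapters if role == "render"
    )
    return has_nvidia and only_nvidia_rendering

# A lone GTX 750 Ti doing the rendering: PhysX allowed.
print(gpu_physx_enabled([("NVIDIA", "render")]))                     # True
# The same 750 Ti demoted to a PhysX card next to an AMD renderer: disabled.
print(gpu_physx_enabled([("AMD", "render"), ("NVIDIA", "physx")]))   # False
```

The complaint in the thread is precisely that the second case returns `False` even though capable PhysX hardware is present.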
post #216 of 326
Quote:
Originally Posted by Serios View Post

Man you don't understand basic things.

So I own an Nvidia card, let's say a GTX 750 Ti. If that's the only card in my system, I can use PhysX no problem in any game that supports it; but if I also put an AMD card in my system, Nvidia will disable PhysX capabilities via drivers in case I want to use the 750 Ti as a PhysX-only card.
So I don't see how your opinions are valid.

That's the way the cookie crumbles. IIRC, AMD was offered a license for PhysX AND CUDA, but they refused and chose to go with Havok. So quit blaming NVIDIA for everything when it was AMD's choice.

http://www.extremetech.com/computing/82264-why-wont-ati-support-cuda-and-physx
2010rig (14 items)
CPU: X5660 @ 4.5 · Motherboard: ASUS P6X58D-E · Graphics: 980TI? · RAM: 12GB OCZ Platinum 7-7-7-21
Hard Drive: 80GB SSD X25-M, 3TB F3 + F4 · Cooling: NH-D14 · OS: Windows 7 Ultimate · Monitor: LG 47LH55
Keyboard: Natural Wireless Keyboard · Power: Corsair 750HX · Case: CM 690 II Advanced · Mouse: MX 518

Galaxy S3 (8 items)
CPU: Snapdragon S4 dual-core 1500MHz · Graphics: Adreno 225 · RAM: Samsung 2GB · Hard Drive: 16GB onboard flash
OS: Android 4.4.2 - CM11 · Monitor: 4.8" AMOLED 1280x720 · Power: 2100mAh battery · Case: Otterbox Defender
post #217 of 326
Quote:
Originally Posted by Ultracarpet View Post

Well again I would say that if a person bought an NVidia card, they bought the hardware to access that tech.

I didn't have much experience with dedicated PhysX cards in the past, though, nor did I look into it very far, so I don't know how bad it was. Regardless, I doubt it was NVidia having to deal with everyone complaining as they just wouldn't support the setup. Disabling it just eliminated the possibility for people to even consider AMD if they wanted PhysX (to perform well). I also don't expect them to try and make PhysX work at all for other brands, in fact, the ability for it to work via CPU is pretty much charity and I think they deserve kudos for even doing it.

Where I will take a turn in this discussion is that I don't see the advantage for NVidia to disable this multi-adapter capability. It's not some technology they have that AMD doesn't or vice-versa. It doesn't hinder their ability to move higher end products, and it will likely be extremely niche.

it's nvidia saying they sell graphics cards (rendering), not physX cards. i don't know what the problem is with a manufacturer specifying the conditions under which their products are used.

but it seems not everyone is going to like that.

ok, fair enough. thumb.gif

Quote:
Originally Posted by Serios View Post

Man you don't understand basic things.

So I own an Nvidia card, let's say a GTX 750Ti. If that's the only card in my system I can use Physx no problem in any game that has support, if I also put an AMD cad in my system, Nvidia via drivers will disable Physx capabilities in case I want to use the 750Ti as a physx only card.
So I don't see how your opinions are valid.

i don't think you understand the concept of ownership. even if i sell you something that i own, i can still apply conditions on how you use it if i wish, and i can disable your use of what i sell you if you don't follow those conditions.

got it?
post #218 of 326
Quote:
Originally Posted by 2010rig View Post

That's the way the cookie crumbles. IIRC, AMD was offered a license for PhysX AND CUDA, but they refused and chose to go with Havok. So quit blaming NVIDIA for everything when it was AMD's choice.

http://www.extremetech.com/computing/82264-why-wont-ati-support-cuda-and-physx
That is irrelevant to what I said.
I don't see why AMD would have paid to license CUDA at that point, a tech 100% controlled by Nvidia. From a business point of view, they made the logical decision.

Now, I was saying that you can use PhysX with a crappy Nvidia card no problem, but suddenly, if you use an AMD card as the primary and the GTX as a PhysX card, Nvidia becomes concerned with your performance. Yeah, this excuse sounds really plausible.
post #219 of 326
Quote:
Originally Posted by looniam View Post

i don't think you understand the concept of ownership. even if i sell you something that i own, i can still apply conditions under how you use it if i wish. i can disable your use of what i sell you if you don't use it under those conditions.

got it?
I'm not saying they can't do it, I'm saying they are not right to do it, as opposed to you making up excuses and justifications for them.
Got it?
post #220 of 326
Quote:
Originally Posted by Serios View Post

That is irrelevant to what I said.
I don't see why AMD would have paid to license CUDA at that point, a tech 100% controlled by Nvidia. From a business point of view, they made the logical decision.

Now, I was saying that you can use PhysX with a crappy Nvidia card no problem, but suddenly, if you use an AMD card as the primary and the GTX as a PhysX card, Nvidia becomes concerned with your performance. Yeah, this excuse sounds really plausible.
Did you miss the part where they could've licensed PhysX? I'm not saying I agree with their decision, but that's business, man.

AMD chose Havok (owned by Intel) and went the OpenCL route; that decision alone drove me to NVIDIA in the first place.

A corporation's sole purpose is to make as much money as possible and gain any possible advantage over its competition. AMD attempted this with Mantle, remember? rolleyes.gif
Edited by 2010rig - 10/29/15 at 12:38am