
Why Apple choose to use AMD graphic card but not the other one? - Page 5

post #41 of 77
Image quality again? Really?

Look to the past. There are instances in the past where it was true and shots were fired from both sides regarding it.

Today? Nothing.

AMD could ruin Nvidia with this yet what do we hear? Nothing.

AMD had no smoking gun about Gameworks yet they used it against Nvidia every chance they got.

Freesync which really did nothing to help AMD except for using it against Nvidia got lots of press.

But here they should have indisputable proof according to everyone in this thread......yet nothing.

Think about it.
post #42 of 77

Apple used NVIDIA cards in the past. Really far back, the GeForce 2 and 4, etc. The MacBooks had the 8400/8600/9400 and 320M series; however, almost all of them developed issues over time with the solder going bad, or something along those lines.

 

They had NVIDIA for a long time and I don't know why they switched. Maybe because of cost, performance, or just bang for buck? Or simply better "driver" support.

post #43 of 77
Quote:
Originally Posted by mcg75 View Post

Image quality again? Really?

Look to the past. There are instances in the past where it was true and shots were fired from both sides regarding it.

Today? Nothing.

AMD could ruin Nvidia with this yet what do we hear? Nothing.

AMD had no smoking gun about Gameworks yet they used it against Nvidia every chance they got.

Freesync which really did nothing to help AMD except for using it against Nvidia got lots of press.

But here they should have indisputable proof according to everyone in this thread......yet nothing.

Think about it.

Yah, image quality again. lol
post #44 of 77
Regarding image quality: originally, Macs were for pro users, and they would calibrate their Macs with the graphics card in question. That meant that whether you had an NVIDIA or an AMD card, you would get the same color quality, since the monitor calibration adjusts for differences in saturation and so on. My dad has been working with Macs since their first model and has always used a calibrated monitor (in combination with either an ATi/AMD or an NVIDIA GPU) and achieved the same "image" quality every time, since this was important for working in print.
 
An image is an image; I think when we say image quality we really mean color accuracy, don't we? Then yes, one could suggest that AMD looks better, but that is in the eye of the beholder. A Mac or a Windows machine used professionally should always be paired with a calibrated screen, and then you will see no difference.
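
For what it's worth, color accuracy between two cards can be checked objectively rather than by eye. A rough Python sketch along these lines (hypothetical file names, needs scikit-image) would compare two captures of the same frame using the CIEDE2000 colour difference:

Code:
# Rough sketch only: quantify the colour difference between two screenshots
# of the same frame, one captured on each card. File names are made up.
import numpy as np
from skimage import io
from skimage.color import rgb2lab, deltaE_ciede2000

amd = rgb2lab(io.imread("amd_frame.png")[:, :, :3] / 255.0)
nv  = rgb2lab(io.imread("nvidia_frame.png")[:, :, :3] / 255.0)

de = deltaE_ciede2000(amd, nv)  # per-pixel CIEDE2000 colour difference
print("mean dE:", de.mean(), " 95th percentile dE:", np.percentile(de, 95))
# Rule of thumb: a mean dE around 1 or below is invisible to most people,
# which is roughly what a calibrated setup should give you on either card.
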
post #45 of 77
Quote:
Originally Posted by mcg75 View Post

Image quality again? Really?

Look to the past. There are instances in the past where it was true and shots were fired from both sides regarding it.

Today? Nothing.

AMD could ruin Nvidia with this yet what do we hear? Nothing.

AMD had no smoking gun about Gameworks yet they used it against Nvidia every chance they got.

Freesync which really did nothing to help AMD except for using it against Nvidia got lots of press.

But here they should have indisputable proof according to everyone in this thread......yet nothing.

Think about it.

That's a great point. We all know that AMD will grab on to anything they can show as a shortcoming to fire at nVidia. Whether it is a real shortcoming or only one they can tilt perception on has been inconsequential. Were this really the case, AMD would have been all over it.
post #46 of 77
Quote:
Originally Posted by ciarlatano View Post

That's a great point. We all know that AMD will grab on to anything they can show as a shortcoming to fire at nVidia. Whether it is a real shortcoming or only one they can tilt perception on has been inconsequential. Were this really the case, AMD would have been all over it.

Where is AMD? We are just a bunch of users sharing our opinion. lol
post #47 of 77
We recently had a user who found AMD to have better image quality, here:
http://www.overclock.net/t/1606942/cure-my-insanity-on-this-issue-going-back-to-amd/0_20

I did my best to analyze the disparity using a few examples in these posts:
http://www.overclock.net/t/1606618/480-vs-1060-ashes-of-the-singularity-why-is-the-1060-and-480-tying-each-other/100_20#post_25384132

http://www.overclock.net/t/1606618/480-vs-1060-ashes-of-the-singularity-why-is-the-1060-and-480-tying-each-other/120_20#post_25384232

http://www.overclock.net/t/1606618/480-vs-1060-ashes-of-the-singularity-why-is-the-1060-and-480-tying-each-other/120_20#post_25384488

http://www.overclock.net/t/1606618/480-vs-1060-ashes-of-the-singularity-why-is-the-1060-and-480-tying-each-other/140_20#post_25384652

http://www.overclock.net/t/1606618/480-vs-1060-ashes-of-the-singularity-why-is-the-1060-and-480-tying-each-other/200_20#post_25386009

Here's the full recap, copied from those links:
nVidia's control panel default settings downgrade image quality, and that is well known. I do wonder how many reviewers actually check for this. My guess is none... See the video here...

And now I'll wait for the untrained eye to tell me that there's no difference, even though the FPS is about 10% lower with actual max settings...

In reviews, no one mentions that they forced settings in the control panel. Chances are they leave it at default and only change the game settings, thinking that gives equal image quality. There is no evidence that they change these settings to achieve equal image clarity. Conclusive proof? Not yet, but everything indicates that this is the case. Multiple people have first-hand experience with the image quality difference, and there are clear differences in the video posted above.
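
If a reviewer wanted to sanity-check this, a rough sketch like the one below (made-up file names, needs scikit-image) would show whether the control panel setting actually changes what gets rendered on the same card:

Code:
# Illustration only: compare a capture taken at the control panel default
# against one taken at "High quality", same card, same frame.
from skimage import io
from skimage.color import rgb2gray
from skimage.metrics import structural_similarity as ssim

default_q = rgb2gray(io.imread("nv_default.png")[:, :, :3])
high_q    = rgb2gray(io.imread("nv_high_quality.png")[:, :, :3])

score = ssim(default_q, high_q, data_range=1.0)
print("SSIM:", score)  # 1.0 means identical; a clearly lower score means the
                       # driver setting really is changing the rendered output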

Screenshots from the previously posted video. I took two examples, but there are more differences (obviously).





Gregster's post:

Right chaps, the best quality I could get. Recorded at 1080P with 60fps using Raptr for AMD and ShadowPlay for Nvidia.

Again, nothing has been touched on either (not even the colours on the AMD side of things).

DDU - Installed AMD drivers, recorded with Raptr
DDU - Installed Nvidia drivers, recorded with ShadowPlay
Nothing changed anywhere - Recorded at 30 Mbps and 1080p on both, set for 60 fps recording.

I didn't even alter the colour settings this time for AMD and left everything at default. Thoughts?

End of Gregster's post....

My analysis of his video:
Default quality both sides... The Fury X has a more aggressive motion blur to it, which makes the comparison a bit more complicated based on a video rather than screenshots. But still, I was able to find out some things...

It seems the textures are the same; up close, I see no difference between them. However, AMD's AF seems to be superior to nVidia's, because from a distance the AMD side looks sharper (see the shots below). Aside from this, there's a slight difference in their colors. Hard to notice, but it's there... Let's look at the screenshots.

Look at this picture... The car is 'whiter' on the AMD card, but the difference is so small that most people won't notice. The reflection of the building is also 'more blue' on the AMD card. Also notice the shadow. AMD's shadows seem to be superior here...


Here you can also see a clear difference in color, but it's not really a color difference. You'll see in the next image that it's practically the same. nVidia's inferior AF implementation makes it appear blurry, washing the blues and whites out into each other. It could be argued that it's a difference in lighting, although from the video that does not appear to be the case. The board really is blurrier on nVidia.


As you can see, when closer, the boards look almost the same (the AMD one is a bit farther away).


And lastly, here the player was standing still. The difference in AF is clearly visible... Also notice that on the nVidia card he's closer to the spot I marked on the ground, yet it still looks blurrier.


So... More aggressive blur + better AF + sharper shadow edges would net some performance loss on the AMD card compared to nVidia. The colors... Too complicated to delve into them based on video screenshots only.
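
For anyone who wants to quantify the AF blur instead of squinting at stills, a quick-and-dirty sketch like this (hypothetical file names and crop coordinates, needs OpenCV) uses Laplacian variance as a sharpness measure on the same region of both captures:

Code:
# Sketch only: higher Laplacian variance = sharper crop, so the card with the
# blurrier distant texture should score lower. Coordinates are placeholders.
import cv2

def sharpness(path, box):
    x, y, w, h = box
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    crop = gray[y:y + h, x:x + w]  # the same distant-texture region in both shots
    return cv2.Laplacian(crop, cv2.CV_64F).var()

region = (800, 600, 200, 200)  # placeholder crop of the distant ground texture
print("AMD   :", sharpness("amd_still.png", region))
print("nVidia:", sharpness("nv_still.png", region))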

Remember that image quality difference is something that's easily missed... Want a recent example?

Mirror's Edge Catalyst. GTX 970 vs R9 390.

When setting the game to Hyper settings instead of Ultra, Digital Foundry concluded that the GTX 970 was a lot faster. They completely failed to notice that the GTX 970 was actually still running at Ultra rather than Hyper, so they were completely wrong about the R9 390 having worse performance at those settings.

And when reviewers are testing, they are under a lot of time pressure. They put one card in, test a bunch of games, and record the results. When they're done, they put in the other one, test a bunch of games, and record those results. The chances of noticing an image disparity this way are extremely small, if not nonexistent.

They did not mention checking the control panel settings, so we have to assume they use the defaults and only look for parity in the in-game settings. If the default settings give better image quality on one card than on the other, one is doing more work than the other, and thus the performance results are not reliable, since we're not testing to the same standard.

So right now we have:
  • Obvious difference between nVidia default setting and emphasis on quality settings
  • AMD having superior quality to nVidia in one game (BF4) at default control panel settings for both
  • User experiences of AMD having better image quality at default settings.

We still need:
  • The image quality with emphasis on highest quality for AMD cards, compared to their default setting
  • What reviewers use for their benchmarks
  • More game samples

People pretend to be interested, but when it comes to testing the differences, barely anyone actually wants to contribute and put it to the test...
Edited by NightAntilli - 8/18/16 at 6:21am
post #48 of 77
This quality difference is only with real-time rendering, right? Not pre-rendered images or editing. Mac users generally don't buy a Mac to game on anyway.
post #49 of 77
Apparently some people notice it even from the desktop... How true that is, I don't know. Haven't tested it.
I know that when I switched from my Geforce MX440 to a Radeon 9600 Pro I noticed a difference on the desktop itself. But I can't confirm that one was better than the other, just that they were different. But that was so long ago...
post #50 of 77
Quote:
Originally Posted by NightAntilli View Post

Apparently some people notice it even from the desktop... How true that is, I don't know. Haven't tested it.
I know that when I switched from my Geforce MX440 to a Radeon 9600 Pro I noticed a difference on the desktop itself. But I can't confirm that one was better than the other, just that they were different. But that was so long ago...

The BF4 comparison was debunked a while ago by the same user who made the original video.

Quote:
Originally Posted by mcg75 View Post

Image quality again? Really?

Look to the past. There are instances in the past where it was true and shots were fired from both sides regarding it.

Today? Nothing.

AMD could ruin Nvidia with this yet what do we hear? Nothing.

AMD had no smoking gun about Gameworks yet they used it against Nvidia every chance they got.

Freesync which really did nothing to help AMD except for using it against Nvidia got lots of press.

But here they should have indisputable proof according to everyone in this thread......yet nothing.

Think about it.

Yep exactly. No idea why the whole Image Quality thing gets brought up so much.