Overclock.net › Forums › Graphics Cards › AMD/ATI › Why Apple choose to use AMD graphic card but not the other one?

Why Apple choose to use AMD graphic card but not the other one? - Page 7

post #61 of 77
Quote:
Originally Posted by dmasteR View Post

Full color range was only an issue over HDMI, where the driver defaulted to limited range instead of full (who knows why).

http://www.pcper.com/reviews/Graphics-Cards/Battlefield-3-Beta-Performance-Testing-and-Image-Quality-Evaluation-Day-1/Ima

It's been tested who knows how many times, since it always gets brought up. Texture quality is exactly the same; the default color settings, however, are different. That's been well known since back in the ATi days.
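For anyone curious what "limited vs. full" actually does to the pixels: limited-range (TV) RGB maps black to 16 and white to 235 instead of 0 to 255, so a driver stuck in limited mode on a full-range monitor shows washed-out blacks. A rough sketch of the expansion, assuming plain 8-bit values (an illustration of the math, not any vendor's actual driver code):

```python
def limited_to_full(v):
    """Expand one 8-bit limited-range (16-235) value to full range (0-255)."""
    # Scale so 16 maps to 0 and 235 maps to 255, clamping out-of-range inputs.
    scaled = (v - 16) * 255 / (235 - 16)
    return max(0, min(255, round(scaled)))

# Limited-range "black" (16) becomes true black; "white" (235) becomes 255.
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255
```

This is why the "AMD looks better" impression often came down to one card defaulting to full range and the other to limited, not to any difference in rendering.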

Quote:
Originally Posted by mcg75 View Post

Image quality again? Really?

Look to the past. There were instances where it was true, and shots were fired from both sides over it.

Today? Nothing.

AMD could ruin Nvidia with this, yet what do we hear? Nothing.

AMD had no smoking gun on GameWorks, yet they used it against Nvidia every chance they got.

FreeSync, which really did nothing to help AMD except as ammunition against Nvidia, got lots of press.

But here they should have indisputable proof according to everyone in this thread... yet nothing.

Think about it.

Think about what? You want proof?
https://www.amd.com/Documents/firepro-10-Bit-whitepaper.pdf

People are comparing gaming results when Apple is clearly not targeting gamers.
Edited by Redwoodz - 8/18/16 at 3:31pm
post #62 of 77
Quote:
Originally Posted by Redwoodz View Post


Think about what? You want proof?
https://www.amd.com/Documents/firepro-10-Bit-whitepaper.pdf

People comparing gaming results when Apple is clearly not for gamers.

What exactly is this proof of? Both companies' workstation cards support true 10-bit color.

Some gaming cards also support it. But if you think the people talking about image quality here are using true 10-bit panels to actually see the benefit, you're mistaken.
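For a sense of scale, the practical difference between 8-bit and 10-bit is the number of shades per channel, which is what determines whether a smooth gradient shows banding. A quick back-of-the-envelope in Python (plain arithmetic; the 2160 is an assumed panel height for a 4K display):

```python
# Shades per channel and total colors at each bit depth.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} shades/channel, {levels ** 3:,} total colors")

# Rows spanned by each shade in a full-height gray ramp on a 2160-row panel:
# at 8-bit every shade covers several rows, which reads as visible banding.
print(2160 // 256)   # rows per shade at 8-bit
print(2160 // 1024)  # rows per shade at 10-bit
```

Which is also why the benefit only shows up on a true 10-bit panel with 10-bit content end to end; on an 8-bit chain the extra levels are simply thrown away.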
post #63 of 77
10-bit is supported even on my GTX 1060.

There's no visible difference when it's enabled.

I have an AOC 28" 4K monitor that supports 10-bit.
post #64 of 77
Quote:
Originally Posted by dmasteR View Post

https://hardforum.com/threads/fury-x-vs-titan-x-benchmark.1867421/
If image quality were indeed different, AMD's marketing team would have been all over it by now.

I like post #19 in that thread you linked. But the BF4 screenshot included there came from the Fury club here on OCN, which is this:

[screenshot from the OCN Fury club]

around 20 more fps. ;)
post #65 of 77
Nvidia didn't get along with Apple that well because the cards Nvidia sold to Apple burned out and destroyed computers.
Nvidia knew about it, by the way.
That's the main reason Apple pairs with AMD: you can't trust Nvidia.
post #66 of 77
Quote:
Originally Posted by mcg75 View Post

What exactly is this proof of? Both companies' workstation cards support true 10-bit color.

Some gaming cards also support it. But if you think the people talking about image quality here are using true 10-bit panels to actually see the benefit, you're mistaken.


The point is that all of AMD's cards support it, though it's only enabled on FirePros, while previously only Quadros had the capability, and through CUDA rather than OpenCL, which Apple prefers.
post #67 of 77
This is absurd. Better image quality? Have you ever heard of default color settings? Go into the control panel and change the colors however you please. Also, change your monitor's color settings.

I think it's hilarious that there are threads of people talking about image quality who all have different monitors, different GPUs, and most likely different presets, yet they come to the conclusion that one is better than the other. I have no facts to back up my claims; I'm just using common sense here. All graphical settings can be modified... are you judging default presets against default presets? That's a little crazy, eh?
post #68 of 77
Quote:
Originally Posted by Nilareon View Post

This is absurd. Better image quality? Have you ever heard of default color settings? Go into the control panel and change the colors however you please. Also, change your monitor's color settings.

I think it's hilarious that there are threads of people talking about image quality who all have different monitors, different GPUs, and most likely different presets, yet they come to the conclusion that one is better than the other. I have no facts to back up my claims; I'm just using common sense here. All graphical settings can be modified... are you judging default presets against default presets? That's a little crazy, eh?

 

Yeah, I think so too. Apples-to-apples comparisons should be made with calibrated displays.

post #69 of 77
Quote:
Originally Posted by Nilareon View Post

This is absurd. Better image quality? Have you ever heard of default color settings? Go into the control panel and change the colors however you please. Also, change your monitor's color settings.

I think it's hilarious that there are threads of people talking about image quality who all have different monitors, different GPUs, and most likely different presets, yet they come to the conclusion that one is better than the other. I have no facts to back up my claims; I'm just using common sense here. All graphical settings can be modified... are you judging default presets against default presets? That's a little crazy, eh?

 

I'm going by my own experience combined with the experiences that I've seen of lots and lots of other people during my time here on OCN. With my personal experience, it was an R9 290 vs. all of my GeForce cards that I ever had: an 8600 GTS, a 9800 GTX+, a GTX 260, a GTX 470, a GTX 580, and now a GTX 780. The R9 290 was my first AMD card and it had a noticeably superior image quality without making any changes whatsoever. I noticed it immediately when I turned my computer on after installing it: the motherboard's own POST splash screen looked better. Then I noticed it with Windows' own startup splash. Then I noticed it at the desktop. The only thing missing for me was, I had always used Digital Vibrance in the NVIDIA Control Panel, so I went searching for the equivalent. When I found it, I had an even better image quality.

 

Overall, it was noticeably less washed out (I had a clearer image) and even before increasing the saturation, the colors popped more and blacks and darks were blacker and darker. It was a pleasure.
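As an aside: roughly speaking, what a vibrance or saturation slider does is push saturation up while leaving hue and brightness alone. Here's a toy sketch using Python's standard colorsys module, purely to illustrate the idea — not how either vendor's driver actually implements it:

```python
import colorsys

def boost_saturation(r, g, b, factor=1.2):
    """Scale the HSV saturation of an RGB color (components in 0.0-1.0)."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(1.0, s * factor)  # clamp so we never exceed full saturation
    return colorsys.hsv_to_rgb(h, s, v)

# A washed-out red gains saturation; neutral gray (s == 0) is left untouched.
print(boost_saturation(0.8, 0.4, 0.4))
print(boost_saturation(0.5, 0.5, 0.5))
```

Grays pass through unchanged because their saturation is already zero, which is why a saturation boost makes colors "pop" without shifting the overall white balance.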

 

So it's not absurd. It wasn't due to changing settings. It wasn't the monitor either. It definitely wasn't presets. The only thing that changed was the video card. Oh, do you want to talk about the settings I had for my GTX 580? Sure! I did absolutely everything I could during the time I had it to improve everything yet the R9 290 didn't need anything changed, except for the saturation due to my personal preference. That's all.

 

Like you said, you don't have any facts. Do you have any experience with both? Is the only thing you have your "common sense"?

 

I should probably warn you that I'm not putting this up for debate. This is my statement of fact as I have experienced it (and as I've seen lots of others experience it) and that's that. Take it or leave it.


Edited by TwoCables - 8/19/16 at 9:49am
post #70 of 77
Quote:
Originally Posted by TwoCables View Post
 

 

I'm going by my own experience combined with the experiences that I've seen of lots and lots of other people during my time here on OCN. With my personal experience, it was an R9 290 vs. all of my GeForce cards that I ever had: an 8600 GTS, a 9800 GTX+, a GTX 260, a GTX 470, a GTX 580, and now a GTX 780. The R9 290 was my first AMD card and it had a noticeably superior image quality without making any changes whatsoever. I noticed it immediately when I turned my computer on after installing it: the motherboard's own POST splash screen looked better. Then I noticed it with Windows' own startup splash. Then I noticed it at the desktop. The only thing missing for me was, I had always used Digital Vibrance in the NVIDIA Control Panel, so I went searching for the equivalent. When I found it, I had an even better image quality.

 

Overall, it was noticeably less washed out (I had a clearer image) and even before increasing the saturation, the colors popped more and blacks and darks were blacker and darker. It was a pleasure.

 

So it's not absurd. It wasn't due to changing settings. It wasn't the monitor either. It definitely wasn't presets. The only thing that changed was the video card. Oh, do you want to talk about the settings I had for my GTX 580? Sure! I did absolutely everything I could during the time I had it to improve everything yet the R9 290 didn't need anything changed, except for the saturation due to my personal preference. That's all.

 

Like you said, you don't have any facts. Do you have any experience with both? Is the only thing you have your "common sense"?

 

I understand what you mean and I get where you're going. I'm just sad that Apple went consumer and not prosumer. "Colors that pop" is usually a sign that color accuracy is way off. But yeah, to a consumer, "colors that pop and look shiny" usually means the product is better, hence the glass plate on MacBooks.

 

The real reason Apple went with AMD, I think, is that every Nvidia-equipped MacBook series died. My girlfriend's 650M MacBook died just a few days ago and is still covered under the long-term "we had an Nvidia GPU crash" warranty.
