
480 vs 1060 - Ashes of the Singularity: why are the 1060 and 480 tied with each other? - Page 13

post #121 of 230
Quote:
Originally Posted by oxidized View Post

In what language do I need to tell you that the video is no proof? And besides that, there are next to no differences, just fps, except that where the fps is lower it weirdly looks a hair smoother.
This is mostly what you keep repeating to yourself, but AGAIN there's no proof of that, only another way to attack nvidia with pointless and false claims.
Ah yes... As predicted, the 'no difference' argument. Look, in reviews no one mentions forcing settings in the control panel. Chances are they leave it at default and only change the in-game settings, thinking that gives equal image quality. There is no evidence that they adjust these settings to achieve equal image quality. Conclusive proof? Not yet, but everything points in that direction. Multiple people have first-hand experience with the image quality difference, and there are clear differences in the video posted above.

Just leaving this here... Since apparently people are blind... Screenshots from the previously posted video. I took two examples, but there are more differences (obviously).
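For anyone who would rather check the stills than argue about eyes, here is a minimal Python sketch (using Pillow; the filenames are hypothetical placeholders for two aligned captures of the same frame, one at default settings and one with quality forced) that flags and amplifies per-pixel differences:

```python
# A minimal sketch, not a rigorous methodology: highlight per-pixel
# differences between two aligned screenshots of the same frame.
# Filenames are hypothetical placeholders.
from PIL import Image, ImageChops

default = Image.open("nvidia_default.png").convert("RGB")
quality = Image.open("nvidia_quality.png").convert("RGB")

diff = ImageChops.difference(default, quality)  # per-channel absolute difference
bbox = diff.getbbox()                           # None if pixel-identical

if bbox is None:
    print("No difference at all -- the captures are pixel-identical.")
else:
    print(f"Differences found within region {bbox}")
    # Amplify subtle texture-filtering differences so they become visible
    diff.point(lambda v: min(255, v * 8)).save("diff_amplified.png")
```

If the two captures really were identical, `getbbox()` would return `None`; any texture-filtering downgrade shows up as a non-empty region.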


post #122 of 230
Quote:
Originally Posted by oxidized View Post

And with such a statement, what do you think you're achieving? Unlike you, I don't care which side my next card comes from; I just care about proof and benchmarks, and I buy whichever one is better, AMD or nvidia. But since AMD has apparently been doing worse in that regard lately, you feel you need to find arguments to prove AMD is better despite all the benchmarks showing the opposite. You'll use whatever excuse you can find; one is DPC latency, a thing that came up recently and is food for guys like you. I honestly don't even know what it does, but reading the forums, very few people have had trouble with it, and I also read that nvidia is already taking care of it since it's a known problem.

I don't understand why everyone is attacking nvidia when they should be supporting AMD doing better, so AMD becomes more competitive and wins its market share back, nvidia drops its damn high prices, and we get much better and closer competition.

Attacking NVidia? We are just showing proof; it's just not good enough for you.

Like I said, don't worry about it.

In FPS games, you won't see this lower IQ, especially when moving fast. Just don't stop to take a breather.

post #123 of 230
Quote:
Originally Posted by NightAntilli View Post

He might not know a lot about the subject, but he did notice an image quality difference, and there are many who do. Don't try to dismiss his experience because he doesn't know the terminology or lacks knowledge of graphics; his experience is still valid.

nVidia's control panel default settings downgrade image quality, and that is well known. I do wonder how many reviewers actually check for this. My guess is none... See the video here...

And now I'll wait for the untrained eye to tell me that there's no difference, even though the FPS is about 10% lower with actual max settings...

You have your videos mixed up.

The one you just presented is where Gregster changed the default settings to prefer performance, which lowers image quality.

Of course there is a difference. Here's his post.
Quote:
Originally Posted by Gregster View Post

I also did a run with changing the NVCP to "Use my preference emphasising performance" over the default "let the 3D application decide".
post #124 of 230
Quote:
Originally Posted by mcg75 View Post

You have your videos mixed up.

The one you just presented is where Gregster changed the default settings to prefer performance, which lowers image quality.
So even the performance setting offers better image quality than the default 3D setting, then. Either that, or the emphasis is on quality...

Why does the video say the emphasis is on quality...?
post #125 of 230
Quote:
Originally Posted by NightAntilli View Post

Ah yes... As predicted, the 'no difference' argument. Look, in reviews no one mentions forcing settings in the control panel. Chances are they leave it at default and only change the in-game settings, thinking that gives equal image quality. There is no evidence that they adjust these settings to achieve equal image quality. Conclusive proof? Not yet, but everything points in that direction. Multiple people have first-hand experience with the image quality difference, and there are clear differences in the video posted above.

Just leaving this here... Since apparently people are blind... Screenshots from the previously posted video. I took two examples, but there are more differences (obviously).

You're calling everyone blind, but can't you see it's a different camera angle? And you're using such tiny details as an excuse; what's even the point? Using the forced option results in lower fps but a smoother experience, so what exactly are you saying?
Quote:
Originally Posted by rdr09 View Post

Attacking NVidia? We are just showing proof; it's just not good enough for you.

Like I said, don't worry about it.

In FPS games, you won't see this lower IQ, especially when moving fast. Just don't stop to take a breather.

N O P R O O F

You want me to tell you in Italian?
Edited by oxidized - 7/27/16 at 5:56am
post #126 of 230
Quote:
Originally Posted by oxidized View Post

You're calling everyone blind, but can't you see it's a different angled camera? And you're using such tiny details as excuse, what's even the point? Using the forced option results in a decrease in fps but in a smoother experience, so what are you saying exactly?
I specifically chose differences where camera angle and video compression would not matter. Your camera-angle argument is just an excuse to deny the obvious. You can see it live in the video, and the difference in quality is obvious if you know what to look for.
You were saying there was no difference; I showed where a few of the differences are. The whole discussion was about image quality. Now that you've been proven wrong, you're bringing up excuses about performance and re-asking what I'm trying to say, even though that has been obvious the whole time. If eyeballing compressed video is the sticking point, the comparison can be quantified; see the sketch at the end of the post.

Don't move the goalposts to suit your agenda. You claim not to care about either company, yet you defend nVidia like it's your own family.

Obviously we are still missing information. We need:
  • The quality at default settings for AMD cards
  • The quality with emphasis on quality for AMD cards
  • What reviewers use for their benchmarks


We now have:
  • Obvious difference between nVidia default setting and emphasis on quality settings
  • User experience of AMD having better quality at default settings.
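And since eyeballing compressed captures keeps getting dismissed, the comparison can be quantified instead of argued about. A minimal sketch using OpenCV and scikit-image, assuming two matched frames at identical resolution (the filenames are hypothetical stand-ins: one frame at nVidia's default setting, one with quality emphasized):

```python
# A minimal sketch, assuming two matched frames captured at identical
# resolution -- one at the default setting, one with quality forced.
# Filenames are hypothetical placeholders.
import cv2
from skimage.metrics import structural_similarity

default = cv2.imread("frame_default.png", cv2.IMREAD_GRAYSCALE)
quality = cv2.imread("frame_quality.png", cv2.IMREAD_GRAYSCALE)

# SSIM of 1.0 means identical; texture-filtering downgrades pull it down
score, diff_map = structural_similarity(default, quality, full=True)
print(f"SSIM: {score:.4f}")

# Save a map of where the frames disagree (brighter = bigger difference)
cv2.imwrite("ssim_diff.png", ((1.0 - diff_map) * 255).astype("uint8"))
```

A score like that removes the "your eyes vs my eyes" problem entirely: two renders of the same scene at genuinely equal quality should score near 1.0 regardless of who is looking.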
post #127 of 230
Quote:
Originally Posted by NightAntilli View Post


Why does the video say the emphasis is on quality...?

I don't know. I believe Gregster stated he forgot to change the title when he recorded it.

But he presented two videos in that thread. This would be the one you are looking for as it's default quality for both sides.
Quote:
Originally Posted by Gregster View Post


Right chaps, the best quality I could get. Recorded at 1080p and 60 fps using Raptr for AMD and ShadowPlay for Nvidia.

Again, nothing has been touched on either side (not even the colours on the AMD side of things).

DDU - installed AMD drivers, recorded with Raptr
DDU - installed Nvidia drivers, recorded with ShadowPlay
Nothing changed anywhere - recorded at 30 Mbps and 1080p on both, set for 60 fps recording.

I didn't even alter the colour settings this time for AMD and left everything at default. Thoughts?
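As an aside, comparing the two recordings frame-for-frame is more reliable than watching them side by side. A rough sketch with OpenCV, assuming both captures start at the same moment and run at the same frame rate (the filenames and frame index are placeholders):

```python
# A rough sketch: grab the same frame index from both captures so the
# stills can be compared directly. Filenames and index are placeholders.
import cv2

def grab_frame(video_path: str, frame_index: int, out_path: str) -> None:
    """Seek to frame_index in video_path and save that frame as an image."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, frame_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError(f"Could not read frame {frame_index} from {video_path}")
    cv2.imwrite(out_path, frame)

# ~30 seconds into each 60 fps recording
grab_frame("amd_raptr.mp4", 1800, "amd_frame1800.png")
grab_frame("nvidia_shadowplay.mp4", 1800, "nvidia_frame1800.png")
```

The extracted stills can then be diffed or scored with SSIM as sketched earlier in the thread.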
post #128 of 230
I'm gonna leave this here since that's the subject at hand.

TESTING NVIDIA VS. AMD IMAGE QUALITY
Quote:
Important Benchmarking Issues and Questionable Optimizations
We are writing this blog post to bring broader attention to some very important image quality findings uncovered recently by top technology Web sites including ComputerBase, PC Games Hardware, Tweak PC, and 3DCenter.org. They all found that changes introduced in AMD’s Catalyst 10.10 default driver settings caused an increase in performance and a decrease in image quality. These changes in AMD’s default settings do not permit a fair apples-to-apples comparison to NVIDIA default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.

What Editors Discovered
Getting directly to the point, major German Tech Websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default “Quality” setting in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. AMD obtains up to a 10% performance advantage by lowering their default texture filtering quality according to ComputerBase.

AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.

Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to “High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).
Quote:
A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance, but did not alter IQ, was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ, without letting the user know, was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.

Both NVIDIA and AMD provide various control panel knobs to tune and tweak image quality parameters, but there are some important differences — NVIDIA strives to deliver excellent IQ at default control panel settings, while also ensuring the user experiences the image quality intended by the game developer. NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates. Similarly, with each new driver release, NVIDIA will not reduce the quality of default IQ settings, unlike what appears to be happening with our competitor, per the stories recently published.

We are glad that multiple top tech sites have published their comparative IQ findings. If NVIDIA published such information on our own, without third-party validation, much of the review and technical community might just ignore it. A key goal in this blog is not to point out cheats or “false optimizations” in our competitor’s drivers. Rather it is to get everyone to take a closer look at AMD’s image quality in games, and fairly test our products versus AMD products. We also want people to beware of using certain anisotropic testing tools with AMD boards, as you will not get image quality results that correspond with game behavior.

AMD promotes “no compromise” enthusiast graphics, but it seems multiple reviewers beg to differ.

We have had internal discussions as to whether we should forego our position to not reduce image quality behind your back as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.
post #129 of 230
Quote:
Originally Posted by NightAntilli View Post

I specifically chose differences where camera angle and video compression would not matter. Your camera-angle argument is just an excuse to deny the obvious. You can see it live in the video, and the difference in quality is obvious if you know what to look for.
You were saying there was no difference; I showed where a few of the differences are. The whole discussion was about image quality. Now that you've been proven wrong, you're bringing up excuses about performance and re-asking what I'm trying to say, even though that has been obvious the whole time. If eyeballing compressed video is the sticking point, the comparison can be quantified; see the sketch at the end of the post.

Don't move the goalposts to suit your agenda. You claim not to care about either company, yet you defend nVidia like it's your own family.

Obviously we are still missing information. We need:
  • The quality at default settings for AMD cards
  • The quality with emphasis on quality for AMD cards
  • What reviewers use for their benchmarks


We now have:
  • Obvious difference between nVidia default setting and emphasis on quality settings
  • User experience of AMD having better quality at default settings.

The camera argument isn't invalid; actually, it's more valid than anything you've said. Both videos are compressed, and a screenshot wouldn't make it very clear. There are differences in the pics you posted, but I'm pretty sure that if you ran the same experiment (maybe more accurately) you would notice no difference at all, besides maybe that sheet of paper (I feel ridiculous talking about that, honestly). And that said, if your bionic eye can see better than ours, you should also be able to notice that the forced-setting side is smoother on several occasions (despite the lower framerate).
post #130 of 230
Quote:
Originally Posted by ValSidalv21 View Post

I'm gonna leave this here since that's the subject at hand.

TESTING NVIDIA VS. AMD IMAGE QUALITY

Hmm, so Nvidia's website says AMD's default image quality is worse. Interesting. BTW, why are all their sources in German?

Here is the only one with actual screenshots:
https://www.computerbase.de/2010-10/bericht-radeon-hd-6800/5/#abschnitt_anisotrope_filterung_auf_hd_6800