480 vs 1060 - Ashes of the Singularity: why are the 1060 and 480 tying each other? - Page 12 - Overclock.net

post #111 of 230 Old 07-27-2016, 05:31 AM
rdr09
Join Date: Mar 2011
Location: NJ
Posts: 17,006
Liked: 844
If you are fine with what you see on your screen... don't worry about it. Besides, you got higher fps, right? ;)
post #112 of 230 Old 07-27-2016, 05:31 AM
ronnin426850
Programmer
Join Date: Sep 2009
Location: Europe
Posts: 9,735
Liked: 574
Quote:
Originally Posted by oxidized View Post

So it means that they won't risk it, what if it turns out to be fake or just not true, amd would destroy itself, and since their situation isn't very promising atm, why do this, maybe they will in the future if they start taking back market and have more proof, for now to me and most of people who tried both sides is just bs.

Jesus, try using sentences, man!

On topic, this is like watching children argue about who is cooler, Batman or Superman (let's imagine we're back in the '80s, when DC was still relevant). If AMD decided to go public with this, of course they wouldn't show THIS scrappy post; they'd run their own investigation and present their own proof, so there is no risk involved. "What if it turns out to be fake" just doesn't apply when you do your own research.

I suspect Nv are cheating a bit with IQ, but in such a minor way that even if it were proven, it would do no harm. Imagine AMD going to a press conference with "Look at THAT pixel... no, no, the one next to it. Yeah. That one should be one shade greener!" People would laugh at them for trying to make drama out of something nobody cares about, especially with how dynamic today's games are. And of course Nv would just use some lame excuse like a cable or monitor issue, which wouldn't hold up, but would still be enough for fanboys and regular folk alike to forget it ever happened.
LAKEINTEL likes this.

post #113 of 230 Old 07-27-2016, 05:50 AM
oxidized
New to Overclock.net
Join Date: Jul 2014
Location: Rome, IT
Posts: 1,121
Liked: 24
Quote:
Originally Posted by rdr09 View Post

If you are fine with what you see on your screen... don't worry about it. Besides, you got higher fps, right? ;)

The two times I've heard about such a thing were with AotS and this guy on the forum, so it's most likely BS. And again, you're just using this to attack Nvidia (because AMD is the good side) when of course there is zero proof, but you still keep saying the same stuff just to be at peace with yourself.
post #114 of 230 Old 07-27-2016, 05:54 AM
NightAntilli
Join Date: Aug 2015
Posts: 1,475
Liked: 264
Quote:
Originally Posted by oxidized View Post


"So it's simple for me I went from using a Power color HD7970 to a gigabyte G1 Gtx970. I honestly noticed far worse super-sampling, texture color and all over quality of my games"

I mean, read this. The guy doesn't even know what he's talking about, and you are just using it to prove yourself right; but you both just proved another thing...
He might not know a lot about the subject, but he did notice an image quality difference, and there are many who do. Don't try to dismiss his experience just because he doesn't know the terminology or lacks knowledge of graphics; his experience is still the same.

nVidia's control panel default settings downgrade image quality, and that is well known. I do wonder how many reviewers actually check for this. My guess is none... See a video here...


And now I'll wait for the untrained eye to tell me that there's no difference, even though FPS is about 10% lower with the actual max settings...
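For what it's worth, disputes like this don't have to stay subjective. Given two same-resolution screenshots captured at the same frame, a per-pixel diff quantifies any image quality gap. A minimal sketch in C++, assuming the public single-header stb_image library is available; the filenames are placeholders:

[CODE]
// iqdiff.cpp - compare two same-size screenshots pixel by pixel.
// Build: g++ -O2 iqdiff.cpp -o iqdiff   (with stb_image.h on the include path)
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc != 3) {
        std::fprintf(stderr, "usage: %s before.png after.png\n", argv[0]);
        return 1;
    }
    int wa, ha, wb, hb, n;
    unsigned char* a = stbi_load(argv[1], &wa, &ha, &n, 3);  // force RGB
    unsigned char* b = stbi_load(argv[2], &wb, &hb, &n, 3);
    if (!a || !b || wa != wb || ha != hb) {
        std::fprintf(stderr, "load failed or image sizes differ\n");
        return 1;
    }
    long long total = 0;
    int worst = 0;
    const long long count = 3LL * wa * ha;  // three channels per pixel
    for (long long i = 0; i < count; ++i) {
        int d = std::abs(int(a[i]) - int(b[i]));
        total += d;
        if (d > worst) worst = d;
    }
    std::printf("mean abs channel diff: %.4f / 255, worst channel diff: %d\n",
                (double)total / count, worst);
    stbi_image_free(a);
    stbi_image_free(b);
    return 0;
}
[/CODE]

A mean near zero with a worst case of a point or two would be quantisation-noise territory; systematic filtering or LOD differences would show up well above that.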
post #115 of 230 Old 07-27-2016, 05:55 AM
HaiderGill
Join Date: Jul 2016
Posts: 176
Liked: 4
Quote:
Originally Posted by danjal View Post

Also, don't forget that of the three DX12 titles out right now, two are sponsored by AMD. Go figure.

Yes, but since NVidia exited the console market, AMD hardware is what games are developed on. Console development is low-level, so you get more intimate with the actual workings of the hardware and the optimisation of your code. PC is a niche market, so devs don't get as intimate with the hardware; you don't have the money (and money = time) to spend on that, coupled with the fact that DirectX before 12 didn't really provide an interface for it. Now that DirectX 12 provides a low-level interface, and the software is originally written (for consoles) and optimised for AMD GCN, it will naturally run better on GCN graphics cards. NVidia should have stayed in the console business, but for that you need x86-64 APUs, and Intel isn't really interested; server CPUs make money, and graphics are poor revenue generators.
post #116 of 230 Old 07-27-2016, 06:00 AM
rdr09
Join Date: Mar 2011
Location: NJ
Posts: 17,006
Liked: 844
Quote:
Originally Posted by oxidized View Post

The two times I've heard about such a thing were with AotS and this guy on the forum, so it's most likely BS. And again, you're just using this to attack Nvidia (because AMD is the good side) when of course there is zero proof, but you still keep saying the same stuff just to be at peace with yourself.

Don't worry about it.
post #117 of 230 Old 07-27-2016, 06:10 AM
NightAntilli
Join Date: Aug 2015
Posts: 1,475
Liked: 264
And to add to my last post, even when using max settings, nVidia likely has slightly lower image quality. Their memory compression is lossy, not lossless.
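A note on terms here: "lossless" has a testable meaning; decompressing must reproduce the input bit for bit. Below is a minimal sketch of delta encoding, the general technique usually cited for GPU color compression. It is purely illustrative and not any vendor's actual hardware scheme:

[CODE]
#include <cassert>
#include <cstdint>
#include <vector>

// Delta-encode a scanline of 8-bit color values: keep each value's
// difference from its predecessor. Wrapping uint8_t arithmetic makes
// the transform exactly invertible, i.e. lossless.
std::vector<uint8_t> delta_encode(const std::vector<uint8_t>& px) {
    std::vector<uint8_t> out(px.size());
    uint8_t prev = 0;
    for (size_t i = 0; i < px.size(); ++i) {
        out[i] = static_cast<uint8_t>(px[i] - prev);  // wraps mod 256
        prev = px[i];
    }
    return out;
}

std::vector<uint8_t> delta_decode(const std::vector<uint8_t>& deltas) {
    std::vector<uint8_t> out(deltas.size());
    uint8_t prev = 0;
    for (size_t i = 0; i < deltas.size(); ++i) {
        prev = static_cast<uint8_t>(prev + deltas[i]);
        out[i] = prev;
    }
    return out;
}

int main() {
    // Neighbouring pixels are usually similar, so deltas stay small and
    // cheap to store; the round trip is still bit-exact.
    std::vector<uint8_t> scanline = {200, 201, 201, 203, 180, 180, 181};
    assert(delta_decode(delta_encode(scanline)) == scanline);
    return 0;
}
[/CODE]

If a scheme like this were lossy, the assert would fail for some input; that is the whole dispute in one line.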
post #118 of 230 Old 07-27-2016, 06:12 AM
oxidized
New to Overclock.net
Join Date: Jul 2014
Location: Rome, IT
Posts: 1,121
Liked: 24
Quote:
Originally Posted by NightAntilli View Post

He might not know a lot about the subject, but he did notice an image quality difference, and there are many who do. Don't try to dismiss his experience just because he doesn't know the terminology or lacks knowledge of graphics; his experience is still the same.

nVidia's control panel default settings downgrade image quality, and that is well known. I do wonder how many reviewers actually check for this. My guess is none... See a video here...


And now I'll wait for the untrained eye to tell me that there's no difference, even though FPS is about 10% lower with the actual max settings...

In what language do I need to tell you that that video is no proof? And besides that, there are next to no differences, just fps; and where the fps is lower, it weirdly looks a hair smoother.
Quote:
Originally Posted by NightAntilli View Post

And to add to my last post, even when using max settings, nVidia likely has slightly lower image quality. Their memory compression is lossy, not lossless.

This is what you mostly keep repeating to yourself, but AGAIN there's no proof of that; it's just another way to attack Nvidia with pointless and false claims.
post #119 of 230 Old 07-27-2016, 06:16 AM
rdr09
Join Date: Mar 2011
Location: NJ
Posts: 17,006
Liked: 844
Quote:
Originally Posted by oxidized View Post

In what language do I need to tell you that that video is no proof? And besides that, there are next to no differences, just fps; and where the fps is lower, it weirdly looks a hair smoother.

I think what you should be concerned about is DPC latency. I've been reading that NVidia cards have been having this issue for quite some time; it was brought to light by the 10XX series due to their higher levels.
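DPC latency itself lives in the kernel and is properly measured with tools like LatencyMon or Windows Performance Analyzer. A crude user-mode proxy is to watch how far short sleeps overshoot, since heavy driver DPC/ISR activity shows up as wake-up jitter. A rough sketch only, not a substitute for the real tools:

[CODE]
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    using clock = std::chrono::steady_clock;
    std::vector<double> overshoot_us;  // how far past 1 ms each wake-up lands
    for (int i = 0; i < 1000; ++i) {
        auto t0 = clock::now();
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
        auto t1 = clock::now();
        double us = std::chrono::duration<double, std::micro>(t1 - t0).count();
        overshoot_us.push_back(us - 1000.0);
    }
    std::sort(overshoot_us.begin(), overshoot_us.end());
    std::printf("wake-up overshoot: median %.1f us, p99 %.1f us, max %.1f us\n",
                overshoot_us[500], overshoot_us[990], overshoot_us.back());
    return 0;
}
[/CODE]

Occasional spikes in the max with a steady median are the pattern people attribute to long DPCs; pinning it on a specific driver still takes xperf or LatencyMon.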
post #120 of 230 Old 07-27-2016, 06:30 AM
oxidized
New to Overclock.net
Join Date: Jul 2014
Location: Rome, IT
Posts: 1,121
Liked: 24
Quote:
Originally Posted by rdr09 View Post

I think what you should be concerned about is DPC latency. I've been reading that NVidia cards have been having this issue for quite some time; it was brought to light by the 10XX series due to their higher levels.

And with such a statement, what do you think you're doing? Unlike you, I don't care which side I'm buying my next card from; I just care about proof and benchmarks, and I get the one that's better, whether AMD or Nvidia. But since AMD is apparently doing worse in that regard lately, you feel like you need to find arguments to prove AMD is better despite all the benchmarks showing the opposite. You'll use whatever excuse you can find to say so; one is DPC latency, a thing that came out recently and is food for guys like you. I honestly don't even know what it does, but reading the forums, very few people have had trouble with it, and I also read that Nvidia is already taking care of it since it's a known problem.

I don't understand why everyone is attacking Nvidia when they should be supporting AMD in doing better, becoming more competitive, and winning its market back, so that Nvidia drops its damn high prices and we get much better, closer competition.