
[Various] AMD FreeSync Reviews - Page 113

post #1121 of 1757
Quote:
Originally Posted by Blameless View Post

I didn't really see anything biased in these reviews. The displays tested do indeed appear to be disabling overdrive when FreeSync is enabled. This is going to be relevant for some and is an issue that should not be glossed over.
Quite possibly not an AMD issue.

This.

It's an Adaptive-Sync issue that has to be addressed by monitor makers, not AMD.

But AMD shot themselves in the foot here. Had they stayed quiet about FreeSync until they had made sure Adaptive-Sync worked properly, they wouldn't be in this situation.

Up to this point, all we were hearing about FreeSync was it being advertised as equal to or better than G-Sync. Right now, that's false.

As for PCper, I have nothing but respect for those guys. The amount of crap they take because people perceive they lean one way or the other is ridiculous. I can easily identify with that.

When AMD stops providing product for them to review, then I'll take accusations of bias seriously.
post #1122 of 1757
Quote:
Originally Posted by looniam View Post

i'll just leave this here . .

The problem with this argument, in my judgement, is that if you are saying most cards can't max Crysis 3 at more than 40 FPS, then you are going to have a crappy gaming experience whether or not you use VRR. If you have a 970 chugging at 25 FPS, you really can't expect VRR to do a damn thing for you to start with...
Quote:
Originally Posted by Forceman View Post

25 FPS maybe not, but what about 50? You don't have to go all the way to the extreme. Plenty of people play 30 FPS on consoles.

At 50 FPS what is the problem with Freesync again???
Edited by Majin SSJ Eric - 3/28/15 at 7:21pm
post #1123 of 1757
25 FPS maybe not, but what about 50? You don't have to go all the way to the extreme. Plenty of people play 30 FPS on consoles.
post #1124 of 1757
Quote:
Originally Posted by Mand12 View Post

I find it troubling that some people make an accusation of bias when they don't like the result of a review.

When the results from certain reviewers always (and I do mean absolutely ALWAYS) paint one manufacturer in a negative light while nearly always portraying a competitor in a positive light, then yes, that is bias. Even if I buy everything that is being said about FreeSync right now, there is a certain tone to some "reviews" that shows little more than contempt for the technology, and I know there are plenty of guys here who totally agree with me.

For quick reference, here are the "Final Thoughts" of two PCPer reviews summing up the GTX 680 when it launched vs the 7970GE when it took the crown back. Notice anything different in the tone of these two conclusions (remember both cards were the undisputed fastest cards tested at the time of each review):

GTX 680:
Quote:
I think it goes without saying that the new GeForce GTX 680 2GB graphics card from NVIDIA really has impressed me in my short time with it. The performance of the new Kepler GPU is astounding as it was able to best the Radeon HD 7970 in the large majority of our tests and really only lost to it in one — Metro 2033. While some users might stress over the variability that it introduces, I think the GPU Boost technology is innovative and really helps the GPU stand up above the HD 7970 3GB in overall performance. Added bonuses like Adaptive VSync and Frame Rate Targeting add to the differentiation of the GTX 680 and the long-awaited ability to run NVIDIA Surround gaming configurations on a single GPU really completes the story.

7970GE:
Quote:
I hate to not come to a very firm conclusion here with the HD 7970 GHz Edition review like I did with the NVIDIA GeForce GTX 680 article, but there are a few absolutes to be considered. If you want the fastest single-GPU graphics card, then the Radeon HD 7970 GHz Edition is it. If you want a card that is both fast without running away with power consumption, the GTX 680 takes that angle. In terms of unique features and cool technology innovations on the software side, NVIDIA's team again gets the nod from me.

There's just an obvious undercurrent there, and it's that way with every single AMD vs Nvidia review I have ever seen on their site...
post #1125 of 1757
Look, I don't know much about PCPer, except that Ryan comes across as a smart guy, and probably is smarter than a number of other talking-head reviewers out there.
However, as someone standing back and watching this G-Sync/FreeSync debate unfold, I do find Petersen's claims of G-Sync superiority way too well synchronized with PCPer's critique of FreeSync, and they were the only reviewer that categorically laid the blame for FreeSync's problems on AMD rather than on the panel makers, without considering any other factors. Just an observation, and maybe it's just a matter of perception. But in the business world, perception of reputation is reality, so perhaps anyone being wrongly accused of bias should go the distance to ensure that independence is maintained, both in reality and in perception...
Just my 2 cents.
post #1126 of 1757
Quote:
Originally Posted by Majin SSJ Eric View Post

When the results from certain reviewers always (and I do mean absolutely ALWAYS) paint one manufacturer in a negative light while nearly always portraying a competitor in a positive light, then yes, that is bias. Even if I buy everything that is being said about FreeSync right now, there is a certain tone to some "reviews" that shows little more than contempt for the technology, and I know there are plenty of guys here who totally agree with me.

For quick reference, here are the "Final Thoughts" of two PCPer reviews summing up the GTX 680 when it launched vs the 7970GE when it took the crown back. Notice anything different in the tone of these two conclusions (remember both cards were the undisputed fastest cards tested at the time of each review):

GTX 680:
7970GE:
There's just an obvious undercurrent there, and it's that way with every single AMD vs Nvidia review I have ever seen on their site...

They seem pretty supportive of AMD here:

"AMD has put itself in a great position with the Radeon R9 290X based on the new Hawaii GPU. The R9 290X is faster than the GTX 780, just about on par with the GTX Titan, all while coming in at a price well under what NVIDIA has on the shelves today. It isn't perfect and the warts of noise, power, and heat will bother some. Still, users looking for a top performing single-GPU graphics card will find the R9 290X at the top of their list."

"If you love fast graphics cards, you are simply going to be infatuated with the new Radeon HD 7970. For the first time in a couple of generations, AMD will have the fastest single-GPU solution on the market - at least until we see what NVIDIA is going to do later in the year. The Tahiti GPU offers more than enough horsepower to push past the year-old GTX 580 and take the performance crown and is able to do so using less power than NVIDIA's GeForce option as well. With performance and efficiency this impressive we can easily see the upcoming Southern Islands based Radeon 7800 and 7700 cards offering just as compelling a solution to the graphics market.
Obviously we were hoping for a lower price on the Radeon HD 7970 - even if it isn't really justified based on today's market conditions. Yes yes, I know, you are getting better performance and twice the frame buffer of the GeForce GTX 580 (3GB vs 1.5GB), and for $50 that seems like a pretty reasonable offer for enthusiast gamers that want the best of the best."
Edited by Forceman - 3/28/15 at 7:40pm
post #1127 of 1757
Quote:
Originally Posted by FreeElectron View Post

  1. Dissecting G-Sync and FreeSync - How the Technologies Differ @7m47s
  2. Dissecting G-Sync and FreeSync - How the Technologies Differ @18m46s. That's where I stopped.



Those were the reasons why I don't trust / am skeptical about PCPer:
  1. Open-source, inspectable software should be used in this comparison, to ensure that the software does not cause any issues with a competitor's products and that it works evenly with both manufacturers' products.
  2. This did not logically make sense: while we can see that Nvidia doubles/multiplies the refresh rate, we can also see that AMD is holding the minimum acceptable refresh rate (acceptable/good according to their own claim at the beginning of the video, when they said FreeSync was working well within the refresh rate window). So the refresh rate and the whole measurement exercise are useless, because the only thing that matters (up to the point I watched) in this review is their own observation, not the measurement, since AMD did not fall below the monitor's minimum refresh rate.
    • G-Sync and ULMB do not work at the same time even in the latest (and best) gaming G-Sync monitor released yet.
    • I have also read somewhere that G-Sync and ULMB can't work together (generally), but I can't find the source right now.
    • Nvidia's claims are not to be trusted, especially after the recent 3.5GB VRAM "miscommunication".
    • So this point did not make sense either, because even if AMD can't have FreeSync and blur reduction working at the same time, neither can Nvidia.
Quote:

  • Those are the reasons why I became skeptical of PCPer's FreeSync review, and as a result became skeptical of (shady/unknown) reviews that make the same claims.
  • I am not saying those claims are incorrect, nor am I saying they are correct. I am just skeptical of the information I came across.
  • People seem to easily get offended when they are not trusted, which does not make sense (trust should be earned, not expected), so allow me to apologize if I seemed to offend anyone.

    I would have done so if a FreeSync monitor had blur reduction, an IPS panel, 1440p, and 120+Hz, but sadly there is no such monitor.
    I have also decided to avoid AMD's last-gen cards because of their noise/heat (which should show that my opinion has nothing to do with bias).
I would appreciate any real effort that will help further educate me.
What I have been able to read/see of your review (small portions of it) was not enough.
It would be appreciated if you are willing to spend some time and further clarify the issues I have trouble with in a simple, logical, and direct manner.
The demo is designed to showcase the difference between variable and fixed refresh rate monitors. You can select the framerate you want, or sweep between a low framerate and a high framerate. The demo itself has nothing to do with enabling or disabling FreeSync; you set that up with AMD's utility. You also get the exact same behavior in games and other demos, for example: https://www.youtube.com/watch?v=1jqimZLUk-c In this video, both panels are showing a framerate below their minimum allowable refresh rate. The G-Sync monitor preemptively repeats each frame, so that it can show the next frame on time, while FreeSync stays at 40Hz, letting frame completion and monitor refresh fall out of sync (which causes judder or tearing, depending on which fallback case you choose; in this case, v-sync off is selected).

If you think this is just something PCPer or Nvidia made up, please show me someone testing FreeSync below the monitor's VRR window and finding something different. As far as I know, everyone who tested FreeSync at low framerates found the same thing. Hell, even AMD said it reverts to v-sync on/off when your framerate goes below the monitor's minimum refresh rate. The surprising thing is that the refresh rate stays at the minimum, instead of going back to the max refresh rate, when the framerate is below the VRR window. Do you think PCPer faked the oscilloscope readings too? Because in my opinion that's pretty compelling evidence of a 40Hz refresh rate at framerates lower than 40fps.
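To put the two behaviours described above side by side, here's a rough Python sketch of the logic (my own illustration; the 40Hz floor and 144Hz ceiling are just example numbers, not anything pulled from an actual driver or from PCPer's tools):

Code:
# Rough sketch of the two low-framerate behaviours described above.
# Numbers are illustrative only, not taken from any real driver.

VRR_MIN_HZ = 40.0   # monitor's minimum variable refresh rate
VRR_MAX_HZ = 144.0  # monitor's maximum refresh rate

def gsync_style_refresh(fps):
    """Repeat each frame enough times to keep the panel inside its VRR window."""
    if fps >= VRR_MIN_HZ:
        return fps                       # refresh tracks framerate 1:1
    repeats = 2
    while fps * repeats < VRR_MIN_HZ:    # double/triple frames until back in the window
        repeats += 1
    return min(fps * repeats, VRR_MAX_HZ)

def freesync_launch_refresh(fps):
    """Below the window the panel just sits at its minimum refresh and tears/judders."""
    return fps if fps >= VRR_MIN_HZ else VRR_MIN_HZ

for fps in (60, 45, 35, 25, 15):
    print(f"{fps}fps -> G-Sync style: {gsync_style_refresh(fps):.0f}Hz, "
          f"FreeSync at launch: {freesync_launch_refresh(fps):.0f}Hz")

At 35fps the first approach refreshes at 70Hz (each frame shown twice), while the second stays pinned at 40Hz, which matches the behaviour PCPer described and measured.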
Edited by TranquilTempest - 3/28/15 at 11:38pm
post #1128 of 1757
Quote:
Originally Posted by Xuper View Post

I'm sure Nvidia knew about this beforehand, and someone (perhaps Tom Petersen, or anyone else) told Mr. malventano, and then he tested and saw this ghosting issue.
Mr. malventano, do not deny it; I bet you didn't know about the ghosting issue before reviewing the monitor. It must have been someone who told you.
Anyway, this issue must be fixed by AMD.
You're kidding, right? The ghosting is so freaking obvious it's not even funny. Ryan plugged in the first FreeSync display, I walked over to look at it, and as soon as we fired up the windmill demo, we both noted how badly it was ghosting.

To answer your question directly, we were told nothing by anyone at NV before we looked at FreeSync. Not one word.
post #1129 of 1757
Quote:
Originally Posted by Blameless View Post

I didn't really see anything biased in these reviews. The displays tested do indeed appear to be disabling overdrive when FreeSync is enabled. This is going to be relevant for some and is an issue that should not be glossed over.
Quite possibly not an AMD issue.

I agree completely with the post.
I would clarify that the technical issue may not be on AMD's side. However, these monitors have the FreeSync label on them, which unfortunately does make it an AMD problem.
The argument could be, and has been, made that AMD should ensure a certain level of quality for monitors using their branding.
I do realize that this is the first version/attempt of the AMD ecosystem bringing VRR to consumers; however, there is competition in the marketplace that it is being compared against.
post #1130 of 1757
Quote:
Originally Posted by Blameless View Post

I didn't really see anything biased in these reviews. The displays tested do indeed appear to be disabling overdrive when FreeSync is enabled. This is going to be relevant for some and is an issue that should not be glossed over.
Quite possibly not an AMD issue.

 

No, it's definitely an issue that AMD has saddled themselves with.

 

Putting the FreeSync logo on the box implies that it has been vetted and that someone has sat in front of the panel for a significant amount of time to make sure that it works. Granted, they might not be putting as much work into validation as Nvidia might be doing with their partners, but I think it's pretty obvious that this is the case, given that the DP1.2a+ specification does include caveats saying that display vendors need to put in enough memory to allow some complex optimisations to occur. As soon as I find the paper about it, I'll post up the relevant points here. What it points to is that AMD knew about these problems beforehand, but the first generation of FreeSync panels might have to have their behaviour fixed in drivers, while the second generation fixes the lingering issues from the launch models.

 

Quote:
Originally Posted by FreeElectron View Post
 
  1. Open-source, inspectable software should be used in this comparison, to ensure that the software does not cause any issues with a competitor's products and that it works evenly with both manufacturers' products.
  2. This did not logically make sense: while we can see that Nvidia doubles/multiplies the refresh rate, we can also see that AMD is holding the minimum acceptable refresh rate (acceptable/good according to their own claim at the beginning of the video, when they said FreeSync was working well within the refresh rate window). So the refresh rate and the whole measurement exercise are useless, because the only thing that matters (up to the point I watched) in this review is their own observation, not the measurement, since AMD did not fall below the monitor's minimum refresh rate.
    • G-Sync and ULMB do not work at the same time even in the latest (and best) gaming G-Sync monitor released yet.
    • I have also read somewhere that G-Sync and ULMB can't work together (generally), but I can't find the source right now.
    • Nvidia's claims are not to be trusted, especially after the recent 3.5GB VRAM "miscommunication".
    • So this point did not make sense either, because even if AMD can't have FreeSync and blur reduction working at the same time, neither can Nvidia.

 

I would appreciate any real effort that will help further educate me.

 

1) This is ridiculous to assert, because why would Nvidia spend time making a demo that deliberately hobbles the performance of their competitor's product? They already have the majority market share, and they already have the better VRR solution. It would be revealed pretty soon, through tools like FCAT, FRAPS, and high-speed video, if they were doing something like this, because it is just a simple demo of a swinging pendulum; there is nothing technically taxing about it. The pendulum demo was also available to download a while ago, IIRC, and no one noticed any glaring issues with AMD cards back then. Do you also complain about the use of FCAT just as much in other scenarios?

 

2) You're trusting what your own eyes see in the video, but you clearly haven't seen it working for yourself. You can actually test this out on your existing monitor. Run a game with FRAPS in the corner and set framerate caps in-game (easily done with TF2's console). If you're on a 60Hz monitor, drop the fps cap from 60 to 30 and then to 20fps. The smoothness clearly suffers with each drop, but the animation is more or less okay. If you then create a custom resolution with a 50Hz refresh rate and drop the fps cap to 25, the experience is more or less the same as running a 60Hz monitor with a 30fps cap in-game. That's basically what G-Sync is doing here: it doubles or triples the refresh rate to keep things looking more or less normal and to avoid judder and massive stutters or tears. Obviously, running a game at sub-24fps won't look good, but at least they're trying to preserve some fluidity.
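If you want to sanity-check that 60Hz/30fps versus 50Hz/25fps comparison, here's a quick back-of-the-envelope calculation (my own sketch, not PCPer's methodology). Both cases give a whole number of refreshes per frame, so pacing stays even and only persistence changes:

Code:
# Back-of-the-envelope check of the "whole number of refreshes per frame" idea.
# Purely illustrative numbers.

def refreshes_per_frame(refresh_hz, fps):
    return refresh_hz / fps            # refresh cycles each rendered frame spans

def frame_persistence_ms(fps):
    return 1000.0 / fps                # how long each rendered frame stays on screen

for refresh_hz, fps in ((60, 30), (50, 25), (60, 25), (40, 35)):
    ratio = refreshes_per_frame(refresh_hz, fps)
    print(f"{refresh_hz}Hz @ {fps}fps: {ratio:.2f} refreshes/frame, "
          f"{frame_persistence_ms(fps):.1f}ms per frame, "
          f"{'even pacing' if ratio == int(ratio) else 'uneven pacing (judder/tearing)'}")

60Hz@30fps and 50Hz@25fps both land on exactly 2 refreshes per frame, while something like a fixed 40Hz refresh with a 35fps framerate gives a fractional ratio, which is where the judder comes from.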

 

My hunch is that the scalers inside the FreeSync monitors don't have enough memory to perform the same trick as the G-Sync module, which could be why overdrive isn't working and why the panel doesn't change the refresh rate to improve how the content is being displayed. AMD can fix that in their drivers, but it will be several months before it is ironed out.

 

3) This is partly physics-related and partly related to how the display refreshes itself. You can have the strobe match the display rate, but you'd have to implement frametime smoothing to take out any severe dips or increases in framerate, because otherwise you'll run into situations where the display dims when it hits a low-fps situation, then suddenly increases in brightness when the framerate improves. Even with the smoothing, your display would still have varying brightness levels throughout your time gaming on it, so it's not ideal. Additionally, you will run into situations where the game engine hiccups and suddenly the framerate is out of sync; then you would have the strobe light up the display when it is blank, and turn itself off when there's actually a frame being drawn on the display.
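A quick toy calculation shows the brightness side of this: if the strobe pulse length stays fixed and fires once per refresh, the duty cycle (and therefore perceived brightness) falls as the refresh rate falls. The 2ms pulse width below is made up purely for illustration:

Code:
# Why a fixed-length strobe dims as the refresh rate drops: perceived brightness
# roughly tracks the duty cycle (on-time per refresh period). Numbers are made up.

STROBE_PULSE_MS = 2.0                     # backlight on-time per refresh

def duty_cycle(refresh_hz):
    period_ms = 1000.0 / refresh_hz
    return STROBE_PULSE_MS / period_ms

for hz in (144, 100, 60, 40):
    print(f"{hz}Hz: duty cycle {duty_cycle(hz):.1%}")

So a panel strobing at 144Hz would look roughly 3-4x brighter than the same strobe at 40Hz, which is why you'd need frametime smoothing or a variable pulse to keep brightness steady alongside VRR.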

 

4) As for blur reduction not working, I'll try to find the paper that discusses the specifications a display needs to support VRR according to the DP1.2a+ specification. One of the requirements is having enough memory for the scaler to perform frame analysis and work out how much it should increase or retard the overdrive to reduce blurring. It is something that can be fixed, but probably only in the second wave of monitors supporting Adaptive-Sync.

 

And finally, 

 

5) I'm pretty sure that hooking up an oscilloscope to a monitor, requesting custom software changes in a demo to point out differences in display technology, and taking the time to figure out when the scalers do their thing is a real effort on PC Perspective's part. I don't see anyone complaining about Allyn pointing out that OCZ is introducing something that looks like a hiccup when you're writing constantly to the Vector 180, which hasn't been caught in any other review of the drive that I've seen. These guys do go the extra mile when they have the equipment to do so.


Edited by CataclysmZA - 3/29/15 at 1:53am