
[pcper]AMD Talks Technical about FreeSync Monitors and First Impressions - CES2015 - Page 32  

post #311 of 396
Quote:
Originally Posted by Imouto View Post

It's not even funny how some people don't understand the basics of these technologies and come to FreeSync threads to spread FUD (pun intended) and brag about their lack of knowledge.

Seriously disappointed to see the same people from a year ago with the same arguments, trying to wreck every thread. Yet it's funny to see some others, ranging from shameless acknowledgement to shy redemption.

I agree, and would add that in GPU and GPU-related threads the quality of discussion has been sickeningly poor; no other area has had so much immature bickering. What I find most disappointing is that when a vendor delays a product, firmware or driver update, all hell breaks loose, but when the games that will run on this hardware are delayed, people praise it, saying "we want the devs to get it right even if it takes a bit longer". Does no one want hardware to work on release the same way software like games is expected to? Who cares if it takes a year and a company misses the boat? Because no one cares about it, or it's outdated, or it will never work as well as another solution? Isn't that fine? As a consumer, how does it matter?

/rant
post #312 of 396
Quote:
Originally Posted by Chakravant View Post

Wouldn't happen on G-Sync. That's the point.
Yes, it would happen. G-Sync cannot fix IPS panel flaws; if it could, there would be G-Sync IPS panels for sale already.
post #313 of 396
Excuse me: what makes you think G-Sync is the better solution? The one monitor that has it suffers from severe reverse ghosting and persistence due to a flawed ULMB implementation. The strobe should have been delayed by 4.5 ms to avoid persistence of the previous frame, so we are back at square one, imo.
Edited by mtcn77 - 1/10/15 at 4:13am
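(To put rough numbers on the strobe-timing claim above, here is a back-of-the-envelope sketch. The refresh rate, pixel settle time and strobe width below are my own illustrative assumptions, not measurements of the monitor in question; the point is only that a strobe fired before the pixels finish transitioning lights up a blend of the previous frame.)

Code:
# Hedged sketch: when can a ULMB-style strobe fire without lighting up
# remnants of the previous frame? Illustrative numbers only.

REFRESH_HZ = 120.0
FRAME_MS = 1000.0 / REFRESH_HZ   # ~8.33 ms per refresh (assumed ULMB mode)
SETTLE_MS = 4.5                  # assumed worst-case pixel transition time
STROBE_MS = 2.0                  # assumed strobe pulse width

def strobe_is_clean(delay_ms):
    """A strobe delayed `delay_ms` after scanout shows only the new frame
    if the pixels have settled, and it must end before the next refresh."""
    return delay_ms >= SETTLE_MS and delay_ms + STROBE_MS <= FRAME_MS

for delay in (0.0, 2.0, 4.5, 6.5):
    print(f"strobe delay {delay:>4.1f} ms -> clean frame: {strobe_is_clean(delay)}")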
post #314 of 396
Quote:
Originally Posted by Syan48306 View Post

The lack of G-Sync in the 32-inch range is forcing me into the FreeSync camp.

There are several announced.

How about the Philips 32" 4K with G-Sync?

http://www.overclock.net/t/1511085/hexus-philips-shows-32-34-40-4k-monitors-and-27-g-sync/0_50

How about the Asus 32" 4K with G-Sync?

http://hothardware.com/news/ASUS-PA328Q-32Inch-Monitor-Does-4K-At-60Hz-Over-DisplayPort-And-HDMI-20

etc?

Quote:
Originally Posted by PlugSeven View Post

Yes, it would happen. G-Sync cannot fix IPS panel flaws; if it could, there would be G-Sync IPS panels for sale already.

There are several IPS panels that will support G-Sync. They are due out ... about the same time as the IPS panels that support FreeSync, i.e. after CES.
post #315 of 396
Quote:
Originally Posted by error-id10t View Post

Nvidia went the way they did because, in their own words, FreeSync (or whatever you want to call it) doesn't work. Just Google what they were saying ~12 months ago.

Not to mention it was FASTER for them to get product to the actual people than to wait nearly 2 years to get a modification to a standard approved. As it stands, FreeSync isn't even a standard, it is an OPTIONAL part of a standard.

By developing your own standard, you control it. You don't have to wait for 2 years for it to be OPTIONAL. You don't risk putting your technology in some crap monitor that has a tight operational window that it will work in (say only from 40 to 60 like many of the FreeSync monitors will initially have), etc.
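(To make the "tight operational window" point concrete, here is a minimal sketch of how a driver might clamp frame rates to a 40-60 Hz variable-refresh window. The numbers and the fallback note are illustrative assumptions on my part, not how any particular driver or scaler actually behaves.)

Code:
# Hedged sketch: clamping a frame's refresh rate to an assumed 40-60 Hz
# variable-refresh window. Real drivers/scalers are far more involved.

VRR_MIN_HZ = 40.0
VRR_MAX_HZ = 60.0

def refresh_for_frame(frame_time_ms):
    """Return the refresh the panel would run at for this frame, plus a
    note about what happens when the rate falls outside the window."""
    rate = 1000.0 / frame_time_ms
    if rate > VRR_MAX_HZ:
        return VRR_MAX_HZ, "capped at the window's top"
    if rate < VRR_MIN_HZ:
        return VRR_MIN_HZ, "below the window: driver must fall back (VSync on/off, duplication)"
    return rate, "inside the VRR window"

for ft_ms in (10.0, 20.0, 30.0):      # 100 fps, 50 fps, ~33 fps
    print(f"{ft_ms} ms frame ->", refresh_for_frame(ft_ms))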

Further, you can make it work on a whole lot more older technology ... unlike AMD, which has already said it will only work on some of their APUs and the 260 and 290 cards. It will NOT work on the 270 or 280 series, or ANY of their 4-digit cards (like the 7xxx or 6xxx series).

Yeah it costs money to do it that way, but it has its advantages. And speaking of money, I wish people would stop with the $200 number. That is the END PRICE of the unit. People need to learn that PRICE DOESN'T EQUAL COST.

The PRICE of the module is $200. The COST is most likely about $50-75 (if that).


Further, IF FreeSync does take off and eclipse G-Sync, it won't take much for nVidia to make a patch for their system and use it.

Fact of the matter is, nVidia looked at this technology before AMD even "invented it" (which they didn't; it was a side effect of eDP, and AMD just scrambled, once again playing "catch up", when they saw that nVidia had released G-Sync). To think that "Johnny-come-lately to the party" AMD actually has an advantage in this is funny.
Edited by 47 Knucklehead - 1/10/15 at 4:19am
post #316 of 396
Quote:
Originally Posted by 47 Knucklehead View Post

Not to mention it was FASTER for them to get product to the actual people than to wait nearly 2 years to get a modification to a standard approved. As it stands, FreeSync isn't even a standard, it is an OPTIONAL part of a standard.

By developing your own standard, you control it. You don't have to wait for 2 years for it to be OPTIONAL. You don't risk putting your technology in some crap monitor that has a tight operational window that it will work in (say only from 40 to 60 like many of the FreeSync monitors will initially have), etc.

Further, you can make it work on a whole lot more older technology ... unlike AMD, which has already said it will only work on some of their APUs and the 260 and 290 cards. It will NOT work on the 270 or 280 series, or ANY of their 4-digit cards (like the 7xxx or 6xxx series).

Yeah it costs money to do it that way, but it has its advantages. And speaking of money, I wish people would stop with the $200 number. That is the END PRICE of the unit. People need to learn that PRICE DOESN'T EQUAL COST.

The PRICE of the module is $200. The COST is most likely about $50-75 (if that).


Further, IF FreeSync does take off and eclipse G-Sync, it won't take much for nVidia to make a patch for their system and use it.

Fact of the matter is, nVidia looked at this technology before AMD even "invented it" (which they didn't; it was a side effect of eDP, and AMD just scrambled, once again playing "catch up", when they saw that nVidia had released G-Sync). To think that "Johnny-come-lately to the party" AMD actually has an advantage in this is funny.

I'd rather spend those 200 extra dollars on a brand new GPU than on a vendor-locked, dead-end G-Sync module.
Edited by caswow - 1/10/15 at 5:03am
post #317 of 396
Quote:
Originally Posted by Syan48306 View Post

What happens when the frame rate on a 60 Hz monitor dips below 30 fps? VSync is unable to do anything about frame rates below 30 fps. It's no good when it comes to fixing tears.
Huh? VSync never tears. In fact, the whole "frame duplication" of G-Sync at low FPS is essentially what VSync does: the monitor keeps re-scanning the same frame until the new one is ready.
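(A minimal simulation of what zalbard describes, under the assumption of plain double-buffered VSync on a 60 Hz panel with a renderer that takes 40 ms per frame; the numbers are mine, chosen only to show the repeat behaviour.)

Code:
# Hedged sketch: double-buffered VSync on a 60 Hz display. The panel
# always scans out the last *completed* frame, so a slow renderer just
# causes repeats (and 60/N quantisation), never a torn frame.

REFRESH_MS = 1000.0 / 60.0   # ~16.7 ms between vblanks (assumed)
RENDER_MS = 40.0             # assumed render time per frame (25 fps source)

displayed = 0                # frame currently on screen
remaining = RENDER_MS        # work left on the next frame

for vblank in range(10):
    print(f"vblank {vblank}: showing frame {displayed}")
    remaining -= REFRESH_MS            # GPU renders during this interval
    if remaining <= 0:                 # next frame is ready; swap at vblank
        displayed += 1
        remaining = RENDER_MS          # with strict VSync, the next frame
                                       # only starts after the swap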
post #318 of 396
Quote:
Originally Posted by 47 Knucklehead View Post


There are several IPS panels that will support G-Sync. They are due out ... about the same time that the IPS panels that support FreeSync are due out. ie after CES.
I'm well aware of that. He claimed that there would be no ghosting on the 4K 60 Hz Samsung IPS panel with G-Sync, which is just plain wrong.
Quote:
Originally Posted by 47 Knucklehead View Post

Not to mention it was FASTER for them to get product to the actual people than to wait nearly 2 years to get a modification to a standard approved. As it stands, FreeSync isn't even a standard, it is an OPTIONAL part of a standard.
Looks like we have a much broader choice of FreeSync monitors coming out of the gate than G-Sync has, and it's been available for more than a year. As of right now, the only G-Sync monitor worth getting is the ROG Swift, and it's plagued with RMAs, along with being a TN panel.
Quote:
By developing your own standard, you control it. You don't have to wait for 2 years for it to be OPTIONAL. You don't risk putting your technology in some crap monitor that has a tight operational window that it will work in (say only from 40 to 60 like many of the FreeSync monitors will initially have), etc.
Those are IPS and/or 4K panels, hardly what I would call crap; to each their own, I guess.
Quote:
Further, you can make it work on a whole lot more older technology ... unlike AMD, which has already said it will only work on some of their APUs and the 260 and 290 cards. It will NOT work on the 270 or 280 series, or ANY of their 4-digit cards (like the 7xxx or 6xxx series).
It's still only Kepler and Maxwell cards that do variable refresh, and not all of them at that.
Quote:
Yeah it costs money to do it that way, but it has its advantages. And speaking of money, I wish people would stop with the $200 number. That is the END PRICE of the unit. People need to learn that PRICE DOESN'T EQUAL COST.

The PRICE of the module is $200. The COST is most likely about $50-75 (if that).
Agreed; let's talk instead about how a G-Sync-equipped monitor costs nearly twice as much as a non-G-Sync monitor of the same spec.
post #319 of 396
Quote:
Originally Posted by 47 Knucklehead View Post

Quote:
Originally Posted by Syan48306 View Post

The lack of G-Sync in the 32-inch range is forcing me into the FreeSync camp.

There are several announced.

How about the Philips 32" 4K with G-Sync?

http://www.overclock.net/t/1511085/hexus-philips-shows-32-34-40-4k-monitors-and-27-g-sync/0_50

How about the Asus 32" 4K with G-Sync?

http://hothardware.com/news/ASUS-PA328Q-32Inch-Monitor-Does-4K-At-60Hz-Over-DisplayPort-And-HDMI-20

etc?

Man, you're wrong on both links. The first link covers Philips 32", 34" and 40" 4K monitors; the 27-inch they're also releasing is the only G-Sync one. The second covers a 32-inch 4K professional 10-bit monitor AND a separate 28-inch 4K G-Sync monitor. Just because the same article mentions a G-Sync monitor doesn't mean they're all G-Sync. There's not a single 32-inch G-Sync monitor.


Quote:
Originally Posted by zalbard View Post

Quote:
Originally Posted by Syan48306 View Post

What happens when the frame rate on a 60 Hz monitor dips below 30 fps? VSync is unable to do anything about frame rates below 30 fps. It's no good when it comes to fixing tears.
Huh? VSync never tears. In fact, the whole "frame duplication" of G-Sync at low FPS is essentially what VSync does: the monitor keeps re-scanning the same frame until the new one is ready.

The video that I showed proved that with VSync disabled, when the frame rate drops below that monitor's minimum cutoff of 40 Hz, it tears. There is a nuance of difference between the G-Sync and FreeSync implementations with respect to VSync that everyone seems to be missing. FreeSync will fall back to VSync and drop to 30 fps, 20 fps and 15 fps, as mentioned earlier. G-Sync will maintain an artificial 30 fps by selective frame duplication.
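(A rough way to see the difference described above, using my own simplified model and assumed numbers: with a fixed 60 Hz panel and double-buffered VSync the displayed rate quantises to 60/N, i.e. 30/20/15 fps, while a G-Sync-style fallback can repeat each rendered frame just enough times to keep the panel scanning above an assumed 30 Hz minimum.)

Code:
# Hedged sketch of the two low-fps fallbacks. All numbers are assumptions.
import math

PANEL_HZ = 60        # fixed refresh for the VSync case
PANEL_MIN_HZ = 30    # assumed minimum refresh a VRR panel can hold

def vsync_displayed_fps(render_fps):
    """Double-buffered VSync: each frame is held a whole number of
    refreshes, so the displayed rate quantises to PANEL_HZ / N."""
    return PANEL_HZ / math.ceil(PANEL_HZ / render_fps)

def duplication_scan_hz(render_fps):
    """G-Sync-style fallback: repeat each frame enough times that the
    panel never scans slower than its minimum refresh."""
    return render_fps * math.ceil(PANEL_MIN_HZ / render_fps)

for fps in (45, 28, 22, 16):
    print(f"{fps} fps rendered -> VSync shows {vsync_displayed_fps(fps):.0f} fps, "
          f"duplication keeps the panel at {duplication_scan_hz(fps):.0f} Hz")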



When I still owned my three ROG Swifts, playing Crysis 3 on max across all of the monitors WILL drop you below 40 and even 30 fps, even if you are running a triple-SLI setup. Again, I'm not trying to bash FreeSync randomly. I'd like it to work exactly the same as, if not better than, G-Sync, but because of the differences between the technologies, there will be differences in experience. Some people just don't seem to get that.
post #320 of 396
Quote:
Originally Posted by mtcn77 View Post

Excuse me: what makes you think gsync is the better solution? The one monitor that has it suffers from severe reverse ghosting & persistence due to wrong ULMB implementation. The strobe should have been delayed 4.5 ms to avoid persistence of previous frame, so we are at square one, imo.

Are you talking about the ROG Swift? Source?

Also, there are multiple G-Sync/ULMB monitors.