
Posts by sumitlian

Strange, Thief ran fine even on my 1 GHz 280X (1.02 V) at max settings. I played that game for a week; the benchmark shows 43 fps minimum, but AFAIR it never went below 45 fps in actual gameplay. Average fps was 60+ everywhere. Here is my benchmark of Thief. Note that the minimum fps at 900p DX is always lower than at 1080p DX; it looks like the fps is more stable under heavier load.
Thank you for this. It indeed looks great, but it would have looked even better in PNG; JPEG compression significantly reduces quality compared to PNG. You can see it for yourself: take a snapshot of the desktop (or anything else) with the Snipping Tool, save it in both formats, and compare them side by side. You'll definitely see the difference (a quick sketch of this check is below).
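For anyone who wants to verify this without eyeballing it, here is a minimal Python sketch, assuming Pillow is installed; the file names and quality value are placeholders, not anything from the post. It re-saves one screenshot in both formats and checks whether any pixels actually changed:

    # Re-save a screenshot as PNG and JPEG, then compare against the original.
    # "screenshot.png" and quality=85 are illustrative placeholders.
    from PIL import Image, ImageChops

    original = Image.open("screenshot.png").convert("RGB")

    # Save the same image losslessly (PNG) and lossily (JPEG).
    original.save("copy.png")
    original.save("copy.jpg", quality=85)

    png_copy = Image.open("copy.png").convert("RGB")
    jpg_copy = Image.open("copy.jpg").convert("RGB")

    # getbbox() returns None when two images are pixel-identical.
    png_diff = ImageChops.difference(original, png_copy).getbbox()
    jpg_diff = ImageChops.difference(original, jpg_copy).getbbox()

    print("PNG differs from original:", png_diff is not None)   # False: lossless
    print("JPEG differs from original:", jpg_diff is not None)  # usually True: lossy

The PNG copy round-trips bit-for-bit, while the JPEG copy almost always shows pixel differences, which is exactly the side-by-side comparison described above.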
Could you upload a screenshot of Ryse at 4K resolution in PNG format?
Yes yes, you're right about VSync, G-Sync and FreeSync. But I believe micro stuttering can still happen with all of those features if the GPU is outputting runt frames within its fps. I was talking more about internal frame rendering with tight physics synchronization (see the sketch below). Aren't VSync, G-Sync and FreeSync supposed to reduce tearing (not micro stuttering), which happens due to de-synchronization between the GPU's fps and the monitor's refresh rate? I agree both tearing and micro...
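A rough sketch of what I mean by tight physics synchronization, assuming the classic fixed-timestep loop pattern; the function names (update_physics, render) are placeholders, not any real engine's API:

    # Physics always advances in fixed 33.33 ms steps, while rendering
    # interpolates between the last two physics states, independent of
    # how unevenly the GPU delivers frames.
    import time

    PHYSICS_DT = 1.0 / 30.0  # fixed 33.33 ms physics step

    def game_loop(update_physics, render, run_for=5.0):
        previous = time.perf_counter()
        accumulator = 0.0
        end = previous + run_for
        while time.perf_counter() < end:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now

            # Step physics on a fixed interval, regardless of render fps.
            while accumulator >= PHYSICS_DT:
                update_physics(PHYSICS_DT)
                accumulator -= PHYSICS_DT

            # Render with an interpolation factor so motion stays smooth
            # even when frame delivery is uneven.
            render(accumulator / PHYSICS_DT)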
Most people here on OCN will not agree with me, but IMO if the frame time is a constant 33.33 ms for essentially 100% of gameplay time, then 30 fps can literally look smooth. Both physics and graphics rendering must hold that 33.33 ms interval exactly, neither shorter nor longer. Yes, I am talking about a perfectly straight line of frame times everywhere in a game, regardless of graphics or physics load. Even a 1% fluctuation of the 33.33 ms frame time in 1% of a second should not be...
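To make the "perfectly flat frame-time line" claim concrete, here is a minimal Python sketch; the tolerance and sample data are illustrative only. Given frame times in milliseconds from any frame-time logging tool, it reports how many frames stray more than 1% from the 33.33 ms target:

    TARGET_MS = 1000.0 / 30.0   # 33.33 ms per frame at 30 fps
    TOLERANCE = 0.01            # 1% allowed fluctuation

    def smoothness_report(frame_times_ms):
        off_target = [t for t in frame_times_ms
                      if abs(t - TARGET_MS) > TARGET_MS * TOLERANCE]
        pct = 100.0 * len(off_target) / len(frame_times_ms)
        print(f"{pct:.2f}% of frames deviate more than 1% from {TARGET_MS:.2f} ms")

    # Example: a nearly flat capture with a single 50 ms spike.
    smoothness_report([33.3] * 299 + [50.0])

By this standard, a capture only counts as "smooth 30 fps" when that reported percentage stays at essentially zero.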
Last time I checked, a review showed Nvidia already doing better than AMD on Linux. What encourages me is that AMD is doing well in this field too. Now we just need NG OpenGL to be supported by both AMD/Nvidia and game developers. 100% this.
Great news. +1.
No, no, you misunderstood me a little bit. You can put me on the same list as users who spend, or will spend, their money specifically for PhysX; likewise, I want to go for Mantle. Hell, Mantle is much more exciting than PhysX, IMO. No doubt Nvidia's wonder drivers must have been a great upgrade for users who were already on Nvidia GPUs and comparing against their previous drivers. But in real-world conditions in the latest games like BF4 and Watch Dogs, what I have...
I have no doubt that it will change. Future Nvidia drivers will change the performance, and so will AMD's future drivers. I am going to keep this line of yours as your belief regarding real-world gaming experience, and I'll mention it in upcoming gaming benchmarks where Nvidia stays above AMD.
After this news:

Best price/performance graphics card with 4 GB VRAM, fall 2014:
1. GTX 970
2. R9 290
3. R9 290X
4. GTX 980

Best performance-per-watt graphics card with 4 GB VRAM, fall 2014:
1. GTX 970
2. GTX 980
3. R9 290
4. R9 290X

My opinion: Mantle is what's stopping me from buying an Nvidia GPU. It was literally a great experience with BF4 and Thief in Mantle mode, especially in minimum fps; I've seen for myself more than 40% higher minimum fps in BF4 with Mantle than the competitive...