[TechSpot] Battlefield V DLSS Tested: Overpromised, Underdelivered - Page 4 - Overclock.net - An Overclocking Community

post #31 of 56 (permalink) Old 02-20-2019, 05:08 AM
New to Overclock.net
 
white owl
 
Join Date: Apr 2015
Location: The land of Nod
Posts: 5,374
Rep: 136 (Unique: 103)
Quote: Originally Posted by Defoler View Post
Note that the 1080 Ti FE was priced at $699 with the AIBs costing around $700-750 (not including special versions), and the 2080 FE is $799, but the AIBs start at $699 as well.
If you compare FE to FE, yes, a ~15% price increase. If you compare AIB to AIB prices, they stayed the same.
The 2080 is about a 10-15% increase in performance, depending on the game suite of the review.

If you move from a 980 Ti and the choices are 1080 Ti vs 2080, the clear choice is the 2080. If you have a 1080 Ti, you can decide whether the extra few bucks (used to new) are worth 10% more performance, or wait to see if the RTX features are worth it once reviewed. It can be a side upgrade, or it can be a big upgrade.
It's more of a statement on what Nvidia expects for certain tiers as time goes on than anything else. For the here and now you're right, but I refuse to be happy about the direction we're headed, where Nvidia jacks up prices justified by features made for an industry paying thousands per card. Features that trickle down into almost nothing tangible, since it would seem they've taken a step backwards in every way but raster. That's why I made the super long edit.
Sure, it sounds OK when you say you're paying more for better visuals.
But it sounds much worse when you point out why they gave them to you (to sell something that would normally be cut) and that you could have very similar IQ for a smaller hit (or possibly the same, hard to say) if someone had just bothered to use raster to do the same thing (in games without special hardware). They marketed it like it was the future "now", but I really think we could have had the same level of IQ the whole time if THAT'S the kind of performance hit that's acceptable for much better reflections and lighting. Because of all this RTX stuff we'll likely never see what was really possible with methods we already had; instead we're a new target for hardware made for a different purpose that wasn't quite good enough for the job. Now we're an afterthought with a bigger dollar sign.


Keep in mind I'm not claiming any of this as 100% fact, nor am I insulting anyone. I'd hate for someone who bought a 2080 Ti to take out any frustration on me.

Quote: Originally Posted by SpeedyVT
If you're not doing extreme things to parts for the sake of extreme things regardless of the part you're not a real overclocker.
Quote: Originally Posted by doyll View Post
The key is generally not which brands are good but which specific products are. Motherboards and GPUs are perfect examples of companies having everything from golden to garbage function/quality.
post #32 of 56 (permalink) Old 02-20-2019, 05:18 AM
mfw
 
ToTheSun!
 
Join Date: Jul 2011
Location: Terra
Posts: 7,147
Rep: 401 (Unique: 207)
Quote: Originally Posted by tubers View Post
Is DLSS better than checkerboarding?
Judging by the only implementations already present in games, no; it's in fact worse.

We still have to see whether everyone simply screwed up the implementation or this is as good as it gets. The only satisfactory examples of DLSS are Infiltrator and Port Royal, which are largely irrelevant.
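For reference, here's a minimal sketch of the checkerboard idea tubers is asking about — not DLSS, and not any shipping console implementation, just the core trick: freshly shade half the screen's 2x2 quads each frame and carry the other half over from the previous reconstructed frame. The function name and NumPy framing are mine:

```python
import numpy as np

def checkerboard_frame(rendered, prev_output, phase):
    """One step of naive checkerboard reconstruction.

    Each frame, only half the pixels (a checker pattern of 2x2 quads)
    are freshly shaded; the rest are carried over from the previous
    reconstructed frame. `rendered` stands in for the shaded pixels --
    a real renderer would only shade where the mask is True.
    """
    h, w = rendered.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    # Which 2x2 quads are "new" alternates with the frame phase (0 or 1).
    mask = ((yy // 2 + xx // 2) % 2) == phase
    out = prev_output.copy()
    out[mask] = rendered[mask]
    return out
```

For a static image the two phases converge to the full-resolution frame in two frames; motion is where real implementations need motion vectors and reconstruction heuristics, and that is exactly where the artifacts (and the comparison with DLSS) come from.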

post #33 of 56 (permalink) Old 02-20-2019, 06:42 AM
New to Overclock.net
 
Panzerfury
 
Join Date: May 2012
Posts: 171
Rep: 4 (Unique: 4)
Quote: Originally Posted by ku4eto View Post
A performance hit of 10-20% would be something we could probably have accepted. But not a 40-50% performance loss for negligible image quality improvement. On top of that, it costs a good amount more than previous-gen cards.
But it has always been known that ray tracing is VERY computationally expensive, so it's not really a surprise that it takes a toll on the GPU.
I'm sure it's the future, though; as hardware gets better, performance and IQ will increase.
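To put rough numbers on "VERY computationally expensive", a back-of-the-envelope ray budget — every figure here is an illustrative assumption, not a measured Turing spec:

```python
# Back-of-the-envelope ray budget for 1440p at 60 fps.
# All numbers are illustrative assumptions, not measured Turing specs.
width, height = 2560, 1440
fps = 60
samples_per_pixel = 1      # a single primary ray, no supersampling
bounces = 2                # say, one reflection ray + one shadow ray

rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
rays_per_second = rays_per_frame * fps
print(f"{rays_per_second / 1e9:.2f} billion rays per second")  # 0.66
```

Even this bare-bones budget lands within an order of magnitude of the ~10 Gigarays/s Nvidia's marketing quotes for the 2080 Ti, and each of those rays still needs BVH traversal, shading, and denoising on top, which is why the real-world frame-rate hit is so large.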
post #34 of 56 (permalink) Old 02-20-2019, 06:47 AM
New to Overclock.net
 
skupples
 
Join Date: Apr 2012
Location: Fort Lauderdale
Posts: 19,680
Rep: 563 (Unique: 321)
Quote: Originally Posted by ku4eto View Post
Yea, watched the Hardware Unboxed video yesterday. Now we get to see people trying to justify the RTX 2060/2070/2080. Only the 2080 Ti brings something to the table, and that is pure performance. Ray tracing tanking performance by 40% is not cool. Neither is a useless feature like DLSS.
It's a shame that they've hit a performance wall that's causing another generation of soon-to-be-forgotten software gimmicks.

Really, anyone on a 9xx or above should totally avoid the 2080 series and wait for the card that comes out after the next console releases. THAT'S the card that'll be the next ball buster.

R.I.P. Zawarudo, may you OC angels' wings in heaven.
If something appears too good to be true, it probably is.
post #35 of 56 (permalink) Old 02-20-2019, 06:57 AM
Defoler
Performance is the bible
 
Join Date: Apr 2009
Posts: 6,788
Rep: 437 (Unique: 301)
Quote: Originally Posted by white owl View Post
but I refuse to be happy about the direction we're headed where Nvidia jacks up prices, justified by features made for an industry paying thousands per card. Features that trickle down into almost nothing tangible since it would seem as though they've taken a step backwards in every way but raster. That's why I made the super long edit.
I'm not happy either. I haven't upgraded from my 980 Tis because of those price increases.

Quote: Originally Posted by white owl View Post
Because of all this RTX stuff we'll likely never see what was really possible in games with methods that we already had, instead we're a new target for hardware made for a different purpose that wasn't quite good enough for the job. Now we're an afterthought with a bigger dollar sign.
I'm not sure.
If RTX fails (which it might if things keep progressing like this), we will eventually see increases in performance and other methods, as they realise that in order to sell us GPUs, they actually need to be affordable and fast.
Or we will have to wait for Intel or new cards from AMD, but it will not stay the same (at least I hope). Nvidia lost $1B in revenue this year, from not being able to sell GPUs to miners and not being able to sell them to the desktop market. Eventually they will be forced to wake up.



Quote: Originally Posted by huzzug View Post
No. It just means I know what I'm talking about.
But you showed that you don't, nor have you brought proof. Being the only one claiming you're right doesn't make the claim right.
At most, it makes your claims a delusion.


Quote: Originally Posted by huzzug View Post
I think people here are pretty capable of making their minds as to how DLSS and RTX has been completely useless.
But you didn't point to other people. You stated a claim as fact, not as opinion.


Quote: Originally Posted by huzzug View Post
Maybe ask Nvidia marketing. I'm pretty sure they'd hand over the doctored footage if you ask nicely.
Didn't you claim it exists? Then you need to bring it to light. Considering it apparently doesn't exist, hence your deflection, it basically shows that you lied.
Can you explain why you lied? Or do we need to contact AMD marketing to ask?

Quote: Originally Posted by huzzug View Post
Yes I did. They nor any of the other reviewers find the tech mind blowing because it isn't. It's just a fancy blackbox unscaling using pixie dust.
We weren't discussing "mind blowing". You claimed they found it useless, but they actually said it can be useful, based on their initial review.
So either you can't understand what they wrote, or you're claiming something else based on your opinion, not on what they wrote. Hence the earlier point.

Quote: Originally Posted by huzzug View Post
If it needs constant coding from Nvidia to make it work as they intended initially and the feature breaking itself if game files change, then it is another SLI'esq disaster from Nvidia.
Can you please explain the "SLI'esque disaster"? Besides dwindling support from both AMD and Nvidia, SLI has been pretty strong in recent years.
Or do you expect me to contact AMD's marketing department again to explain your words?

Quote: Originally Posted by huzzug View Post
There aren't many games using the useless feature for now. What promise did you see in which game that you're talking about?
You are only strengthening the notion that you clearly have no clue, making your first statement here invalid, again (and I expect again soon enough).
Read, please (though I expect you won't).

Quote: Originally Posted by huzzug View Post
And now there are 2 developers working to fix the game instead of 1. Impressive.
It's a new feature.
When AMD released TressFX, Crystal Dynamics, AMD and Nvidia had to work for two weeks to fix all the issues with the game caused by TressFX patches and problems.
And that is one example of many.
That is why it takes time to make new tech work.

Quote: Originally Posted by huzzug View Post
But this isn't an AMD thread. Maybe you could try talking to someone about it if you like to, but the thread doesn't even mention AMD. Staying on topic would be refreshing when discussing the shortcomings of DLSS and RTX in context of Nvidia.
It isn't, but I'm bringing examples of tech that showed promise (or didn't) and was largely praised on OCN, with calls to give it time to mature.

Quote: Originally Posted by huzzug View Post
Evidently with Nvidia, it takes a lot and it always falls short.
Considering most of Nvidia's tech from the last few years is today standard in many games, the "always falls short" shows a hatred of Nvidia, pushed with intentional misinformation.

Quote: Originally Posted by huzzug View Post
2 wrongs don't make a right.
True. But considering how unknowledgeable you are, I just wanted to point it out.



Last edited by Defoler; 02-20-2019 at 07:11 AM.
post #36 of 56 (permalink) Old 02-20-2019, 07:47 AM
New to Overclock.net
 
dantoddd
 
Join Date: Sep 2009
Posts: 1,472
Rep: 35 (Unique: 30)
Quote: Originally Posted by Ricwin View Post
DLSS does actually work as advertised when every single parameter is fixed and there are no variables, such as running through a scripted benchmark.
That being said, even in such scenarios it is still nowhere near as good as Nvidia claimed (and the fanboys lauded following the DXR failures).
Yeah. It looks good on rails, but nothing is on rails in real life. I'm a bit worried about the $1,300 I spent on my 2080 Ti.

Quote: Originally Posted by ku4eto View Post
A performance hit of 10-20% would be something, that we would have taken probably. But not 40-50% performance loss, for negligible Image Quality improvement. On top of that, it costs more than a previous gen cards, by a good amount.
If you ask me, RTX makes a fundamental difference in games. Even with these not-so-good implementations, the difference is categorical in terms of lighting.

Quote: Originally Posted by white owl View Post

Being happy that you bought a 2080 for 15% more than the 1080 Ti was, even though you only get about a 10% bump, just blows my mind. That's what we once called a side-grade, not a generational improvement. If it cost the same it would make more sense. The whole evolution of CPUs/GPUs is that the same level of performance costs less over time... but I will say that maybe we've reached a point where this no longer applies to the high end with Nvidia cards. And it's not like they made this crap for us; these aren't ray tracing features but research features that they're forcing to run lighting effects. We're just butter on the bread for buying something that can't be sold for $5000 because it didn't meet the standard. They only have a single new gaming card that offers reasonable price to performance, and even it should cost less. The only way I could justify its current price is if it could run Metro at 1440p/60 with RTX features turned on. Not "maxed out" bragging-rights settings, but the ones that really improve IQ.
I wanted to buy a 1080 Ti, but around late December there were none to be found for less than 1000 USD.


Last edited by andrews2547; 02-20-2019 at 08:20 AM.
post #37 of 56 (permalink) Old 02-20-2019, 08:00 AM
PC Evangelist
 
ZealotKi11er
 
Join Date: May 2007
Location: Toronto, CA
Posts: 45,815
Rep: 1796 (Unique: 1173)
I am more upset that tensor cores + ray tracing cores are taking up space in a GPU now. With these features, Nvidia has basically slowed down the performance progress of its GPUs by a good 60-70%.

post #38 of 56 (permalink) Old 02-20-2019, 08:07 AM
What should be here ?
 
huzzug
 
Join Date: Jun 2012
Posts: 5,309
Rep: 358 (Unique: 256)
Quote: Originally Posted by Defoler View Post
But you showed that you don't. Nor brought proof of that. Considering you are the only one who claim you are right, that doesn't make your claim right.
It at most, only makes your claims a delusion.
Read my posts again. They may enlighten you about what I said.

Quote:
But you didn't point to other people. You put a claim as a fact, not as an opinion.
It is a fact that DLSS and RTX are useless in their current implementation and a far cry from what Nvidia demonstrated at launch. But you can go on about how they aren't. Makes for a cute debate about "what if".

Quote:
Didn't you claim it exist? Hence you need to bring that to light. Considering it apparently doesn't exist, hence your deflection, it shows basically, that you lied.
No. I said it "looks" like Nvidia overpromised and underdelivered, and that what they demoed at launch might be doctored footage. Since you want proof, maybe approach Nvidia.

Quote:
Can you explain why you lied? Or do we need to contact AMD marketing to ask that?
tch tch tch. You can't even argue without using AMD as a crutch. Sad.

Quote:
We didn't discuss about mind blowing. You claimed they found it useless, but they actually claimed it can be useful, based on their initial review.
Again, "I" said it's useless, and the reviewers found it "meh" at best. I have a few good books to recommend if you have difficulty reading.

Quote:
Can you please explain the "SLI'esq disaster"? Beside dwindling support from both AMD and nvidia, SLI has been something pretty strong in the last years.
As strong as the RTX and DLSS presentations from Nvidia in the past 6 months since launch.

Quote:
You are only straightening the notion that you clearly have no clue, hence making your first statement here invalid, again (and I expect again soon enough).
Read please (though I expect you won't).
Good thing I don't live in the future. As of today, I can count the number of games/demos that use these crappy upscaling and RTX features on my pinky, and even those are "meh".

Quote:
New feature.
When AMD released their tressfx, Crystal Dynamics, AMD and Nvidia had to work for 2 weeks trying to fix all the issues with the game, because of tressfx patches and problems.
And that is one example of many.
That is why it takes time to make new tech works.
But this isn't TressFX. Maybe Nvidia should take cues from AMD about how not to launch half-baked features. But they will anyway, because Nvidia likes their gullible fan base.

Quote:
It isn't, but I'm bringing examples of tech, and show promise (or not), but it was largely praised in OCN as to give it time to mature.
And matured it has. Like manure.

Quote:
Considering most of nvidia's tech from the last few years is today standard in many games, the "always falls short" shows a hate for nvidia, but with intended misinformation.
Please enlighten me: which Nvidia tech from the last few years is now standard that hadn't been done before?

#2 their debt is insane, even for a "diverse field" company. They cannot even afford to service the debt maintenance let alone make an actual dent in the debt itself. - Internet Stranger

Last edited by huzzug; 02-20-2019 at 08:10 AM.
post #39 of 56 (permalink) Old 02-20-2019, 08:14 AM
New to Overclock.net
 
EniGma1987
 
Join Date: Sep 2011
Posts: 6,326
Rep: 338 (Unique: 248)
Quote: Originally Posted by huzzug View Post
Could be that Nvidia doctored the release footage of DLSS to create hype. Otherwise, it doesn't make sense to have such disparity between the release footage and actual games. Wouldn't put it past Nvidia either, as they would lie to get the gullible to buy their overpriced cards.
We already know they did that with BFV: compare the demo footage for RTX with the actual release game. If you play the game, it looks nothing like what was shown in the tech demo.








Quote: Originally Posted by white owl View Post
I wonder what we'd get if we used raster to best imitate RTX. Portal managed to pull off mirrors, so why can't we apply that to water with ripple effects? Or paint? Or glass?
In the GN video they had a graphics designer come in and explain which RTX features we were seeing and whether they could have been done without RTX, and the answer was always yes. The reason we don't see it often in games is because of performance.

We had mirrors in games as far back as 2000. Notice how this first image shows a reflection of the character (which is never seen on screen normally in game), the wall behind the player, the floor by the back wall, and part of the drinking fountain by the mirror that is not on screen normally. All of that is off screen and being shown in a mirror with no performance hit. And you can definitely apply it to paint and glass, as shown in images 2 and 3 in this post; it is the same concept as the marble floor. Water is easy if you want cheap ripple effects; harder if you want physics-based ripples with reflections in them, but still doable with a minor penalty, most of it coming from the physics and not the mirror reflection effect on the water. The weird thing is that this tech existed with no performance hit way back when, and only modern games suddenly couldn't do it any longer, until RTX came along and somehow "fixed" a problem that was never a problem before.


(Deus Ex, released in 2000)




(Deus Ex, released in 2000)


There was zero performance hit for using a mirror in the Unreal 1 engine (the images above) or for mirrors in Quake 3 engine maps. I used to make maps for games and made a pretty nice one in Jedi Academy with a polished marble floor in a massive entry hallway (got the idea from that Deus Ex image above) that had a mirror effect on the entire floor. No one had the slightest performance issue playing that map all those years ago.
The one, and only, mirror effect that RTX brings that rasterization will not do properly is a "hall of mirrors" (HoM in map-designer speak): a mirror facing another mirror surface. Old games will simply glitch out and display garbage in that scenario, where RTX will properly render a "never-ending" hall of mirrors like in real life. Not a very common scenario, obviously, but it could be used in a bathroom scene or two.
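For the curious, the math behind those old engine mirrors is tiny. A sketch (my own, not Unreal's or Quake 3's actual code) of the reflection matrix a raster renderer prepends to the view transform before redrawing the scene into the stencil-masked mirror surface:

```python
import numpy as np

def reflection_matrix(n, d):
    """4x4 matrix reflecting points across the plane n.p + d = 0,
    with n a unit normal. The classic raster mirror trick: prepend
    this to the view transform, redraw the scene stenciled to the
    mirror polygon, and flip the triangle winding order."""
    n = np.asarray(n, dtype=float)
    m = np.eye(4)
    m[:3, :3] -= 2.0 * np.outer(n, n)   # reflect directions across n
    m[:3, 3] = -2.0 * d * n             # shift for planes off the origin
    return m

# A floor mirror at z = 0 sends (1, 2, 3) to (1, 2, -3):
M = reflection_matrix([0.0, 0.0, 1.0], 0.0)
print(M @ np.array([1.0, 2.0, 3.0, 1.0]))
```

The HoM case is exactly the recursion limit: every nested mirror needs another full scene redraw with another reflection matrix composed in, so raster engines cap it at one level (or glitch), while ray tracing gets the nesting per bounce.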




Ta da! I went and found my old map with the marble floor entryway, still hosted online from back in 2008 (towards the end of the game's life). If you stand in the middle of the hallway and look straight down at the floor so no walls or ceiling are visible, you still see the walls and ceiling in the floor reflection. The only thing you don't see is the player standing there while in first person; you just see a small circle shadow on the ground. If you go into third-person mode and look down, you do see the player model on the ground as well. Just differences between the Unreal 1 and Quake 3 engines in how certain models are culled from rendering. Interestingly enough, other players and NPCs are shown as reflections even when you're in first-person mode:



(custom map I released in 2008, on the Quake 3 Arena engine, which released in 1999)


Last edited by EniGma1987; 02-21-2019 at 02:38 PM.
post #40 of 56 (permalink) Old 02-20-2019, 08:17 AM
New to Overclock.net
 
Hwgeek
 
Join Date: Apr 2017
Posts: 605
Rep: 14 (Unique: 12)
Quote: Originally Posted by ZealotKi11er View Post
I am more upset that Tensor Cores + Ray Tracing Core are taking space in a GPU now. With these features now Nvidia basically has slowed down the performance of GPUs a good 60-70%.
Also, it will be interesting if they release a GTX 1880 Ti with 2080~2080 Ti performance at a much lower price; then RTX owners will be even angrier given the current RT/DLSS implementation.

I think the Turing GTX parts are gonna be tricky. I have a feeling early adopters of RTX are not going to be happy.