881 - 900 of 18,210 Posts
I don't know. Given how well these cards always sell, I'm not sure the retailers need to drive up interest. Guess we'll find out in a little over a week.
 
I just wish they would post US pricing for the AIB cards instead of hiding it for so long. European and Asian pricing has already been released; unsure why the delay.
I don't remember when they put up 4090 pricing. Maybe they're waiting to see what President Orange does, although it sounds like there will be no tariffs before the 30th now.
 
Yep, absolutely no need, but they've got nothing to do between now and release, so they might as well whip up a frenzy to make certain they sell as many as possible at prices as high as possible.
 
It will be interesting to see what Newegg puts up for prices. They always price gouge some at launch.
 
People talk a lot about the 5090's "increased price" but aren't taking other factors into consideration. It's important to start from this premise:

The 5090 FE costs more to make than the 4090 FE did, so Nvidia would make less money per unit by selling it at the same $1599 MSRP.

Consider:

  • GDDR7 is more expensive than GDDR6X
  • TSMC 4nm, despite now being an older process, is still about 10% more expensive than it was in 2022 (and the 5090's die is also larger)
  • New FE cooler is more advanced/expensive to make

Inflation. It's hard to get an exact measure as governments lie to cover up their incompetency and different products can have different price changes due to a variety of factors. But here are a few measures:
  • Gold is up 56%
  • Dow Jones up 47%
  • Nasdaq up 29%
  • Housing up 40% in my city
  • Grocery prices up over 30%
  • RTX 5090 FE up 25%

Technically speaking, if you were buying a 5090 today with gold, it would cost you less gold than the RTX 4090 did when it came out. I understand the prices aren't "cheap", but realistically it's also not that expensive once you factor in the resale value of your existing card.
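
A quick back-of-the-envelope check of that gold claim, a sketch using only the percentages quoted above as assumptions (not verified market data):

```python
# Rough check of the "costs less gold" claim, using only the figures quoted above
# as assumptions (gold up 56%, card price up from $1,599 to $2,000).
gold_multiplier = 1.56          # assumed rise in the gold price since the 4090 launch
msrp_4090, msrp_5090 = 1599, 2000

gold_cost_4090 = msrp_4090                      # 4090 price in launch-era gold units
gold_cost_5090 = msrp_5090 / gold_multiplier    # 5090 price converted to the same units

print(f"In gold terms, the 5090 costs {gold_cost_5090 / gold_cost_4090:.0%} "
      f"of what the 4090 did at launch")        # -> roughly 80%
```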

Other considerations:
  • Lack of anything even close to resembling competition in the upper segment by AMD and Intel
  • Huge demand from enterprise/AI segment, with limited wafer supply.

For an apples to apples comparison:
  • A 5090 FE is $2000.
  • You can sell your 4090 FE for AT LEAST $1000.
  • Your cost comes to $1000
  • Previous card cycle was 27 months.
  • That comes to just $37/month to have access to the absolute latest and greatest card.

This simply isn't that expensive for any employed adult in the West. One less dinner/drinks outing per month, or one less bottle of booze or ounce of 420 per month, will pay for it.
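
And a minimal sketch of the per-month arithmetic above, assuming the $1000 resale figure and 27-month cycle hold:

```python
# Net monthly cost of the upgrade, using the assumed figures from the list above.
new_card = 2000       # 5090 FE price
resale_old = 1000     # assumed minimum resale value of a 4090 FE
cycle_months = 27     # months between the 4090 and 5090 launches

net_cost = new_card - resale_old
print(f"${net_cost / cycle_months:.2f} per month")   # -> $37.04 per month
```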
 
I love this kind of rationalisation; already copied and pasted it to send to the wife (though I've removed the booze comparison, don't want her getting any ideas about cutting back on my whisky).
 
That's my mindset more or less, but I'm hanging on to my 4090 until after I receive a new and working 5090, just in case I regret my purchase and decide to return the 5090 for some odd reason (I doubt that'll happen lol). I'm just wondering which 5090 to try and get. I want an FE model since the engineering is really cool, but I know the AIB cards are also really nice.

Another thing is that the RTX 4090 is going to mature so much better than the RTX 3090 ever could dream of. I also really wonder how the 5090 vs. 4090 comparison is going to play out at 8K.

If I could trick myself into being content, I would rather be content and keep my 4090. But I have this false belief that I must have or need a 5090 lol. In reality the 4090 is already so brutally fast at 4K that I do not need a new GPU at all. Anyone buying a 4090 after 5090s become more readily available is in for a serious surprise at how fast these GPUs really are: 4K, max out any game around, using DLSS Quality + FG. 🤯
 
Yep, we can dance around it all we want folks, but if you're in this thread, you've very likely already made your mind up to buy one (I certainly have).

I can (and will) justify it as a work expense (I work in game dev), and with @HyperMatrix's excellent list, but the bottom line is I want a new toy to play with.
 
Inflation. It's hard to get an exact measure as governments lie to cover up their incompetency and different products can have different price changes due to a variety of factors. But here are a few measures:
  • Housing up 40% in my city
  • Grocery prices up over 30%
  • RTX 5090 FE up 25%
All that would be fine if everyone had also suddenly received a 25-40% pay raise over the past 2 years.
 
Ok, so there's a lot of conflicting information going on right now, let's get to the bottom of it.


Color information for each pixel gets compressed; 4K is still 4K, but the 10-bit HDR color data gets compressed in transit and then unpacked back to full HDR when it arrives. There is essentially no downside to DSC: as far as I understand it adds effectively no delay and is virtually lossless and artifact-free. One requirement for DSC to work is that both ends of the link support it and the cable can carry the compressed signal's bandwidth.

The GPU/driver encodes it, the cable transports it, and the monitor decodes it; both ends need DSC capability. It's important to note that DSC is dynamic and the compression happens in real time: when displaying simpler images the compression ratio drops (to near uncompressed), then increases up to 3.75:1 on more complex images (VESA seems to treat 3.75:1 as a soft cap, an "achievable maximum under optimal conditions").

A 3:1 compression ratio means that for every 3 bits of data, only 1 bit has to be sent; so a 30 Gbps stream compressed at 3:1 only needs 10 Gbps on the link.

In theory there is no hard ratio limit; as I understand it, the driver decides whether you can enable a given resolution and refresh rate combination. The most commonly cited ratio that, say, the display manufacturer/NVIDIA (and VESA) have settled on is 3.75:1, as of January 2025, for DSC 1.2a.

The reason for a "soft" limit is that you shouldn't compress too aggressively, as that might lead to artifacts. This is why you can't enable whatever resolution and refresh rate you want in Windows; "they" decide what you can and cannot enable.

For uncompressed/compressed HDR, this is the bandwidth required with no overhead included:

4K 21:9 (5120x2160)
165Hz = 55 Gbps / With compression of 3.75:1 = 15 Gbps
240Hz = 80 Gbps / With compression of 3.75:1 = 21 Gbps
480Hz = 160 Gbps / With compression of 3.75:1 = 42 Gbps (Not possible with DP1.4)

4K 32:9 (7680x2160)
120Hz = 60 Gbps / With compression of 3.75:1 = 16 Gbps
165Hz = 82 Gbps / With compression of 3.75:1 = 22 Gbps
240Hz = 120 Gbps / With compression of 3.75:1 = 32 Gbps (Not possible with DP1.4 once you include complexity and overhead)
480Hz = 240 Gbps / With compression of 3.75:1 = 64 Gbps

DP1.4 = 32.4 Gbps
DP2.1 = 80 Gbps

However, for the 5120x2160 @ 240Hz case (80 Gbps uncompressed), if the compression ratio drops from 3.75:1 to:
3:1 = 27 Gbps
2.5:1 = 32 Gbps (Not possible with DP1.4 if you include overhead)
2:1 = 40 Gbps (Not possible with DP1.4)
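
Here's a small sketch that reproduces the numbers above. It counts raw pixel data only (10-bit RGB HDR = 30 bits per pixel, no blanking or protocol overhead), which is why published figures that include display timings come out somewhat higher:

```python
# Sketch only: raw pixel data for 10-bit RGB HDR (30 bits per pixel), with no
# blanking or protocol overhead. Link caps below are nominal rates; usable data
# rates are lower (e.g. DP 1.4 is ~25.92 Gbps after 8b/10b coding).
def raw_gbps(width, height, refresh_hz, bits_per_pixel=30):
    return width * height * refresh_hz * bits_per_pixel / 1e9

links = {"DP 1.4": 32.4, "HDMI 2.1": 48.0, "DP 2.1 UHBR20": 80.0}

modes = {
    "5120x2160 @ 240Hz": (5120, 2160, 240),
    "7680x2160 @ 240Hz": (7680, 2160, 240),
}

for name, (w, h, hz) in modes.items():
    raw = raw_gbps(w, h, hz)
    for ratio in (3.75, 3.0, 2.0):
        needed = raw / ratio
        fits = [link for link, cap in links.items() if needed <= cap]
        print(f"{name}: {raw:.0f} Gbps raw, DSC {ratio}:1 -> {needed:.1f} Gbps, fits {fits}")
```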

Whether a given ratio is allowed is up to the GPU manufacturer and the display manufacturer to decide; they don't want the ratio to be so high that it causes issues (VESA guidelines come into play here).

VESA "Visually lossless compression performance verified by subjective testing": VESA Display Compression Codecs - VESA - Interface Standards for The Display Industry

With a 20% overhead, 5120x2160 (4K 21:9) at 240Hz HDR works out to roughly 96 Gbps uncompressed; with the ratio dropping to 3:1 that's 32 Gbps on DP1.4, right at the bandwidth limit.

HDMI 2.1, with its 48 Gbps of bandwidth, can easily handle the above, though.

So I stand corrected; my earlier information was incorrect. The RTX 4090, with DP1.4 and HDMI 2.1, can indeed run the new 4K 21:9 5120x2160 monitors at 240Hz with HDR enabled through the use of DSC, even with sub-optimal compression. This greatly increases the value of used 4090s, and you still have the option of running FG 2x to reach these high framerates.
TFT Central doesn't agree with you on DP 1.4 being able to run 4K 21:9 (5120x2160), 10-bit HDR, at 240 Hz; hence their speculation as to why LG's 2025 45" 4K 21:9 monitors only go up to 165 Hz, unlike the 2024 and 2023 models, which went up to 240 Hz.

Are you saying TFT Central is wrong regarding DP 1.4 DSC not being enough for 4K 21:9, 10-bit HDR, 240 Hz?

 
The 4090 was the GOAT release - crazy gen-to-gen improvement. Pity all those that bought the overpriced 3090 or, even worse, the 3090 Ti. The 2080 Ti was also a worthless generation. While I don't think the 5090 is as bad as the 3090 or 2080 Ti, it just isn't very compelling from a gen-to-gen perspective either. The 5080 is looking way better than the 4080 because of pricing, so we seem to have a shift back to the 30-series situation, when the 3080 was the much better buy.

Pretty sure I'll see plenty of you here picking up 5090s though, regardless.
 
If the new AI MFG (multi frame generation) transformation works even 75% as well as Nvidia promises, we'll be doing some mutha fooking gaming with the 5090.

It boils down to whether AI neural rendering is the future for gaming or Jensen was gaslighting us for his AI bubble business.
 
I will say that I do work from home, and I use my PC for heavy lifting every day. Technically I don't need a 5090, but I can say "it's for work." That's my excuse lol.
 
A bit of both, really - we already have frame generation. All you need to do is answer how often you've actually used FG with the 40-series cards... I can think of only one title I'd run it with. DLSS itself is far more useful, with way less latency.

MFG is like when the 20-series came out with DLSS: it's at least a generation away from becoming mainstream useful. Even FG isn't all that useful so far. Still looking forward to seeing the DLSS 4 improvements on older cards once the new drivers roll out.
 
I view it two ways. One, I could get, say, a 4090 at a discounted rate, then hold out 2 years and get a 6090. The other way is to get a 5090, use it for say 4 years, then get the next greatest thing GPU-wise. It's really about what combination of performance and spending you want, tbh.
Better to keep the 4090 for 2 years and get the 6090; that's what I will do. Games are mainly designed with the PS5/Xbox consoles in mind; it's not like everything will suddenly become unplayable on PC tomorrow, nah...
For pros who need 32 GB and/or crazy 240Hz+, yes, but for the majority of 4090 gamers I don't see the point. The "enhanced" DLSS will come to the old gen too, with around +10% on FG 2x, and Lossless Scaling works great for FG; we can now see the FG fps in the Nvidia app, and I think sooner or later in third-party apps like Afterburner. They will not sell as many 5090s as 4090s, because we gained 2x from the 3090 to the 4090 and the 4090 was good perf/price, which is not at all the case with the 5090. If people don't buy it in mass, Nvidia could cut the price within 2 years, like they did with the 4080, which was too expensive at launch.
 
Are you saying that the TFT Central article is incorrect?
Yes. The TFTCentral article is assuming 3:1 compression, which is incorrect and contradicts VESA and the DSC spec.

I can't speak for the exact implementation of DSC on monitors I've never used and have yet to see tested, nor how NVIDIA's driver will react to trying to set such resolutions, but 5K2K@240Hz, with HDR, is well within the theoretical bandwidth limits of DP 1.4a with DSC.
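
To make the disagreement concrete, here's a sketch using the figures quoted from the TFT Central article itself (~90.94 Gbps uncompressed including display timings, 25.92 Gbps usable DP 1.4 data rate); whether 5K2K at 240 Hz fits hinges entirely on the DSC ratio you assume:

```python
# Same mode, same link, different DSC assumption: this is where the positions diverge.
# Figures are the ones quoted from the TFT Central article.
uncompressed_gbps = 90.94    # 5120x2160 @ 240 Hz, 10-bit, display timings included
dp14_data_rate = 25.92       # DP 1.4 / 1.4a HBR3 data rate after 8b/10b coding

for ratio in (3.0, 3.75):
    needed = uncompressed_gbps / ratio
    verdict = "fits" if needed <= dp14_data_rate else "does not fit"
    print(f"DSC {ratio}:1 -> {needed:.2f} Gbps, {verdict} within DP 1.4")
# DSC 3.0:1  -> 30.31 Gbps, does not fit  (TFT Central's assumption)
# DSC 3.75:1 -> 24.25 Gbps, fits          (the assumption in this post)
```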

The price/performance has completely gone out the window
The price/performance ratio is essentially the same as the last generation. That's a terrible generational uplift, but the ratio itself didn't go anywhere.
 
Are you saying TFT Central is wrong regarding DP 1.4 DSC not being enough for 4K 21:9, 10-bit HDR, 240 Hz?
The article is confusing, that's why; their fault, not yours. They only mention HDMI 2.1 at the very bottom of the article, which is what makes it possible for the RTX 30 and 40 series to run it.
TFTCentral said:
You may notice from this roadmap that LG.Display were planning this 45″ panel with both a 165Hz refresh rate and a 240Hz refresh rate. So why have we only got the 165Hz refresh rate so far, and where’s that 240Hz panel? Where are the monitors with 5K2K at 240Hz?
Their use of "where" is dumb, it should be "when", not "where". They even explain it themselves later in the article.
TFTCentral said:
The cynical amongst us might say that this is a bit of a money grab – release a lower refresh rate monitor now at 165Hz, then provide the faster 240Hz refresh rate model later on to scoop up more of people’s money.
This is literally why: it's a money grab. It happens all the time in the monitor space - "double dipping": sell a customer a lower refresh rate monitor first, then the higher refresh rate one later. This works because enough people have the money for the upgrade. Example: 34" UW 1440p started at 165Hz, then became 175Hz, followed by the "full" 240Hz, even though DP1.4 has no issues running it at 240Hz. So why release 165Hz first and not go straight to 240? Money grab. It's not free for them, though: panel technology such as cooling and optimization plays a role here, and each monitor is fine-tuned in various ways before the design is finalized. Especially now on OLED, because of burn-in risk, the cooling is extremely important; you can't just run a 165Hz monitor at 240Hz with some "unofficial BIOS upgrade", as that would likely end in disaster. One could argue they're buying time by releasing a simpler monitor first, getting to market as fast as possible to make money, then releasing the more complex (240Hz) version later. This can be argued back and forth, but panel manufacturers being greedy? That's real and happens all the time.
TFTCentral said:
But realistically, are many people going to spend $2000 USD on a new screen now at 165Hz, then upgrade it to a 240Hz model for a similar (by the time it’s released) or probably higher price later on as well? It’s not a huge difference in refresh rate. We expect actually that by not providing a 240Hz version now, it could cost LG Electronics sales in the long run as people hold off on buying the new screens now and wait for the 240Hz version later which we expect to come at some point. By which time they’d almost certainly have lost their market advantage as other manufacturers adopt the same two panels and release their own screens. Right now LG Electronics are the only manufacturer who have screens announced based on these new panels and so have a competitive advantage for now.
It doesn't have to be "many". Also, there's a contradiction there, I think: he says it's not a huge difference, then asks whether the people who buy it are expected to buy the new one as well.
Most who buy the 165Hz model aren't going to upgrade, but these are eager people who want a 45" OLED as fast as possible; they'd prefer 240 but can't wait, so they get the 165, while the people who do wait, being less eager, get the 240Hz. Pricing comes into play here too: the 165Hz model should be cheaper even as time passes, and it won't be directly replaced by the 240Hz version, so say 30% of those who bought the 165Hz couldn't have afforded the 240Hz anyway. It's really all about getting as many customers as possible and filling every gap in the market - including the fact, which they mention at the bottom of the article, that 165Hz is possible on DP1.4, so everyone on RTX 20 cards can run it at that refresh rate, but not 240, increasing the potential customer base.

Two monitors, two price points, two audiences: impatient/patient, poor/rich.

As for the last part of the quote: first they say LG is being stupid for not releasing the 240Hz right away, because competitors will enter the market, but then they end it by saying LG is the only manufacturer... so there's no competition, and LG can do whatever it wants (like releasing a 165Hz model first). This could also be the reason why LG is the only one: because they're getting a simpler 165Hz version to market as fast as possible.
TFTCentral said:
Actually we expected the primary reason for the missing 240Hz refresh rate version right now is down to video connectivity.
This is where it gets downright dumb. Right after that wild comment, they say this:
TFTCentral said:
To power a 5120 x 2160 resolution at 240Hz and 10-bit colour depth you need ~90.94 Gbps bandwidth for an uncompressed video signal. Or if you use DSC it needs only ~30.31 Gbps. These figures depend slightly on display timings but should be approximately correct. An uncompressed video signal isn’t going to be possible for this spec even on the top-tier DisplayPort 2.1 UHBR20 connections, which reach a maximum of 80 Gbps, or rather 77.37 Gbps data rate which is what we’re talking about here with these bandwidths.
Ok..? At first they said we aren't getting 240Hz because of "video connectivity", hinting at the display outputs, then they immediately say that it's not possible to run 240Hz without compression, and that it indeed is possible with DSC.
TFTCentral said:
Maybe the future HDMI 2.2 connections could support that, with initial specs suggesting they will offer up to 96 Gbps of bandwidth, but that spec is really just in a conceptual stage and is likely several years away from being available on monitors or graphics cards. Don’t hold your breath for HDMI 2.2 in the monitor space for any time soon.

So, to power 5K2K @ 240Hz, you’re always going to need to use DSC (Display Stream Compression).
When they used the word "that", they meant uncompressed, and that is completely irrelevant to us, as those bandwidths are absurdly high and won't be possible for many years.

Then they again tell us that DSC is needed to run it.
TFTCentral said:
The problem is though that even using DSC you need 30.31 Gbps of bandwidth for 5K2K @ 240Hz, and that’s beyond the capability of DisplayPort 1.4 with DSC which caps out at 25.92 Gbps. So that means to power 5K2K @ 240Hz with DSC, you are going to need a DisplayPort 2.1 graphics card.
Ok, so now they finally said it in detail, 5120x2160 at 240Hz is absolutely possible with a DP2.1 graphics card, which we have on the RTX 50 series. So why did they make it sound like 240Hz wasn't possible at the start of the article?

Direct quote again: "Video connectivity is likely to blame. Actually we expected the primary reason for the missing 240Hz refresh rate version right now is down to video connectivity."

It's clearly not missing because of video connectivity, they even said so themselves later in the article.. because we have DP2.1 already and it's coming to NVIDIA in a week.
TFTCentral said:
Given DSC will always be needed, this could be achieved with even the lowest UHBR10 tier in theory (38.69 Gbps data rate), but obviously UHBR13.5 and UHBR20 would be fine too.
Then they even go as far as to say that it's easy to run.. 🤦‍♂️
TFTCentral said:
Getting to the point then, the reason why we suspect the 5K2K 240Hz monitor options are not yet being announced is largely down to support from end-user devices. Right now as we write this, you can only buy a couple of graphics cards from AMD which feature any form of DisplayPort 2.1 connectivity, UHBR13.5 for their top-end consumer cards and UHBR20 for their top-end professional cards. Thankfully NVIDIA have now formally announced their RTX 50 series cards which will feature UHBR20 DisplayPort 2.1 connectivity, but as we discussed at length in the past their slow adoption of this video interface has definitely impacted its uptake on the monitor side of things. So right now the availability of graphics cards that could even support a 5K2K 240Hz spec is very limited.
This is just stupid. They wrote the article on Jan 12, a week ago, well after NVIDIA announced not just the cards with DP2.1 but also the release date of Jan 30, only weeks away. And the 45" monitor isn't even out yet, so you literally can't buy it; there's no need to point out that you can't use it today when you can't even buy it yet.
TFTCentral said:
Later on, once NVIDIA’s cards are more widespread, tested and have had time to mature and address any bugs or niggles, it will be more viable to develop a 240Hz version and release that to market we think.
What the ** are they on about? Actual idiots.
TFTCentral said:
You could potentially use HDMI 2.1, but again your addressable market will be fairly limited as HDMI 2.1 is still quite new. It’s better than DP 2.1 though for sure.
They literally just said HDMI 2.1 with 42Gbps can do 240Hz, then right after they say you can "potentially" use it.

HDMI 2.1 is quite new? The RTX 3090 released in September 2020, over four years ago... DP2.1 is the one that's quite new.

Also, they of course made a typo: they said HDMI 2.1 is better than DP 2.1 (they meant DP 1.4). Lastly, they make it sound like, because the GPUs only have 1x HDMI 2.1 (vs 3x DP), it isn't the "primary" display output and that is therefore the sole reason 240Hz doesn't exist, even though it works over HDMI 2.1 on cards that are years old. The vast majority of gamers run one high-performance monitor, so it's irrelevant whether it uses DP or HDMI.
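
For what it's worth, a quick sketch of the HDMI 2.1 side of that argument, assuming the ~90.94 Gbps uncompressed figure from the article and a conservative 3:1 DSC ratio:

```python
# Quick check of the HDMI 2.1 claim above: usable FRL data rate versus the
# ~30.31 Gbps that 5K2K @ 240 Hz 10-bit needs even at a conservative 3:1 DSC ratio.
hdmi21_data_rate = 48 * (16 / 18)   # 48 Gbps FRL link, 16b/18b coding -> ~42.67 Gbps
needed_at_3_to_1 = 90.94 / 3        # TFT Central's uncompressed figure / 3:1 DSC

print(f"HDMI 2.1 usable: {hdmi21_data_rate:.2f} Gbps, needed: {needed_at_3_to_1:.2f} Gbps")
print("fits" if needed_at_3_to_1 <= hdmi21_data_rate else "does not fit")   # -> fits
```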

(n) That's just a seriously bad article, avoid reading any article from them in the future. (y)
 
It's very clear in the article that TFT Central said 4K 21:9 (5120x2160), 10-bit HDR, 240 Hz is fully possible with both DP 2.1 and HDMI 2.1 using DSC. It's ONLY with DP 1.4 (even with DSC) that it's not possible, according to them... and that's all we're discussing here (DP 1.4 w/ DSC).
 