[Techgage] AMD Talks New Radeon ProRender Integrations, New Plugin Versions, And Coming Features

post #11 of 14 (permalink) Old 08-01-2019, 02:26 AM
Graphics Junkie
 
Join Date: Feb 2017
Location: USA
Posts: 1,084
Rep: 23 (Unique: 23)
Quote: Originally Posted by ToTheSun!
Films are, and have been for decades, "shot in resolutions" (parenthesis for analog) much higher than 1080p. Even 35mm film has enough detail to get a much better-resolved image digitally. It's mostly a problem of standards: streaming bandwidth, outdated hardware, etc.

Of course, no one is that bothered to upgrade the entire chain, because moviegoers and Blu-ray hoarders don't have pixel autism like us PC enthusiasts do.
I didn't say film wasn't good enough to translate over to 1080p, but I did say that until recently movies were nowhere near 4K quality, and nearly every movie we saw in theaters until recent years was filmed/rendered/produced at a resolution at or around 1080p.

However, "much higher than 1080p" is inaccurate IMO. Avatar which until Avengers Endgame was the highest grossing movie of all time was rendered/filmed in 1440p and that was a lot higher than most at the time, film or CGI. A lot of older movies that were shot on film can be remastered to 1080p because the film is close to that level of quality but no where near 4k. I'd say it's pretty close to 1080p, if even a little less for most movies shot with film before the 4k standard.

This is a good video on it: https://www.youtube.com/watch?v=YSZ-yFTSmfY

i7 7700K @ 4.2GHz
16GB DDR4 3200MHz
GeForce 1080 Ti
UltraMega is offline  
post #12 of 14 (permalink) Old 08-05-2019, 06:50 AM
New to Overclock.net
 
Join Date: Sep 2011
Posts: 6,266
Rep: 336 (Unique: 246)
Quote: Originally Posted by UltraMega
I didn't say film wasn't good enough to translate over to 1080p, but I did say that until recently movies were nowhere near 4K quality, and nearly every movie we saw in theaters until recent years was filmed/rendered/produced at a resolution at or around 1080p.

However, "much higher than 1080p" is inaccurate IMO. Avatar which until Avengers Endgame was the highest grossing movie of all time was rendered/filmed in 1440p and that was a lot higher than most at the time, film or CGI. A lot of older movies that were shot on film can be remastered to 1080p because the film is close to that level of quality but no where near 4k. I'd say it's pretty close to 1080p, if even a little less for most movies shot with film before the 4k standard.

This is a good video on it:
https://www.youtube.com/watch?v=YSZ-yFTSmfY
35mm film from decades ago was, at a minimum, 1080p quality if you converted it to digital, and that's if the movie was shot using very low lines/mm cameras and lenses. If someone used a high-quality lens of around 100 lines/mm, it would translate to around 8K resolution if you scanned the film and converted it to digital frames. They also had lenses up to around 200 lines/mm, but those weren't used much as far as I know, because although their resolution was the highest, they had other bad tradeoffs. Typical was around 75-100 lines/mm. This is why you see remaster releases of really old '80s and '90s movies in 4K and they look so good: the film itself from way back then had the resolution of today and more, but projection systems and other parts of the signal chain were trash for so long that we only got them in "480i" quality at the time.


The move to digital cleaned things up nicely, but it actually cut the resolution of movies WAY down to a sliver of what it used to be.

Oh, and 70mm IMAX film would be the equivalent of anywhere from 32K to 64K resolution, depending on the setup it was shot with.
That's one of the reasons The Dark Knight looked so incredible in those opening scenes when watched at a theater. It's impossible for that quality to come across on a disc or a streaming platform, so unless you saw it in an IMAX theater, you'd have no idea how to even imagine the quality difference it had.
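For what it's worth, here is the back-of-the-envelope arithmetic behind these kinds of equivalences, as a rough sketch only. Everything in it is an assumption on my part: that "lines/mm" means line pairs per mm (so roughly two pixels per line pair), a Super 35 gate of about 24.9 x 18.7 mm, and a 15/70 IMAX gate of about 70 x 53 mm. Depending on whether "lines" means lines or line pairs, and on which gate and lens figures you pick, the result swings by a factor of two or more, which is why quoted film-to-pixel equivalents vary so wildly.

[CODE]
# Back-of-the-envelope only: film/lens resolving power -> "equivalent" digital pixels.
# Assumptions (mine, not established figures): "lines/mm" = line pairs per mm,
# two samples needed per line pair (Nyquist), and approximate camera gate sizes.
GATES_MM = {
    "Super 35":   (24.9, 18.7),  # approx. Super 35 camera aperture
    "IMAX 15/70": (70.4, 52.6),  # approx. 15-perf 70mm camera aperture
}

def equivalent_pixels(line_pairs_per_mm, gate_w_mm, gate_h_mm):
    px_per_mm = 2 * line_pairs_per_mm  # 2 pixels per line pair
    return round(gate_w_mm * px_per_mm), round(gate_h_mm * px_per_mm)

for name, (w_mm, h_mm) in GATES_MM.items():
    for lp in (50, 100, 200):
        w, h = equivalent_pixels(lp, w_mm, h_mm)
        print(f"{name:>10} at {lp:>3} lp/mm -> ~{w} x {h} px")

# Under these assumptions, Super 35 at 100 lp/mm lands around 5,000 px wide
# (roughly 5K) and IMAX 15/70 around 14,000 px wide; larger figures require
# more optimistic assumptions about the lens, the stock, and the whole chain.
[/CODE]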


Last edited by EniGma1987; 08-05-2019 at 06:56 AM.
EniGma1987 is offline  
post #13 of 14 (permalink) Old 08-05-2019, 08:35 AM
Overclocker
 
Join Date: Jun 2014
Posts: 9,764
Rep: 332 (Unique: 237)
It doesn't matter what it's shot at; the intermediates are often still 2K and the final image is blown up, IMAX or not. Film from the '70s to the '90s depends a lot on what stock was used, what lenses, and what post-processing intermediates; some look nice, sharp, and low-noise, while most look like a potato but are still better than the butchered 4K remasters corporations try to sell when they take a '90s DVD-era transfer and create a Blu-ray from it.

So amateurs go and buy the old film prints, whose existence the studios deny, scan them to digital, and post-process them themselves at a fraction of the cost the studios spent on a botched rerelease from a '90s source. Studios rarely, if ever, do a full transfer/scan from the master and 4K processing of the whole movie; forget it, too much work, too much $$$. Most of the time they use whatever film transfer they have lying around, scan it at a higher resolution, then spend millions removing all the artifacts and correcting their time-degraded trash source.

New movies... 2k, mostly, even when shot at 4k.

Series... depends, mostly 1080p really and some have a crap ton of colored noise to boot as well.

With older movies, a higher-resolution scan = more noise, not better resolution, and one can see just how poorly focused 90% of the movie is.
Sometimes they use the old, expensive anamorphic lenses on new movies in some scenes... it looks awful and warped.

70mm film having 32K or 64K? LOL, dream on. By the time it gets onto the 70mm print used in cinemas or onto Blu-ray, it has gone through 2K processing XD
The masters sure have reasonable resolutions, but that gets wasted and cut down by production; it isn't even scanned at a high enough resolution, etc.

Another butchery: half the HDR movies are not even set/tonemapped right when it comes to brightness. Many, even modern ones, especially the heavy-CGI ones, are not HDR at all; it's a 1080p SDR blow-up...
4K and HDR for movies... it's mostly a scam, and only a very few releases can really be called that.
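To make the "SDR blow-up" point concrete, here is a rough sketch of how the PQ (SMPTE ST 2084) transfer function maps absolute brightness to signal values; the script is just my illustration, not taken from any mastering tool. Content graded to a 100-nit SDR peak only ever reaches about half of the PQ code range, so stuffing it into an HDR container adds no actual highlight information.

[CODE]
# Rough illustration: SMPTE ST 2084 (PQ) inverse EOTF, nits -> normalized signal.
# Standard PQ constants:
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0..10000 nits) to a 0..1 PQ signal value."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")

# ~0.51 at 100 nits (an SDR-peak master never leaves the bottom half of the range),
# ~0.75 at 1000 nits, ~0.90 at 4000 nits, 1.00 at the 10000-nit PQ ceiling.
[/CODE]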

Will movie makers ever bother to implement any of the GPU advances? Nope. They may buy it if someone updates a software/hardware package they use to process the images. That's about it.
JackCY is offline  
post #14 of 14 (permalink) Old 08-05-2019, 10:06 AM
New to Overclock.net
 
Join Date: Oct 2011
Posts: 3,255
Rep: 133 (Unique: 84)
Quote: Originally Posted by UltraMega
I'm someone who really hates aliasing, which was my main motivation to move to 4K in the first place, and to me 4K seems to be enough resolution. The shift from 1080p to 4K is huge, in no small part because it's enough to basically eliminate aliasing issues from games entirely with no AA or low AA. The shift from 4K to 8K would be massively less noticeable, and the benefit in fidelity vs. the GPU render cost just isn't there like it is for 4K, and it never will be. 8K gaming will not be as common in ten years as 4K gaming is now, because going from 4K to 8K does not solve the aliasing problem; it's already solved at 4K. 8K gaming will not be a thing until GPU power and screen manufacturing costs for 8K are so low that there isn't much of a reason not to do it.

For all of those reasons and more, 4K is going to be the standard for a long, long time. 1080p is too low to eliminate artifacts, but 4K isn't, and 8K is excessive and wasteful, with a terrible cost/benefit ratio.

8K is a novelty, and it's going to remain that way for far longer than 4K did. Most movies today still are not even filmed in 4K. It wasn't even until the last few years that movie theaters upgraded to 4K projectors. If you saw the first Avengers movie in theaters, it was probably at a much lower resolution than 4K, and the newest Avengers movies are not even truly 4K in most of their shots, for a variety of reasons.

That said, you could use upscaling for PC games if you happened to have an 8K monitor and wanted to render a game at 4K. DLSS probably wouldn't work, but some of the other upscalers probably would work just fine today. But seeing that there is going to be basically zero 8K content for a long, long time, I don't see it becoming common enough for that situation to arise often enough to be worth mentioning 5+ years from now. I would bet that the PS6 won't even have a strong 8K focus. The need just won't be there.
Some of the checkerboard 4K techniques are almost indistinguishable from native quality; apply the same to 8K and the minor checkerboard artifacts will be practically nonexistent.
At minimum, checkerboard 8K will be worthwhile, and it's the standard the gaming industry should be shooting for at the high end (don't forget VR still needs to push even more pixels than that). Going a step further, given the quality of modern TAA, I'm confident you could use just 8.3 million of the 33 million pixels of an 8K display, and with enough frames per second it would be very hard to tell a 1/4-resolution sparse-rendered 8K image from a native 8K image.
Getting temporal resolution (framerate) up is key, though; at 30fps everything is ugly.
At 120fps you can also start alternating which pixels get rendered per frame and build up a more complete spatial map over time.
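As a toy sketch of that idea (my own simplification, not how any particular engine implements it): render one pixel out of every 2x2 block per frame, rotate which one each frame, and keep the latest sample for every pixel. 7680 x 4320 is about 33.2 million pixels, so each frame only shades about 8.3 million, the same pixel count as a 4K frame, and at 120fps every pixel still gets refreshed 30 times a second.

[CODE]
# Toy sparse-rendering sketch (illustrative only): quarter-resolution rendering
# with a rotating 2x2 sample pattern, accumulated into a full-resolution buffer.
import numpy as np

H, W = 8, 8                                # tiny stand-in for 4320 x 7680
history = np.zeros((H, W))                 # most recent sample known per pixel
PHASES = [(0, 0), (1, 0), (0, 1), (1, 1)]  # which pixel of each 2x2 block to shade

def sparse_render(frame_index, scene):
    """Shade 1/4 of the pixels: one position per 2x2 block, rotating each frame."""
    dy, dx = PHASES[frame_index % 4]
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    mask = (ys % 2 == dy) & (xs % 2 == dx)
    return np.where(mask, scene, np.nan), mask   # NaN = not shaded this frame

def accumulate(rendered, mask):
    """Fold the new samples into the history buffer (naive: no reprojection)."""
    global history
    history = np.where(mask, rendered, history)
    return history

for f in range(4):                          # four frames = 1/30 s at 120 fps
    scene = np.add.outer(np.arange(H), np.arange(W)) + 0.1 * f  # toy moving gradient
    rendered, mask = sparse_render(f, scene)
    image = accumulate(rendered, mask)

# After four frames every pixel has a sample. A real TAA-style reconstructor would
# also reproject history with motion vectors and reject stale samples on motion.
[/CODE]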

The critical thing is that 8K resolution is what you physically need for your eyes to perceive a crisp image. A given high-contrast line doesn't need to be rendered exactly, but you can tell that sub-8K panels are less sharp.
ILoveHighDPI is offline  