Topic Review (Newest First) |
01-31-2019 06:28 AM | |
ryan92084 | You have to turn them on in your settings. |
01-30-2019 09:02 PM | |
JackCY | Well Q2VKPT doesn't exactly run fast even on RTX cards, and it badly lacks the extra bounces needed for indirect lighting. I can't find any config file for the graphical settings; it would be nice to be able to change some of the parameters without having to rewrite the hard-coded values and recompile it all. |
01-30-2019 03:16 PM | |
WannaBeOCer |
I believe the big issue was that Nvidia largely drove the DXR proposal to Microsoft, so it performed poorly on AMD hardware due to the specific rendering paths/calls used.
Nvidia is purely a graphics company that throws money at making amazing GPUs, while AMD has to focus on making money, which means sacrificing some markets to work on the profitable ones. The high-end GPU market isn't profitable, which is why they focus on the mid-range, which is. |
01-30-2019 02:52 PM | |
kevinf |
RTX is just a platform that includes Nvidia-enhanced DXR libraries, the VKRay Vulkan extensions (GPU-agnostic), and OptiX (Nvidia's own API). We'll see if AMD adopts VKRay, but I doubt they will.
A developer can do whatever they want with the platform, and Microsoft's DXR is a hybrid ray tracing method as well. We are nowhere near the compute power needed for current-generation games with full ray tracing, let alone path tracing. If you notice, in Battlefield V the option is called "DXR Enabled", since AMD cards will also be able to enable DXR once AMD either releases driver support for Vega or waits until they have the hardware to run DXR. RTX just has enhanced libraries to make use of their RT cores. |
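"Hybrid" here means rasterization still handles primary visibility and rays are only traced for specific effects. A toy Python sketch can illustrate the division of labor: the scene below (a ground plane, one sphere occluder, one point light) is entirely made up for the example and has nothing to do with DXR's actual API. The "raster" pass trivially resolves which ground point each pixel sees; the traced pass fires a single shadow ray per pixel.

```python
import math

SPHERE_C, SPHERE_R = (0.0, 1.0, 0.0), 0.5   # occluding sphere
LIGHT = (0.0, 3.0, 0.0)                     # point light above the sphere

def shadow_ray_blocked(origin, target):
    # Ray/sphere intersection via the quadratic discriminant; True if the
    # segment from origin toward target is blocked by the sphere.
    d = [target[i] - origin[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in d))
    d = [x / length for x in d]
    oc = [origin[i] - SPHERE_C[i] for i in range(3)]
    b = sum(oc[i] * d[i] for i in range(3))
    c = sum(x * x for x in oc) - SPHERE_R ** 2
    disc = b * b - c
    if disc < 0:
        return False
    t = -b - math.sqrt(disc)
    return 0 < t < length  # hit lies between the surface point and the light

def render(width=9, depth=9):
    rows = []
    for iz in range(depth):
        row = ""
        for ix in range(width):
            # "Raster" pass: the ground-plane point this pixel sees from above.
            p = ((ix - width // 2) * 0.5, 0.0, (iz - depth // 2) * 0.5)
            # Traced pass: one shadow ray toward the light ('.' = in shadow).
            row += "." if shadow_ray_blocked(p, LIGHT) else "#"
        rows.append(row)
    return rows

for row in render():
    print(row)
```

The output is a mostly lit grid of `#` with a small `.` shadow under the sphere; in a real hybrid renderer the same idea applies per effect (shadows, reflections, GI), with rasterization doing the bulk of the work.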
01-25-2019 01:58 AM | |
ILoveHighDPI |
Hang on a second. I just realized there's something incredibly ironic about people saying they just want high-quality ray tracing but don't care about resolution. Remember this video? Eventually he demonstrates that "high quality" ray tracing (path tracing, in this video) uses thousands of samples per pixel. That is effectively the same thing as running one sample per pixel at 64K screen resolution or something ridiculously high like that. As noted in the video there are of course diminishing returns, but while the example given is 50 samples per pixel, you only need 16 samples to match native 8K (the video is 1080p, so I'm assuming 1 SPP = 1920x1080 resolution). If purely ray-traced games ever become a thing, "quality" and "resolution" will be exactly the same setting. Basically it's inevitable that, eventually, there will be essentially zero added computational cost in moving to nearly any resolution you could imagine. |
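The samples-versus-resolution equivalence above is just ray-count arithmetic, which a few lines of Python can check (the resolutions are standard 1080p and 8K; nothing here comes from the video itself):

```python
# Total primary rays per frame = width * height * samples_per_pixel.
def total_rays(width, height, spp):
    return width * height * spp

rays_1080p_16spp = total_rays(1920, 1080, 16)
rays_8k_1spp = total_rays(7680, 4320, 1)

# 8K is 4x the width and 4x the height of 1080p, i.e. 16x the pixels,
# so 16 SPP at 1080p costs exactly as many rays as 1 SPP at native 8K.
assert rays_1080p_16spp == rays_8k_1spp
print(rays_8k_1spp)  # 33177600 rays per frame either way
```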
01-24-2019 06:58 PM | |
littledonny |
Been playing around with the Quake demo on my RTX 2080 the last few days.
Question. Are the RTX's Tensor cores actually being utilized here, the same as they are in something like Port Royal? Or is there some kind of "trickery" going on to get the game running on the RTX cards? |
01-24-2019 06:25 PM | |
skupples |
When you compare RTX to modern lighting effects in games it's almost impossible to tell which is better.
I didn't know TW3 had crappy lighting. The game looked great to me. I guess this is where we are though... RTX at 30fps is the new standard and anything that isn't simulating light paths is trash. |
01-24-2019 06:22 PM | |
MonarchX |
WTH happened to all the post styling buttons? OCN's "Advanced" post is freaking empty. Where did all the formatting tools go??? Wiki says real-time ray-tracing was done a whiiile back: Quote:
The first implementation of a "real-time" ray-tracer was the LINKS-1 Computer Graphics System built in 1982 at Osaka University's School of Engineering, by professors Ohmura Kouichi, Shirakawa Isao and Kawata Toru with 50 students.[citation needed] It was a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing. According to the Information Processing Society of Japan: "The core of 3D image rendering is calculating the luminance of each pixel making up a rendered surface from the given viewpoint, light source, and object position. The LINKS-1 system was developed to realize an image rendering methodology in which each pixel could be parallel processed independently using ray tracing. By developing a new software methodology specifically for high-speed image rendering, LINKS-1 was able to rapidly render highly realistic images." It was "used to create the world's first 3D planetarium-like video of the entire heavens that was made completely with computer graphics. The video was presented at the Fujitsu pavilion at the 1985 International Exposition in Tsukuba."[12] The LINKS-1 was the world's most powerful computer at the time, as of 1984.[13]
The earliest public record of "real-time" ray tracing with interactive rendering (i.e., updates greater than a frame per second) was credited at the 2005 SIGGRAPH computer graphics conference as being the REMRT/RT tools developed in 1986 by Mike Muuss for the BRL-CAD solid modeling system. Initially published in 1987 at USENIX, the BRL-CAD ray-tracer was an early implementation of a parallel network distributed ray-tracing system that achieved several frames per second in rendering performance.[14] This performance was attained by means of the highly optimized yet platform independent LIBRT ray-tracing engine in BRL-CAD and by using solid implicit CSG geometry on several shared memory parallel machines over a commodity network. BRL-CAD's ray-tracer, including the REMRT/RT tools, continue to be available and developed today as Open source software.[15] Since then, there have been considerable efforts and research towards implementing ray tracing in real time speeds for a variety of purposes on stand-alone desktop configurations. These purposes include interactive 3D graphics applications such as demoscene productions, computer and video games, and image rendering. Some real-time software 3D engines based on ray tracing have been developed by hobbyist demo programmers since the late 1990s.[16] The OpenRT project includes a highly optimized software core for ray tracing along with an OpenGL-like API in order to offer an alternative to the current rasterisation based approach for interactive 3D graphics. Ray tracing hardware, such as the experimental Ray Processing Unit developed at the Saarland University, has been designed to accelerate some of the computationally intensive operations of ray tracing. 
On March 16, 2007, the University of Saarland revealed an implementation of a high-performance ray tracing engine that allowed computer games to be rendered via ray tracing without intensive resource usage.[17] On June 12, 2008 Intel demonstrated a special version of Enemy Territory: Quake Wars, titled Quake Wars: Ray Traced, using ray tracing for rendering, running in basic HD (720p) resolution. ETQW operated at 14-29 frames per second. The demonstration ran on a 16-core (4 socket, 4 core) Xeon Tigerton system running at 2.93 GHz.[18] At SIGGRAPH 2009, Nvidia announced OptiX, a free API for real-time ray tracing on Nvidia GPUs. The API exposes seven programmable entry points within the ray tracing pipeline, allowing for custom cameras, ray-primitive intersections, shaders, shadowing, etc. This flexibility enables bidirectional path tracing, Metropolis light transport, and many other rendering algorithms that cannot be implemented with tail recursion.[19] Nvidia has shipped over 350,000,000 OptiX capable GPUs as of April 2013. OptiX-based renderers are used in Adobe AfterEffects, Bunkspeed Shot, Autodesk Maya, 3ds max, and many other renderers. AMD enabled real-time ray-tracing on Vega graphics cards through GPUOpen Radeon ProRender. [20] Nvidia has announced real-time ray-tracing on their Quadro RTX workstation graphics cards. The Nvidia GeForce 20 series of video cards have real-time ray tracing capabilities. 
Imagination Technologies offers a free API called OpenRL which accelerates tail recursive ray tracing-based rendering algorithms and, together with their proprietary ray tracing hardware, works with Autodesk Maya to provide what 3D World calls "real-time raytracing to the everyday artist".[21] In 2014, a demo of the PlayStation 4 video game The Tomorrow Children, developed by Q-Games and SIE Japan Studio, demonstrated new lighting techniques developed by Q-Games, notably cascaded voxel cone ray tracing, which simulates lighting in real-time and uses more realistic reflections rather than screen space reflections.[22] The upcoming game MechWarrior 5: Mercenaries is stated to feature ray tracing. As of 2018, the option is admitted to be a strain even on the highest-end graphic cards.[23] |
01-24-2019 04:29 PM | |
WannaBeOCer |
Been playing around with the Quake demo on my RTX 2080 the last few days.
Question. Are the RTX's Tensor cores actually being utilized here, the same as they are in something like Port Royal? Or is there some kind of "trickery" going on to get the game running on the RTX cards? RT cores are being used for the path tracing, while the Tensor cores are being used for AI-accelerated denoising. There isn't any trickery: path tracing at low sample counts produces noise, and they trained an AI to denoise it. http://brechpunkt.de/q2vkpt/ Edit: DxR can use DirectML to denoise, which seems to be what AMD will use for their denoising. https://blogs.msdn.microsoft.com/dir...r-2018-update/ Quote:
ML techniques such as denoising and super-resolution will allow hardware to achieve impressive raytraced effects with fewer rays per pixel. We expect DirectML to play a large role in making raytracing more mainstream
|
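The reason denoising matters so much is that Monte Carlo path tracing converges slowly: the per-pixel noise only shrinks with the square root of the sample count. A minimal, self-contained Python sketch illustrates the scaling (the "pixel" with true brightness 0.5 is made up for the example; nothing here comes from Q2VKPT or DirectML):

```python
import random

# Estimate a pixel's brightness by averaging random light contributions.
# Each "ray" returns 0 or 1; the true brightness is 0.5.
def pixel_estimate(n_samples, rng):
    return sum(rng.random() < 0.5 for _ in range(n_samples)) / n_samples

# Standard deviation of the estimate across many trials, i.e. the visible noise.
def spread(n_samples, trials=2000, seed=1):
    rng = random.Random(seed)
    estimates = [pixel_estimate(n_samples, rng) for _ in range(trials)]
    mean = sum(estimates) / trials
    var = sum((e - mean) ** 2 for e in estimates) / trials
    return var ** 0.5

print(spread(1))    # ~0.5  : 1 sample per pixel is extremely noisy
print(spread(16))   # ~0.125: 16x the samples only cuts the noise by 4x
```

Halving the noise costs four times the rays, which is why a denoiser that produces a clean image from just a few samples per pixel is such a big win.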
01-24-2019 04:21 PM | |
CelticGamer |
Been playing around with the Quake demo on my RTX 2080 the last few days. Question. Are the RTX's Tensor cores actually being utilized here, the same as they are in something like Port Royal? Or is there some kind of "trickery" going on to get the game running on the RTX cards? |