
· Registered · 401 Posts · Discussion Starter · #1
Is it heat? GPUs/CPUs? Is it the games themselves not being coded ambitiously enough (poly counts, etc.)?

How can a high-end Intel CPU with quad-SLI Titans not be able to produce pre-rendered, cut-scene-like gameplay? It's 2013, and from what I hear we aren't even near 4K gaming yet?
 

· Premium Member · 5,414 Posts
Consoles are what's holding it back. Why? Because they make developers the most money, so the majority of development time naturally caters to them. Consider what it would take to build something that far ahead on PC: the cost, the amount of optimization required, and how little return it would likely bring, since PC is the only platform with hardware capable of it and only a minority of PC owners could actually run something that advanced. It's not surprising it hasn't happened yet.

When consoles are capable, then you'll see it happen.
 

· Registered · 20 Posts
No, it's physics.

Rendering photo-realistic (or even cut-scene-level) graphics in real time is impossible.

First, a game has far more than just rendering going on: AI, occlusion, physics (collisions, environment, etc.), distance/level-of-detail handling. Even if a game weren't doing all that, to get that cut-scene look you need:
1) ray tracing (games use ray casting instead)
* this includes shading, refraction, opacity, etc.
2) radiosity (games fake it with approximations)
3) much higher-detail texture/bump mapping
4) more complex blur (motion blur, depth of field)

Even on a monster rendering PC you would have to sacrifice a lot of detail to hit real time. Of course, a lot of these problems are parallelizable (hence render clusters), so as CPUs get faster and gain more cores, it will eventually become possible.
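To put a rough number on the ray-tracing part, here's a toy Python sketch (the scene and numbers are my own illustration, not from any real engine) that fires just one primary ray per pixel of a 1080p frame at a single sphere. Each pixel is independent of the others, which is exactly the parallelism render clusters exploit, and even this degenerate case is already about two million rays before you add bounces, shadows, or global illumination.

```python
# Toy sketch: per-pixel cost of even the simplest ray tracing.
# Hypothetical scene; real renderers trace many rays per pixel with
# many bounces, which is why offline frames take minutes to hours.
import math

WIDTH, HEIGHT = 1920, 1080          # one 1080p frame
SPHERE_CENTER = (0.0, 0.0, -5.0)    # a single sphere in front of the camera
SPHERE_RADIUS = 1.0

def ray_hits_sphere(ox, oy, oz, dx, dy, dz):
    """Return True if a ray from (ox, oy, oz) along unit (dx, dy, dz) hits the sphere."""
    cx, cy, cz = SPHERE_CENTER
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    b = 2.0 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - SPHERE_RADIUS ** 2
    return b * b - 4.0 * c >= 0.0   # discriminant of the quadratic

def render_frame():
    hits = 0
    for y in range(HEIGHT):
        for x in range(WIDTH):
            # One primary ray per pixel; every pixel is independent,
            # which is why the work parallelizes across cores/GPUs.
            dx = (x / WIDTH) * 2.0 - 1.0
            dy = 1.0 - (y / HEIGHT) * 2.0
            dz = -1.0
            length = math.sqrt(dx * dx + dy * dy + dz * dz)
            if ray_hits_sphere(0.0, 0.0, 0.0, dx / length, dy / length, dz / length):
                hits += 1
    return hits

if __name__ == "__main__":
    # ~2 million primary rays for ONE sphere and ZERO bounces. Add shadows,
    # reflections, refraction and radiosity-style bounces and the ray count
    # multiplies by orders of magnitude per frame.
    print(f"Primary rays traced: {WIDTH * HEIGHT:,}, sphere hits: {render_frame():,}")
```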
 

· Registered · 626 Posts
Isn't it a mixture of both, market and tech?

Quad-SLI Titans won't necessarily be enough to render cut-scene-like gameplay, but you can get fairly close if the optimizations are done right. Photo-realistic won't be possible for quite a while, though; it needs some other tech breakthroughs (which could very well happen).

It's not quite possible yet, and without a market able to keep up, developers probably won't bother, although someone is bound to try sooner or later.
 

· Registered · 3,845 Posts
Market and hardware, mostly. We can get scary-good renders via software like C4D, Maya, etc., but it takes crazy amounts of time to do a single frame (a few hours on a desktop). That's mostly because the rendering is done on the CPU, though. It might be possible on GPUs, but you'd need some crazy 3x Titan setup, and even then you might not hold 30 fps the whole time.
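For a rough sense of the gap, here's a quick calculation that assumes (purely for illustration) two hours of offline render time per frame against a 30 fps budget:

```python
# Back-of-envelope gap between offline render times and a real-time budget.
# The 2-hour-per-frame figure is an assumption for illustration; actual
# C4D/Maya render times vary wildly per scene.
OFFLINE_SECONDS_PER_FRAME = 2 * 60 * 60   # assume ~2 hours per frame
REALTIME_BUDGET_SECONDS = 1.0 / 30.0      # 30 fps target

speedup_needed = OFFLINE_SECONDS_PER_FRAME / REALTIME_BUDGET_SECONDS
print(f"Speedup needed for real time: ~{speedup_needed:,.0f}x")
# -> ~216,000x, which is why "just add more GPUs" doesn't close the gap.
```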

The real issue is accurate caustics and ray tracing, both of which are murder on render times. We've got real-time SSAO, shadowing (no accurate penumbra), and lighting down fairly well; it's just the details. The other issue is texture size. Movie-quality stuff uses crazy high-res textures, which would flood even a 6 GB card.
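Quick back-of-envelope on the texture point, with sizes that are assumptions rather than figures from any actual production:

```python
# Rough VRAM math for film-quality textures (all sizes are assumptions
# for illustration, not numbers from a specific movie pipeline).
BYTES_PER_TEXEL = 4                      # uncompressed RGBA8
texture_res = 8192                       # one 8K x 8K texture map

one_map_gb = texture_res ** 2 * BYTES_PER_TEXEL / 1024 ** 3
maps_per_asset = 4                       # e.g. diffuse, normal, specular, displacement
assets_on_screen = 50                    # a fairly modest scene

total_gb = one_map_gb * maps_per_asset * assets_on_screen
print(f"One 8K map: {one_map_gb:.2f} GB, scene total: {total_gb:.0f} GB")
# A single uncompressed 8K RGBA map is ~0.25 GB; 50 assets with 4 maps each
# is ~50 GB -- far beyond a 6 GB card without heavy compression and streaming.
```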

Outside of that there's the cost. Pre-rendered cut-scenes aren't cheap, and doing them in real time would probably be even more expensive.

Also, no engine is currently coded for it. You'd basically have to write an engine from the ground up to handle that many polys, that kind of shading, etc.
 