Originally Posted by ILoveHighDPI
Very cool, but you didn't exactly answer my question (not that there's any obligation; thank you for your time. And yes, I realize that not everyone knows everything and I may be asking too much).
I'm wondering if rendering time is actually a significant portion of the production cost vs. artist workload. From what I've read the polygon counts in CGI movies have always been insanely high just to give it a clean look (I'm thinking of Tron and Toy Story).
Point being it sounds like movies should inherently have more than enough detail for very high resolution displays, in which case rendering time may be the only thing that actually changes with higher resolution. If artist workload is the main production cost and that doesn't change, then the overall cost of doing higher resolution movies may not actually be a lot higher, you just have to sit there waiting longer (or quadruple the size/power of your rendering farm).
It actually surprises me that a few million dollars for a hardware upgrade would be an issue, what with movie budgets regularly running into hundreds of millions of dollars. I guess a publisher owned CGI studio would be different from something independently run. The big wigs wouldn't want to just give an independent company a free hardware upgrade.
When I say "big publisher" I'm thinking of projects headed by people like Peter Jackson or James Cameron where the publisher will throw everything they have at them. If the CGI department says their machines would take ten years to render everything in The Hobbit, then PJ gets them what it takes to fulfil his vision for the movie within his production schedule (or maybe I have unrealistic assumptions of how much control producers have over the budget?) in which case the upgrade would have to cost a whole stinking lot to make them hold back on their goals.
Ah okay, I understand more what you're asking now.
Rendering time isn't THAT bad in terms of overall production cost, but keep in mind here that you're thinking more along the lines of Weta Digital, ILM, Pixar, etc., guys that own large shares in the movies they make AND charge top dollar for their services. Rendering is a HUGE bottleneck though, and in really bad cases will start to waste artist time...then you really begin losing money.
Our firm isn't top-rung like them. We do tons of stuff on films with 80M+ budgets, and we get shots on A-list titles, but never at the scale that they do and we're more for fitting a budget than "we MUST get Mr. X to do our work so the film looks like we imagined!" Not to dog on us though, because we do crank out some absolutely kick ass work as you can see in that reel. We'll get paid something in the neighborhood of $17M for our work on Resident Evil 5 as an example. A big chunk of change, no doubt...but with 120+ artists, a three floor studio downtown, 600+ rackmounted Xeons, 120+ Xeon workstations, and a few hundred TB worth of servers, there probably isn't a ton left over. Upgrading the farm from 2P 1366/1P 1155 to 2P 2011 would be a significant portion of our yearly profits if I had to guess...maybe even put us in the red if it's all in one year.
Back to your other questions though about what might be involved in 4K rendering...you're correct in thinking that the detail already in our assets' models would probably more than hold up at 4K resolution. Fur and water effects probably would not, and would require heavier simulations and denser grooming/guide curve generation. I think a lot of our texture painting would need to be kicked up a notch: more detailed, more crisp. It would take some beast workstations to paint those well...64GB mem with a Quadro 6000 or something like that. We'd be talking about dozens of 8K texture maps on a single asset.
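To give a rough sense of why dozens of 8K maps push you toward 64GB workstations, here's a back-of-the-envelope sketch. The map count, channel count, and bit depth below are illustrative assumptions, not actual pipeline numbers:

```python
# Back-of-the-envelope texture memory estimate (illustrative numbers only).
width = height = 8192          # one 8K texture map
channels = 4                   # RGBA, assumed
bytes_per_channel = 2          # 16-bit half float, a common painting depth

bytes_per_map = width * height * channels * bytes_per_channel
maps_per_asset = 24            # "dozens" of maps, assumed

total_gb = bytes_per_map * maps_per_asset / 2**30
print(f"One 8K RGBA map: {bytes_per_map / 2**30:.2f} GB")   # 0.50 GB
print(f"{maps_per_asset} maps on one asset: {total_gb:.0f} GB uncompressed")  # 12 GB
```

That's ~12GB just holding one asset's maps uncompressed, before you count undo history, layers, the viewport, and everything else the painting app keeps resident, so a 64GB machine stops sounding extravagant.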
Big publishers have a lot of luxuries too, like access to power and infrastructure that we don't. We've basically maxed out the grid's delivery at our location right now; I don't think we can add more blades even if I wanted to, and it could be that a Xeon E5 upgrade isn't even possible for us right now.
Weta has a 6,000+ blade render farm to give you an idea...in a massive, private, data center that they built and wired from the ground up. The plebs like us have to create makeshift setups in whatever office building the studio space lives in.
So basically: Weta and ILM are Tier 1. Framestore, MPC, Cinesite, and a handful of others are Tier 1.5. Guys like us are Tier 2.
The Tier 1/1.5 guys wouldn't be hurt terribly badly by a move to 4K, but Tier 2 would be crushed right now if it were to happen.
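A rough way to see the size of the jump: render time scales more or less with pixel count, and 4K is four times the pixels of 2K, which lines up with the "quadruple the size/power of your rendering farm" estimate in the quoted question. A toy calculation, assuming simple linear scaling and ignoring per-frame overhead and memory pressure:

```python
# Toy pixel-count comparison (assumes render time scales linearly with pixels,
# which ignores per-frame overhead, memory pressure, and I/O).
resolutions = {
    "2K (2048x1080)": 2048 * 1080,
    "4K (4096x2160)": 4096 * 2160,
}
base = resolutions["2K (2048x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.1f}x the pixels of 2K")
# 4K comes out at exactly 4.0x the pixels of 2K
```

Tier 1 can absorb a 4x compute bill by expanding their own data centers; a Tier 2 shop capped by its building's power grid can't.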
Edited by kweechy - 12/7/12 at 9:15am