How rendering should be

General discussion about 3D DCC and other topics
Kzin
Posts: 432
Joined: 09 Jun 2009, 11:36

Re: How rendering should be

Post by Kzin » 21 Aug 2012, 12:54

CiaranM wrote:
ActionArt wrote: I have. So far they're not what I was hoping for, and certainly not fast. Also (for me) very confusing and difficult to set up. Volume rendering just seems like a natural fit for the GPU. I haven't tried the new Fury yet, so I will at some point.
Volume rendering can be very memory intensive, especially if you're rendering several different types of data (density, temperature etc.), so maybe not so suited for the GPU?

That's the point: the massive amount of data used in today's feature films. I think most people underestimate this a lot. The tech demos from Nvidia are nice, but they are extremely low res, without details and small in scale. My advice is to do some tests with FumeFX, see what you need to get some nice volumes, and you will see how many resources you really need. The idea of using the GPU is off the table right away then.
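As a rough illustration of the kind of memory footprint being described here (a back-of-envelope sketch; the grid resolution and channel list are assumptions, not figures from any production or tech demo), even a single mid-resolution smoke grid outgrows a 2012-era GPU:

Code:

# Back-of-envelope memory estimate for one dense voxel grid.
# Resolution and channel list are illustrative assumptions only.
resolution = (512, 512, 512)                      # voxels per axis
channels = ["density", "temperature", "fuel",
            "vel_x", "vel_y", "vel_z"]            # typical smoke/fire channels
bytes_per_value = 4                               # 32-bit float

voxels = resolution[0] * resolution[1] * resolution[2]
total_bytes = voxels * len(channels) * bytes_per_value
print("%.1f GiB" % (total_bytes / 2.0**30))       # -> 3.0 GiB for this one grid

That single grid is already more than the 1.5 GB on a stock GTX 580, before geometry, textures and the renderer itself need any memory, and production shots often carry several grids at higher resolutions.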

User avatar
Mathaeus
Posts: 1778
Joined: 08 Jun 2009, 21:11
Location: Zagreb, Croatia
Contact:

Re: How rendering should be

Post by Mathaeus » 21 Aug 2012, 14:24

Maximus wrote: It's not a matter of days to switch to another render engine, whatever frustration you have, because you have to learn everything again, especially when you encounter problems.
Yeah, the first divorce is a pain :) - I'll never forget my fanatic days of..... Pov-Ray :). The next divorce is a pain too, but not as much....

SreckoM
Posts: 187
Joined: 25 Jul 2010, 00:18
Skype: srecko.micic

Re: How rendering should be

Post by SreckoM » 21 Aug 2012, 15:23

Kzin wrote:
SreckoM wrote: I said something similar on the MR forum, about most studios using other render engines with SI nowadays ... man, I almost ended up in jail ... So be careful what you say :D
You threw in a general statement from your limited view which is not true at all, and you were corrected (btw, I had the same experience with the people I know).
The problem is that you don't answer anymore, like a lot of people who come into the forum, bash MR and go away. I don't know why they don't defend their opinion. If you write this, name the companies, what they do and WHY they switched. That's the important point, the WHY. The mi developers read this forum more often than you think, and they discuss it. But none of that makes sense when all you wrote is a single line that helps no one.

MR sucks? Why does it suck? ;)

I really don't have the will to elaborate on what my point was in that post. But honestly, do you think I am under the false impression that a major portion of SI studios have switched to Arnold or 3Delight in the last several years?

@milanvasek
Can you elaborate on your statement? You might have more trustworthy sources than I do.

And the "why they switched" question, honestly, it is not my job to answer that; let them do the research if they need an answer.
- H -

milanvasek
Posts: 143
Joined: 09 Jun 2009, 12:12
Location: Czech Republic
Contact:

Re: How rendering should be

Post by milanvasek » 21 Aug 2012, 15:45

SreckoM wrote:
@milanvasek
Can you elaborate on your statement? You might have more trustworthy sources than I do.

And the "why they switched" question, honestly, it is not my job to answer that; let them do the research if they need an answer.
Well, I'm not sure it's the right thing to talk about specific studios; I feel it's kind of an internal matter.
But I worked on three animated feature films in the last two years, and two of them were Softimage+Arnold (the other one was Maya+3delight).
Here on SI-Community there is also a thread about Arnold where you can see commercials from The Mill, Glassworks, Psyop etc. I'm not saying they are not using MR at all anymore, because I don't know if they do, but you can clearly see Arnold being used more and more...
Milan Vasek
ceramic artist & softimage fan
http://www.milanvasek.com

Kzin
Posts: 432
Joined: 09 Jun 2009, 11:36

Re: How rendering should be

Post by Kzin » 21 Aug 2012, 15:55

SreckoM wrote: And the "why they switched" question, honestly, it is not my job to answer that; let them do the research if they need an answer.
As I wrote, it's OK to write down critiques, especially now that things are changing, so one feature or another can still be implemented before the release of mr 3.11.
But saying "others are switching" does not help in a discussion about features. Saying WHY they switched helps to start thinking about options that could be implemented.
"They switched" by itself says nothing about why they use a different renderer.
So go to the mi forum and write down some things you would like to have as options for the new shaders; they will hear you. ;)


The Mill is using Arnold mostly for hair.
Arnold is used more and more, that's right, but not without problems. The renderer came at the right time and can solve some of the current problems like GI; usability is another one. Implementation is the key here: no one would use Arnold in SI if the implementation were bad and buggy like MR's is in some parts.

The whole mental ray/AD implementation thing was a warning to everyone about how not to do it, and that is good.

User avatar
ActionArt
Posts: 853
Joined: 25 Nov 2010, 18:23
Location: Canada

Re: How rendering should be

Post by ActionArt » 21 Aug 2012, 16:18

Kzin wrote: The whole mental ray/AD implementation thing was a warning to everyone about how not to do it, and that is good.
Absolutely. This is the real heart of the problem. MR itself is not a bad renderer, and it is more flexible than most. It's the lack of integration, and the fact that we're about 3 or 4 years behind in getting features that are just sitting there, that is most frustrating.

All of the alternative renderers have major issues; if any of them were the default renderer in SI, people would be very upset.

When big developments like progressive rendering, IBL and Iray are just left on the table, it angers customers, especially when AD appears to have such huge resources at its disposal but just doesn't care enough to get it done. This goes for Max, Maya and SI, all of which are in a bad state.

Back to my original thought: ditch the stupid HQVP and get progressive, IBL and Iray properly integrated! Really, for something like progressive rendering, how hard can that be? Basically an on/off switch!

User avatar
ActionArt
Posts: 853
Joined: 25 Nov 2010, 18:23
Location: Canada

Re: How rendering should be

Post by ActionArt » 21 Aug 2012, 16:33

Nizar wrote: Yes, Cycles is nice too, but it is half baked, and many features are not implemented or are only in beta (e.g. Cycles cannot render particles). You can actually get fast results with the GPU, but the big development effort is on the CPU side, and within a year they will have fast CPU rendering without memory issues (the GPU will be used, if I understood correctly, only for preview purposes). The Blender developers say GPU development is too complex and has too many limits, so they designed Cycles from the beginning as a hybrid CPU/GPU renderer. The viewport preview is very fast (IMHO many steps ahead of HQV in quality and speed), but the OpenGL viewport is very poor; they have a Google Summer of Code project (a complete rebuild of the viewport OpenGL code following the new OpenGL 2.0 standard), and in Blender 2.65 Viewport FX will be ready to rock.

I think Blender must be respected. Sometimes I read caustic or hilarious comments about it; many think it is not on par only because it is free, or the common thought is that Blender is a poorly implemented piece of software due to its open-source nature, where anyone can put their hands in to add this or that feature. All wrong: Blender is one of the most coherent and well-structured pieces of software out there. Its development began before Maya's, and it has different paradigms, but it is far different from an agglomerate of plugins like 3dsmax, because the Blender Foundation has strong control over the whole process (and the process is moving fast and strong, like a train).

I am astonished at how fast and user-friendly Blender development is (and all of its users are free users...).
They have the right plan. This is how it should be for now. GPU for preview, CPU for final render if memory is an issue. That's where it's needed and most useful. Go Blender!

This is what MI was trying to do with Iray but AD threw it under the bus. One of the reasons given was that Iray didn't support motion blur. Well, who needs that for preview? And practically speaking, MR doesn't really support motion blur either ;)
Last edited by ActionArt on 21 Aug 2012, 16:35, edited 1 time in total.

User avatar
Mathaeus
Posts: 1778
Joined: 08 Jun 2009, 21:11
Location: Zagreb, Croatia
Contact:

Re: How rendering should be

Post by Mathaeus » 21 Aug 2012, 16:35

And..... yes, while we are on volume rendering, Blender has a very nice package, that is, fully voxel-based simulation and rendering - far superior to anything I saw in Mental Ray. There are a few nice tuts at Blender Diplom.

Right now, the main problem seems to be the limited import of cameras from SI to Blender, but it seems the Blender people are working on better Collada import; possibly it will appear in 2.64.
I've noticed a few inconsistencies with mapping particles (no way to add a per-particle texture projection), but hopefully this has nothing to do with the full voxel sim. All in all, definitely worth a try - actually I couldn't stop playing for a whole weekend...

User avatar
ActionArt
Posts: 853
Joined: 25 Nov 2010, 18:23
Location: Canada

Re: How rendering should be

Post by ActionArt » 21 Aug 2012, 16:39

Mathaeus wrote: actually I couldn't stop playing for a whole weekend...
Me too :D

They seem to have really gained some momentum.

User avatar
ActionArt
Posts: 853
Joined: 25 Nov 2010, 18:23
Location: Canada

Re: How rendering should be

Post by ActionArt » 21 Aug 2012, 16:46

Kzin wrote: And I don't think GPU rendering is much faster. I am playing around with Octane Render; it's great and fast for what it does, but 1080p noise-free renders also take 8-20 hours on my GTX 580 for more complex lighting situations.
I've just been playing with Blender, where you can easily switch between CPU and GPU, and the GPU is drastically faster for previewing. You're right that it takes time to get rid of the noise, which reduces its usefulness for final renders, but wow, is it nice when you're setting up a scene, which to me is where it's most useful. I don't care if it takes all night after that, just not while I'm sitting there!
Kzin wrote: Weta's PantaRay solution is completely different. Weta renders all the area lights with PantaRay, bakes the result, and uses it as a lookup in RenderMan. So they use two renderers, with RenderMan as a custom solution with all their own stuff. I think that only works in a bigger pipeline.
True. But couldn't that concept be implemented in one render engine (MR)? Weta just happens to use RenderMan, but Nvidia helped develop PantaRay, so they have the code, and they own MR, so... put the two together?
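For anyone who wants to script the CPU/GPU switch mentioned above, here is a minimal Blender Python sketch (the property names are the Cycles settings as I understand them; the user-preference path for enabling CUDA devices differs between Blender versions, so treat this as a starting point rather than gospel):

Code:

import bpy

def set_cycles_device(device):
    """Switch Cycles between 'CPU' and 'GPU' for the current scene.

    Note: the GPU must also be enabled as a compute device in the user
    preferences (the exact property path varies by Blender version),
    otherwise Cycles silently falls back to the CPU.
    """
    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'   # make sure Cycles is the active engine
    scene.cycles.device = device     # 'GPU' for fast previews, 'CPU' for final frames

# Quick look-dev setup: GPU device, low sample count (noisy but interactive).
set_cycles_device('GPU')
bpy.context.scene.cycles.samples = 100

# Overnight final frames: CPU device (no VRAM limit), more samples.
# set_cycles_device('CPU')
# bpy.context.scene.cycles.samples = 1000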

User avatar
gustavoeb
Moderator
Posts: 587
Joined: 21 Jul 2010, 00:33
Skype: gustavoboehs

Re: How rendering should be

Post by gustavoeb » 21 Aug 2012, 16:56

Accelerating area lights, or AO, with the GPU on existing renderers is not such a big deal, because there is no actual shader evaluation in those cases. But since in theory anyone can write their own MR shaders, it is not practical to run those through the GPU.

When you talk about mimicking the PantaRay workflow, that goes in the complete opposite direction from working interactively, since there are many passes and all. The only actual reason for them to work that way is to use raytracing on really huge datasets (read: a gazillion polygons)... there is no reason for such a workflow on less demanding projects.
Gustavo Eggert Boehs
Blog: http://www.gustavoeb.com.br/

User avatar
ActionArt
Posts: 853
Joined: 25 Nov 2010, 18:23
Location: Canada

Re: How rendering should be

Post by ActionArt » 21 Aug 2012, 17:08

gustavoeb wrote: When you talk about mimicking the PantaRay workflow, that goes in the complete opposite direction from working interactively, since there are many passes and all. The only actual reason for them to work that way is to use raytracing on really huge datasets (read: a gazillion polygons)... there is no reason for such a workflow on less demanding projects.
Could this not be done internally/automatically as one pass, though? Since there is no shader evaluation, why would this interfere with custom shaders?

All I know is that calculating area lights eats up a LOT of time for me, so it would be significant I think.

Just have to wait and see what happens I guess...

Kzin
Posts: 432
Joined: 09 Jun 2009, 11:36

Re: How rendering should be

Post by Kzin » 21 Aug 2012, 17:36

ActionArt wrote: All I know is that calculating area lights eats up a LOT of time for me, so it would be significant I think.
How many lights do you have in your scene? Do you use falloffs? How many samples per light do you use? Are you using the physical light node with the threshold option? It accelerates rendering a lot in scenes with more lights. Not as fast as with MIS, but that will come in the next version.

User avatar
ActionArt
Posts: 853
Joined: 25 Nov 2010, 18:23
Location: Canada

Re: How rendering should be

Post by ActionArt » 21 Aug 2012, 18:02

Kzin wrote: How many lights do you have in your scene? Do you use falloffs? How many samples per light do you use? Are you using the physical light node with the threshold option? It accelerates rendering a lot in scenes with more lights. Not as fast as with MIS, but that will come in the next version.
I was talking more in general, over a number of different projects, but typically I use only 2 or 3 area lights, sometimes only 1. For the aviation projects I don't want any grain at all, so I have to use fairly high samples per light, usually 5 or 6.

When possible I use Holger's very nice area light node, which is significantly faster, but if there are 2 or more lights I sometimes run into a bug where the shadows are black and blotchy where the two lights' shadows overlap.

I haven't tried the physical light node, thanks for the tip. I'll give that a try.

It's just an observation that there is a drastic hit in render time when area lights are activated, so any improvement in that area would be most welcome.
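To put some rough numbers on why "fairly high samples per light" gets expensive so quickly (an illustrative sketch only; the exact multipliers depend on the renderer and its sampling settings):

Code:

# Illustrative cost estimate for mental-ray-style area lights.
# A classic area light takes roughly U x V shadow samples per shading
# point, and every additional area light adds its own set of rays.
samples_u, samples_v = 6, 6          # "usually 5 or 6" per axis
num_area_lights = 3

shadow_rays = samples_u * samples_v * num_area_lights
print(shadow_rays)                   # 108 shadow rays per shading point, before AA

# Monte Carlo noise falls off as 1/sqrt(N), so halving the visible grain
# needs roughly 4x the samples (and roughly 4x the area-light render time).
current_samples = samples_u * samples_v
halved_noise_samples = current_samples * 4
print(halved_noise_samples)          # 144 samples per light to halve the noise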

User avatar
gustavoeb
Moderator
Posts: 587
Joined: 21 Jul 2010, 00:33
Skype: gustavoboehs

Re: How rendering should be

Post by gustavoeb » 21 Aug 2012, 18:44

ActionArt wrote: Could this not be done internally/automatically as one pass, though?
It won't help, as it is still slow. The ONLY reason they do it is because they have an insane amount of geometry.
ActionArt wrote: Since there is no shader evaluation, why would this interfere with custom shaders?
You mixed up my two answers:
1. there is no evaluation of shaders in AO and shadows;
2. shader evaluation is (one of) the (probably impossible to overcome) bottlenecks in transferring MR to the GPU.
Gustavo Eggert Boehs
Blog: http://www.gustavoeb.com.br/

nuverian
Posts: 143
Joined: 29 Sep 2011, 23:25
Location: Greece
Contact:

Re: How rendering should be

Post by nuverian » 26 Aug 2012, 01:10

With every development and support effort going to Maya, and every third-party offering either not coming to XSI, being dropped from XSI, or being half implemented in XSI, it is tempting to finally move away from XSI. And the obvious next step is moving to Maya.
Thinking about it more, though, that might not be a good solution either. Autodesk might buy some new software next year and move all the good stuff into their new baby, and suddenly Maya becomes another XSI in terms of development lagging behind, just like Max is slowly doing. Then what? Oh well.. move again, I guess?
The thing is that people in the industry have to wake up (me included) and move not away from one piece of software or another, but rather away from AutoBot as a whole, to other, healthier companies, because I recently feel like someone is grabbing me by the balls (Greek phrase, but you get the point).
You know what, I hate this feeling, and who wouldn't, so give me a break or gracefully unplug yourself.
Portfolio / Blog
http://www.nuverian.net
