Hi there,
I'm wondering whether an object's position affects rendering time when it is assigned the mib_amb_occlusion shader.
Here is the situation.
I created a simple scene with one polygon sphere and one polygon plane at the origin. I assigned the mib_amb_occlusion shader to both objects, tweaked the occlusion sample settings (samples: 256, spread: 1.0, everything else at default) and the render quality settings (sampling mode: custom sampling, min sample level: 1, max sample level: 2, everything else at default). It took 1 minute to render. Then I applied a relative translation of (10000, 10000, 10000) to the camera and both objects, so the camera's framing stayed identical but the whole scene was shifted far from the origin. Rendering the ambient occlusion then took around 1 minute 45 seconds. The render time increased, which is strange! With a larger scene, the render time even doubled.
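For reference, here is roughly how I set the test up, as a Maya Python sketch (assuming Maya with the Mayatomr plugin loaded; names like 'aoShader' and 'aoSG' are just placeholders from my scene):

```python
import maya.cmds as cmds

# Test geometry at the origin.
sphere = cmds.polySphere(name='testSphere')[0]
plane = cmds.polyPlane(name='testPlane', width=10, height=10)[0]

# mib_amb_occlusion texture: samples 256, spread 1.0, rest at default.
ao = cmds.shadingNode('mib_amb_occlusion', asTexture=True, name='aoTex')
cmds.setAttr(ao + '.samples', 256)
cmds.setAttr(ao + '.spread', 1.0)

# Drive a surface shader with the occlusion result and assign it to both objects.
shader = cmds.shadingNode('surfaceShader', asShader=True, name='aoShader')
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name='aoSG')
cmds.connectAttr(ao + '.outValue', shader + '.outColor', force=True)
cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader', force=True)
cmds.sets(sphere, plane, edit=True, forceElement=sg)

# Second run: move the camera and both objects by (10000, 10000, 10000),
# so the framing is unchanged but the scene sits far from the origin.
for node in ('persp', sphere, plane):
    cmds.move(10000, 10000, 10000, node, relative=True)
```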
Of course, rendering time depends on the machine's capability and the render settings. However, in theory the render time should be more or less the same when the render settings are identical; the only difference here is the camera's position and the objects' coordinates. I assumed mental ray would build a virtual coordinate system based on the camera's position when rendering ambient occlusion, but obviously I was wrong.
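One more guess, and this is purely an assumption on my part: if mental ray traces rays in single precision, the smallest representable step around coordinate 10000 is thousands of times coarser than near the origin, which might make the intersection code work harder or behave differently. A quick way to see the spacing difference:

```python
import numpy as np

# Smallest representable float32 step at two different magnitudes
# (only an illustration of float precision, not mental ray internals).
print(np.spacing(np.float32(1.0)))      # ~1.19e-07 near the origin
print(np.spacing(np.float32(10000.0)))  # ~9.77e-04, about 8192x coarser
```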
I've tried render passes, but that took much longer. So, is there any workaround for rendering scenes whose objects are far from the origin? Or it would be great if an expert could shed some light on how mental ray computes ambient occlusion. Any ideas would be appreciated. Thank you in advance!