In addition to the new LPE buffers we presented earlier, iray now also has another neat user buffer: The Irradiance Buffer.
Independent of the materials at the primary hit (i.e. the textures, procedurals, layering, BRDFs, etc. that describe a material), it contains the amount of (simulated) incoming light for all points visible from the camera (not the most rigorous formulation of irradiance, but hopefully an understandable one :)).
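For those who prefer the textbook formulation: irradiance is the incident radiance integrated over the hemisphere above the point, weighted by the cosine of the angle of incidence,

```latex
E(x) = \int_{\Omega} L_i(x, \omega)\,\cos\theta\,\mathrm{d}\omega
```

measured in W/m² (or in lux for its photometric counterpart, illuminance, which is what the measurements below use).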
Previously, hacks like replacing all materials with a purely white diffuse material were needed to get a very rough approximation of the irradiance, but this breaks down easily as the materials in the scene get more complex, when a lot of transparency and refraction is involved, and/or when indirect lighting dominates the rendered view.
Note that, combined with the previous post on the Sun & Sky settings, this results in a neat workflow for estimating illuminance in lux or foot-candles. As an example: one gets around 100,000 lux when measuring this on a simple plane with the date set to today and the time to 13:37 (and around 20k lux for objects not in direct sunlight).
So here are some examples, using a false-color encoding (heat map):
Here the lux range is roughly 1k (only indirect / bluish) up to 100k (direct Sun & Sky / reddish).
The range in this picture is roughly 0.02k up to 90k.
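For readers who want to build such visualizations themselves, here is a minimal sketch of a false-color encoding. The buffer layout (a plain 2D array of lux values) and the color ramp are our assumptions for illustration, not what iray itself outputs:

```python
import numpy as np

def irradiance_to_heatmap(lux, lo=1e3, hi=1e5):
    """Map an irradiance buffer (2D array, in lux) to a blue-to-red heat map.

    lo/hi bracket the interesting range (here ~1k indirect up to ~100k
    direct sun), mapped logarithmically since the values span several decades.
    """
    log_lo, log_hi = np.log10(lo), np.log10(hi)
    t = np.clip((np.log10(np.maximum(lux, 1e-6)) - log_lo) /
                (log_hi - log_lo), 0.0, 1.0)
    r = np.clip(2.0 * t - 1.0, 0.0, 1.0)   # ramps up in the upper half
    b = np.clip(1.0 - 2.0 * t, 0.0, 1.0)   # ramps down in the lower half
    g = 1.0 - r - b                        # peaks in the middle
    return np.dstack([r, g, b])            # H x W x 3, values in [0, 1]

# Example: a tiny synthetic buffer with a 'sunlit' and a 'shadowed' half.
buf = np.full((2, 2), 2e4)
buf[0, :] = 1e5
print(irradiance_to_heatmap(buf))
```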
Procedural Sun & Sky (and setting it up)
A change that slipped through mostly unmentioned is the procedural Sun & Sky. Previously, iray had to bake environment shaders upfront; this supported all kinds of custom shaders in the most compatible way and avoided manual shader replacements. Nowadays we try to support as much as possible natively in the core.
As a result, Sun & Sky can now be changed interactively, of course without losing any efficiency in the various importance sampling techniques implemented inside iray.
So now for the correct settings to work with luminance values that are mostly consistent with those of real-world sun and sky light on a clear day (note that, for historical reasons, the settings differ slightly between mental ray and the Iray Integration Framework (neuray)):
rgb_unit_conversion: 1.0, 1.0, 1.0
multiplier: 0.31831 (mental ray mia_physicalsky) or 0.10132 (neuray sun_and_sky)
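In case you wonder where these magic constants come from: to five digits they are 1/π and 1/π², presumably normalization factors of the respective implementations. Easy to verify:

```python
import math

# To five digits, the two multipliers are 1/pi and 1/pi^2:
print(1.0 / math.pi)        # 0.31830... -> mental ray mia_physicalsky
print(1.0 / math.pi ** 2)   # 0.10132... -> neuray sun_and_sky
```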
btw: In 3ds Max, use the Daylight System (Create -> Systems), set it to use mr Sun and mr Sky (Modify -> Daylight Parameters: Sunlight / Skylight), and change the multiplier in both to 0.31831. To make the settings perfect, one must also pick the matching tonemapper (as it is matched with the Sun & Sky settings internally): mr Photographic Exposure Control (Rendering -> Environment -> Exposure Control). There, change the Physical scale to Unitless with a value of 1.0. Then simply tweak the exposure via the Exposure Value (EV) afterwards, depending on time of day, indoor vs. outdoor scene, etc.
As it has been a while since the last update (shame on us, but we have been busy implementing useful features :)), here is a (technically) less spectacular, but rather popular and useful new feature: section planes.
Of course, multiple section planes can also be combined:
In addition, a section plane can either act as if it cuts off the geometry completely (letting the light in, as in the pictures above), or just let the viewer take a peek inside at the unaltered lighting.
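Conceptually, a section plane is just a half-space test during ray traversal. A tiny sketch of the two modes described above, with names of our own choosing and a plane given as n·x + d = 0 (not iray's actual interface):

```python
import numpy as np

def clipped(point, normal, d):
    """True if 'point' lies on the cut-away side of the plane n.x + d = 0."""
    return np.dot(normal, point) + d < 0.0

def hit_is_valid(hit_point, ray_is_primary, normal, d, camera_only):
    """Decide whether an intersection survives the section plane.

    camera_only=False: the plane cuts the geometry for all rays, so light
    (and shadows) pour in through the opening, as in the pictures above.
    camera_only=True: only camera rays peek through; secondary rays still
    see the full geometry, leaving the lighting unaltered.
    """
    if camera_only and not ray_is_primary:
        return True
    return not clipped(hit_point, normal, d)
```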
(Fun fact: section planes were actually part of iray pretty early on, but were kicked out again because nobody really used them back then.)
Instant Relighting & Nonphysical Effects
Today, we would like to give you a glimpse of one of the new features of our renderer.
Often, the image generated by the renderer contains all the effects you want, and looks roughly right. But you still may want to emphasize some areas, increase a highlight here, dim a caustic there, maybe tweak the color of a light source. You could modify your scene and re-render, but re-rendering frequently may be too costly. Image composition is often a cheaper alternative, and it lets you create effects which are not physically plausible. With iray, you can render different bits of information into different images. You can then use your favorite image manipulation software to tweak what you don’t like, without having to worry about breaking the things you do like.
Starting with iray 3.0 and the Iray Photoreal mode in Iray 2013, we support Light Path Expressions (LPEs), which allow you to do this separation. An LPE is a regular expression that matches some of the light transport paths that iray generates, but not others. Each result buffer can be associated with an LPE so that only paths which match the expression end up contributing to that buffer. Iray also allows you to render several buffers with different LPEs at the same time at almost no additional runtime cost. LPEs can distinguish between different surface properties such as diffuse or glossy, reflection or refraction, types of light sources, and names.
Some renderers use Arbitrary Output Variables (AOVs) to achieve a similar separation. While this allows access to some things that LPEs don’t give you, AOVs usually require modification of shader or material code. LPEs, on the other hand, can be used without modifying scenes or materials.
So, what do these expressions look like and how can they be used in practice? Take, for example, this image of a glass of whiskey illuminated by two light sources.
Let’s say we want to enhance the visibility of the caustic on the left and tweak the color of the rear light source. First of all, let’s get all contributions from the rear (environment) light source. We are interested in light emitted by the environment (“Le”) that bounces from any type of surface any number of times and then hits the eye (“E”). As in standard regular expressions, the dot character matches any interaction, and the star operator means “repeat any number of times”. So, we get “Le .* E” for our first buffer.
The other light is an area light, so we can filter for “La”. Or, we can filter for the names of groups of lights as well as individual ones. The caustics we are interested in have had any number of arbitrary interactions with the scene before hitting a specular surface and then ending up on some diffuse (or glossy) surface. The corresponding LPE is “La .* S (D|G) E”. Instead of “(D|G)”, we can also use “[DG]” or “[^S]”.
We can filter all the other paths into the final buffer, for example, by specifying “La (.* (S | [^S] .) | .?) E”.
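To make the semantics a bit more tangible, here is a toy sketch that encodes light paths as strings (one character per event) and evaluates the three LPEs above as ordinary regular expressions. This only illustrates the matching idea; iray itself evaluates LPEs during light transport, not on strings:

```python
import re

# One character per event, from the light source to the eye:
#   'e' = emission from the environment ("Le"), 'a' = area light ("La"),
#   'D'/'G'/'S' = diffuse/glossy/specular interaction, 'E' = eye.
paths = {
    "eDE":   0.8,   # environment -> diffuse surface -> eye
    "aDSDE": 0.3,   # area light -> ... -> specular -> diffuse: a caustic
    "aGE":   0.5,   # area light -> glossy surface -> eye
}

# The three LPEs from the text, transcribed as regexes over that encoding:
env_all  = re.compile(r"^e.*E$")                 # "Le .* E"
caustics = re.compile(r"^a.*S[DG]E$")            # "La .* S (D|G) E"
rest     = re.compile(r"^a(.*(S|[^S].)|.?)E$")   # all remaining "La" paths

def buffer_sum(expr):
    """Total contribution of all paths that land in one buffer."""
    return sum(c for p, c in paths.items() if expr.match(p))

print(buffer_sum(env_all), buffer_sum(caustics), buffer_sum(rest))
# -> 0.8 0.3 0.5 : each path contributes to exactly one buffer
```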
Now, we can modify and recombine these images in any way we want. 75% environment contribution, 110% area light caustics, 100% other area light contributions:
50% environment contribution tinted blue, 100% area light contribution tinted orange:
60% environment contribution, 30% area light caustics tinted orange, 100% other area light contributions tinted orange:
Depending on the operations we apply, the result is exactly the same as re-rendering with the corresponding changes applied to the scene, but it is instantaneous: no re-rendering is necessary. In addition, we can achieve looks that we cannot get directly from simulated light transport.
Remember to disable tone mapping and gamma correction during rendering and apply them after compositing.
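For illustration, a minimal compositing sketch in the spirit of the recipes above (the loader and file names are placeholders, not iray output conventions): since the LPE buffers are linear HDR data, they can be weighted, tinted, and summed freely, with tone mapping applied once at the very end.

```python
import numpy as np

def load_hdr(path):
    # Placeholder for your favourite HDR/EXR loader; here we just
    # fabricate linear buffers of the right shape for the sketch.
    return np.full((480, 640, 3), 0.5, dtype=np.float32)

env      = load_hdr("env.exr")        # "Le .* E"
caustics = load_hdr("caustics.exr")   # "La .* S (D|G) E"
rest     = load_hdr("rest.exr")       # remaining area light paths

# e.g. 60% environment, 30% caustics tinted orange, 100% of the rest:
orange = np.array([1.0, 0.6, 0.2], dtype=np.float32)
result = 0.6 * env + 0.3 * orange * caustics + 1.0 * rest

# Tone mapping (here a simple global operator) and gamma come last:
ldr = np.clip(result / (1.0 + result), 0.0, 1.0) ** (1.0 / 2.2)
```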
This bit of eye candy shows the magic of a thin beam of light being refracted through a (diamond!) prism.
iray Next & Dispersive Caustics
Today, we have another great example of the improvements coming in the next major release of iray.
Here’s our test scene rendered with “iray classic” (what you would expect from iray 2.x):
And here’s the scene rendered with “iray next”:
This scene makes heavy use of subsurface scattering, caustics, and dispersion (those two gargoyles are made of diamond!). What about render times? The first image took about 2h00m to render, the second about 2h30m (using two Quadro 6000s with 16 CPU threads).
Note that neither image is completely converged: we only rendered them to 95% completion to explicitly show the difference in noise and convergence behaviour.
iray Next: Sneak Peek
We’ve been working on improving iray to make it better at rendering difficult lighting conditions. The results are good so far, so we thought: how about a preview?
Let’s first look at the scene used in our post on frosted glass (http://blog.irayrender.com/post/19731699592/frosted-glass-part-ii). Before & after:
Those “shadows” are actually indirect transmissive glossy illumination.
Now let’s look at a more concrete example. Before & after:
Much of the illumination passes through translucent curtains, a pretty complex lighting effect which gets partly filtered out in the “before” shot. Also notice the reflective caustic casting an indirect shadow on the wall, behind the chair.
And finally, what happens when the sun shines on a wavy, polished floor?
The living room and bathroom models are courtesy of Autodesk, with content from Turbosquid. Thanks!
Shadow Acne and the Shadow Terminator
Looking at the title of this post, you may wonder whether the iray guys have now completely lost their minds. But this is actually more or less scientific terminology for effects that every artist has (unfortunately) experienced at least once. iray (fortunately) solves one of them, but fails to solve the other, as do all other physically based rendering packages out there. So this post will try to explain why these problems happen and how to work around (or rather avoid) the second one.
Shadow acne usually shows up as dark dots on some surfaces in a rendered scene. While these (usually) ‘vanish’ with an increased number of samples, this is actually not part of the classical noise ‘problem’ that path-tracing-based renderers have to deal with. Instead it is caused by numerical precision issues in the underlying rasterization or ray tracing implementation. The classic workaround is to expose a global epsilon/offset parameter that, if one is unlucky, must be tweaked for each scene separately (i.e. most users will have stumbled over this at some point). iray, though, avoids this automatically, without user intervention or scene-specific settings, so the first part of this post ends with a simple solution. ;)
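For those who never had to touch such a knob themselves, a minimal sketch of the classic epsilon workaround described above (which iray makes unnecessary):

```python
import numpy as np

EPSILON = 1e-4  # the global, scene-dependent knob users had to tweak

def shadow_ray_origin(hit_point, geometric_normal):
    """Nudge secondary-ray origins along the geometric normal so the ray
    does not immediately re-intersect the triangle it started on -- those
    false self-intersections are what show up as dark 'acne' dots."""
    return np.asarray(hit_point) + EPSILON * np.asarray(geometric_normal)
```

Too small an epsilon and the acne stays; too large and contact shadows detach, which is exactly why the per-scene tweaking was so unpopular.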
Unfortunately, the second issue, the shadow terminator, is currently unsolved when using physically based materials within a path tracer. This problem will not go away with huge numbers of samples either.
So what happens? At the end of the day, almost all renderers on the market work on triangle or quad meshes. Even if the scene is set up completely with SubDivs, NURBS, or whatever your favourite brand is, the render engine will most likely repackage that into a triangle soup before being able to work with it. To avoid visible edges/facets due to the triangulation of the mesh(es), one can (and should) use vertex/shading normals, which smooth these out by interpolation during the rendering process itself.
One downside of this technique is that the physical properties of the mesh are now somewhat jeopardized: while the ray tracing or rasterization works on the actual, ‘physical’ triangulated mesh, most of the material calculations are based on the fuzzy interpolated normals. So, as you might have guessed from this explanation, at some point artifacts must show up in the picture because of this, right?
And that’s not all, you can even make it worse: simply assign a complicated bump or normal map to a material. This can (for some(!) scenarios) amplify the effect even more. Here is a pretty extreme example:
The reason boils down to a simple explanation: one tries to manipulate physics by perturbing the original geometric normals of a mesh, which is something that the ray tracing/rasterization, which has to rely on the original mesh, cannot automatically cope with under all circumstances.
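The conflict can be written down in two dot products. A small sketch of the condition under which terminator artifacts appear (the names are ours, for illustration):

```python
import numpy as np

def terminator_conflict(light_dir, shading_normal, geometric_normal):
    """The problematic case: the interpolated shading normal claims the
    light is above the horizon while the actual triangle faces away from
    it (or vice versa). The BRDF then gets evaluated for light that the
    'physical' mesh cannot receive, producing the harsh terminator."""
    lit_shading   = np.dot(light_dir, shading_normal)   > 0.0
    lit_geometric = np.dot(light_dir, geometric_normal) > 0.0
    return lit_shading != lit_geometric
```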
With classical shader-based materials there are ways to work around this issue most of the time through careful implementation (for example, some mental ray shaders can do this automatically). Unfortunately, this is not the case for all physically based materials, such as the BRDF models and the accompanying path tracing simulation used by iray (unless one is willing to pay the price of various other artifacts that could then appear).
Fortunately, the workaround is pretty simple: if you experience shadow terminator artifacts on a mesh, simply increase the triangulation, or, in the case of bump/normal maps, use displacement maps instead (which look much more realistic on mesh silhouettes anyhow).
We have received some questions in the past on (the lack of) texture filtering options in iray. With this post I want to shed some light on this, and on why we decided not to bother our users with such low-level implementation details.
By design, iray doesn’t require common texture anti-aliasing techniques like MIP/SAT-mapping and trilinear filtering, or even view-dependent techniques like anisotropic filtering and EWA (elliptically weighted averaging to oversample the texture within a pixel/sample). The reason is very simple: as the image generation process in iray is completely based on path tracing, each pixel is filled with hundreds to thousands of samples, implicitly oversampling any texture that shows up in that pixel by the same amount. So instead of playing it safe and pre-blurring the texture (which is basically what happens when using one of the mentioned texture filtering techniques), iray delivers the best possible texture filtering automatically. In addition, iray does not even use bilinear filtering for the reconstruction of the texels, but a higher-order scheme that increases the crispness even more (especially noticeable when using bump or normal maps).
mental ray using no filtering
mental ray using pyramid filtering
iray using “no” explicit filtering
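Finally, to make the implicit-oversampling argument concrete, a toy sketch: averaging many jittered, completely unfiltered texture lookups across a pixel’s footprint converges to the correctly filtered result, which is effectively what the path tracing samples do for free (the footprint parameter and texture are made up for the demo).

```python
import numpy as np

rng = np.random.default_rng(1337)
texture = rng.random((256, 256))  # stand-in for an arbitrary texture

def nearest(u, v):
    """Unfiltered nearest-texel lookup, u/v in [0, 1)."""
    h, w = texture.shape
    return texture[int(v * h) % h, int(u * w) % w]

def pixel_value(u, v, footprint, samples=512):
    """Average many jittered lookups across the pixel's texture footprint.

    With hundreds of samples per pixel, the texture is oversampled
    implicitly -- no pre-blurred MIP level required, and no detail is
    lost to overly conservative pre-filtering.
    """
    ju = u + (rng.random(samples) - 0.5) * footprint
    jv = v + (rng.random(samples) - 0.5) * footprint
    return np.mean([nearest(a, b) for a, b in zip(ju, jv)])

print(pixel_value(0.5, 0.5, footprint=0.01))
```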
iray Material for 3ds Max
This blog has seen several posts on the capabilities of the iray material model. So far we’ve shown:
- Metallic flakes
- Sub-surface scattering
- Thin film coating
And there’s a lot more that we haven’t shown.
Today, life gets a little more wonderful for all 3ds Max users: we’re finally introducing the iray Material plugin for 3ds Max (2013).
The plugin is designed to be powerful, simple to use, and iray-friendly. Take a look at what a programmer can do in a couple of minutes:
… and imagine what you could do!
You can download the plugin from the link posted in our public forums:
P.S.: The image above shows, from left to right: metallic flakes, thin film coating, and backscattering.