Vray

Dealing with Ptex displacement by Xuan Prada

Render using Ptex displacement.

What if you are working with Ptex but need to do some kind of Zbrush displacement work?
How can you render that?

As you probably know, Zbrush doesn't support Ptex. I'm not a super fan of Ptex (but I may be soon), but sometimes I don't have the time, or simply don't want, to create proper UV mapping. So, if Zbrush doesn't export Ptex and my assets don't have any sort of UV coordinates, can't I use Ptex at all for my displacement information?

Yes, you can use Ptex.

Base geometry render. No displacement.

  • In the image below, I have a detailed 3D scan which has been processed in Meshlab to reduce the crazy polygon count.
  • Now I have imported the model into Zbrush via .obj. Only 500,000 polys, but it still looks great.
  • We are going to use Zbrush to create a very quick retopology for this demo. We could use Maya or Modo to create a production-ready model.
  • Using the Zremesher tool, which is great for some types of retopology tasks, we get this low-res model. Good enough for our purpose here.
  • The next step is to export both models, high and low resolution, as .obj
  • We are going to use these models in Mudbox to create our Ptex-based displacement. Yes, Mudbox does support Ptex.
  • Once imported, keep both of them visible.
  • Export the displacement maps. Have a look at the image below for the options you need to tweak.
  • Basically, you need to activate Ptex displacement, 32-bit depth, the texel resolution, etc.
  • And that's it. You should be able to render your Zbrush details using Ptex now.

Zbrush to Maya and Vray 2.0 by Xuan Prada

I know how tricky it can sometimes be to make your Zbrush displacements look great outside Zbrush.
Maya, Softimage, V-Ray, Renderman or Arnold, just to name a few, each treat Zbrush displacements in a slightly different way.
Let me explain my way of exporting displacement from Zbrush to Maya and V-Ray 2.0.

First of all, if you are working with a final asset, you will have to export your displacement using the base geometry you imported into Zbrush. If you did the sculpt from scratch in Zbrush, you may want to export your lowest subdivision mesh, create good UV mapping and re-project your sculpted detail onto that mesh.
If this is the case, follow these steps.

  • Go to the lowest subdivision level.
  • Turn off all your layers.
  • Export as .obj
  • This is the object that you are going to render. If you imported a base mesh earlier, you won't need to export it again; it will already be in your 3D application.
  • Go back to the highest subdivision level.
  • Turn on all your layers.
  • Go down to the lowest subdivision level.
  • Store a new morph target and import the previously exported .obj, or your original base mesh from your 3D application.
  • Your sculpted model will be replaced by the original mesh with no sculpt information.
  • Click on switch morph target to activate your sculpted mesh again.
  • You are ready to export the displacement maps; just check my settings below for 16-bit, 32-bit and vector displacement.
  • Finally, to set up your shaders and render settings for Zbrush displacements in Maya and V-Ray 2.0, check my previous post about it.

Zbrush displacement in V-Ray for Maya by Xuan Prada

It is always a bit tricky to set up Zbrush displacements in the different render engines.
If you recently moved from Mental Ray or another engine to V-Ray for Maya, maybe you should know a few things about displacement maps extracted from Zbrush.

Here is a simple example of my workflow for dealing with this kind of map in V-Ray.

  • First of all, drag and drop your 16-bit displacement map onto the displacement channel inside the shading group attributes.
  • Maya will create a displacement node for you in the Hypershade. Don't worry too much about this node; you don't need to change anything there.
  • Select your geometry and add the V-Ray extra attributes to control the subdivision and displacement properties (there is a scripted sketch of this setup after the list).
  • If you exported your displacement with subdivided UVs, you should enable that property in the V-Ray attributes.
  • Edge length and Max subdivs are the most important parameters. Play with them until you reach nice results.
  • Displacement amount is the strength of your displacement, and displacement shift should be minus half of your displacement amount if you are using 16-bit textures. For example, an amount of 2 needs a shift of -1.
  • If you are using 32-bit .exr textures, the displacement shift should be 0 (zero).
  • Select your 32-bit .exr file node and add the V-Ray attribute called allow negative colors.
  • Render and check that your displacement is looking good.
  • I've been using these displacement maps: 16-bit and 32-bit.
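
If you prefer to script this setup, below is a minimal sketch using Maya Python and the vray addAttributesFromGroup MEL command that V-Ray for Maya uses to add its extra attributes. The shape and file node names are hypothetical, and the exact group and attribute names (vray_subquality, vrayEdgeLength, vrayDisplacementAmount, vrayFileAllowNegColors, etc.) are assumptions that can differ between V-Ray versions, so verify them on your build.

```python
# Minimal sketch of the manual steps above, for V-Ray for Maya.
# Attribute/group names are assumptions; verify them in your V-Ray version.
import maya.cmds as cmds
import maya.mel as mel

shape = 'myAssetShape'   # hypothetical shape node
disp_file = 'dispFile'   # hypothetical file node holding the 32-bit .exr

# Add the V-Ray extra attribute groups for subdivision and displacement control.
mel.eval('vray addAttributesFromGroup "%s" "vray_subdivision" 1;' % shape)
mel.eval('vray addAttributesFromGroup "%s" "vray_subquality" 1;' % shape)
mel.eval('vray addAttributesFromGroup "%s" "vray_displacement" 1;' % shape)

# Edge length and Max subdivs are the most important quality parameters.
cmds.setAttr(shape + '.vrayEdgeLength', 4.0)
cmds.setAttr(shape + '.vrayMaxSubdivs', 256)

# 32-bit .exr: displacement shift stays at 0.
# For a 16-bit map it would be minus half the amount (amount 2.0 -> shift -1.0).
cmds.setAttr(shape + '.vrayDisplacementAmount', 1.0)
cmds.setAttr(shape + '.vrayDisplacementShift', 0.0)

# Allow negative colors on the 32-bit .exr file node.
mel.eval('vray addAttributesFromGroup "%s" "vray_file_allow_negative_colors" 1;' % disp_file)
cmds.setAttr(disp_file + '.vrayFileAllowNegColors', 1)
```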

Vray sss test by Xuan Prada

Just testing Vray’s SSS shader for realistic skin look-dev purposes.
I ended up with the theory that it would be quite simple to set up a nice, realistic and cheap SSS shader for human and creature assets. I love the raytraced solid scatter, but with complex models I can't get rid of some of the artifacts in the SSS channel.
I will post more quite soon.

  • To achieve better results, I like to combine SSS shaders with VRayMtl shaders, which have better solutions for speculars and reflections. With this method the reflection of the surface is controlled by the BRDF instead of the poor spec control of the SSS shader. There is a rough sketch of this kind of setup below.
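
A rough sketch of that combination, assuming a VRayFastSSS2 base blended with a VRayMtl coat through a VRayBlendMtl; the blend attribute names used here (base_material, coat_material_0, blend_amount_0) are assumptions and may be named differently in your V-Ray build.

```python
# Rough sketch: VRayMtl drives the BRDF reflections on top of a VRayFastSSS2 base.
# The VRayBlendMtl attribute names are assumptions; check the node in your version.
import maya.cmds as cmds

sss = cmds.shadingNode('VRayFastSSS2', asShader=True, name='skinSSS')
spec = cmds.shadingNode('VRayMtl', asShader=True, name='skinSpec')
blend = cmds.shadingNode('VRayBlendMtl', asShader=True, name='skinBlend')

# Kill the coat's diffuse so it only contributes reflection/specular.
cmds.setAttr(spec + '.color', 0, 0, 0, type='double3')

cmds.connectAttr(sss + '.outColor', blend + '.base_material', force=True)
cmds.connectAttr(spec + '.outColor', blend + '.coat_material_0', force=True)
cmds.setAttr(blend + '.blend_amount_0', 0.5, 0.5, 0.5, type='double3')
```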

Love Vray's IBL by Xuan Prada

When you work for a big VFX or animation studio you usually light your shots with different complex light rigs, often developed by highly talented people.
But when you are working at home, for small studios, or on freelance tasks, you need to simplify your techniques and try to reach the best quality you can.

For those reasons, I have to say that I’m switching from Mental Ray to V-Ray.
One of the features I love most about V-Ray is the awesome dome light for creating image-based lighting setups.

Let me tell you a couple of things which make the dome light so great.

  • First of all, the technical setup is incredibly simple. Just a few clicks: activate linear workflow, correct the gamma of your textures and choose a nice HDRI image (there is a scripted sketch of this setup after the list).
  • It is quick and simple to reduce the noise generated by the HDRI image. Increasing the maximum subdivisions and decreasing the noise threshold should be enough. Something between 25 and 50, or 100 at most, as max subdivisions should work in common situations, and something like 0.005 is a good value for the threshold.
  • The render times are really fast, even with heavy raytraced stuff.
  • Even using global illumination the render times are more than good.
  • Displacement, motion blur and that kind of heavy stuff are also welcome.
  • Another thing that I love about the dome light with HDRI images is the great quality of the shadows. Usually you don't need to add direct lights to the scene. If the HDRI is good enough you can match the footage really fast and accurately enough.
  • The dome light has some parameters to control the orientation of your HDRI image, and it is quite simple to get a nice preview in the Maya viewport.
  • In all the renders that you can see here, you probably noticed that I'm using an HDRI image with "a lot" of different lighting points, around 12 different lights in the picture. In this example I made the background black and replaced all the lights with white spots. It is a good test to get a better idea of how the dome light treats direct lighting. And it is great.
  • The natural light is soft and nice.
  • These are some of the key points why I love V-Ray's dome light :)
  • On the other hand, I don't like doing look-dev with the dome light. It is really, really slow, so I can't recommend this light for that kind of task.
  • The trick is to turn off your dome light and create a traditional IBL setup using a sphere and direct lights, or plug your HDRI image into V-Ray's environment and turn on global illumination.
  • Work on your shaders there and then move back to the dome light.
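
For reference, here is a minimal scripted sketch of that dome light setup in Maya Python, assuming the VRayLightDomeShape node from V-Ray for Maya; the attribute names (useDomeTex, domeTex, subdivs) and the HDRI path are assumptions, so check them against your version.

```python
# Minimal sketch: dome light driven by an HDRI, with the noise settings mentioned above.
# Node and attribute names are assumptions; verify them in your V-Ray version.
import maya.cmds as cmds

# Create the dome light (normally done from the V-Ray lights menu).
dome = cmds.shadingNode('VRayLightDomeShape', asLight=True, name='iblDomeShape')

# Feed the HDRI through a regular file texture.
hdri = cmds.shadingNode('file', asTexture=True, name='hdriFile')
cmds.setAttr(hdri + '.fileTextureName', '/path/to/environment.hdr', type='string')
cmds.setAttr(dome + '.useDomeTex', 1)
cmds.connectAttr(hdri + '.outColor', dome + '.domeTex', force=True)

# Noise control: raise the light subdivs (25-100 as discussed above).
# The 0.005 noise threshold lives in the DMC sampler section of the render settings.
cmds.setAttr(dome + '.subdivs', 32)
```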

My favourite V-Ray passes by Xuan Prada

Working with V-Ray recently, I found that these are the render passes I use most often.
Simple scene, simple asset, simple textures and shading, and simple lighting, just to show my render passes and pre-compositing stuff. There is a small script for creating these passes after the list.

  • Global Illumination
  • Direct lighting
  • Normals
  • Reflection
  • Specular
  • Z-Depth
  • Occlusion
  • Snow (or up/down)
  • UVs
  • XYZ (or global position)
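
If you prefer to create these passes with a script, here is a sketch using the vrayAddRenderElement MEL command from V-Ray for Maya; the element type names are assumptions that may vary per version, and the occlusion, snow, UV and world-position passes usually need ExtraTex or sampler-style elements configured by hand.

```python
# Sketch: create the main render elements listed above with V-Ray's MEL command.
# Element type names are assumptions; check the V-Ray render elements UI for yours.
import maya.mel as mel

for element in ('giChannel',        # Global Illumination
                'lightingChannel',  # Direct lighting
                'normalsChannel',   # Normals
                'reflectChannel',   # Reflection
                'specularChannel',  # Specular
                'zdepthChannel'):   # Z-Depth
    mel.eval('vrayAddRenderElement %s;' % element)

# Occlusion, snow (up/down), UVs and XYZ position are typically ExtraTex elements
# pointing at a VRayDirt or a sampler-style texture, set up by hand afterwards.
mel.eval('vrayAddRenderElement ExtraTexElement;')
```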

Example renders of each pass: RGB, GI, direct lighting, normals, occlusion, reflection, snow, specular, UVs, XYZ global position, and a slapcomp.

Linear Workflow in Maya with Vray 2.0 by Xuan Prada

I'm starting a new job with V-Ray 2.0 for Maya. I have never worked with this render engine before, so first things first.
One of my first tasks is to create a nice neutral light rig for testing shaders and textures. Setting up linear workflow is one of my priorities at this point.
Find below a quick way to set this up.

  • Set up your gamma. In this case I'm using 2.2.
  • Enable "don't affect colors (adaptation only)" if you want to keep the render linear and apply the gamma correction in post; leave it off if you want the gamma baked into the final render. No big deal either way.
  • The linear workflow option is something created by Chaos Group to fix old V-Ray scenes which don't use LWF. You shouldn't use it at all.
  • Click on affect swatches to see the color pickers with the gamma applied.
  • Once you are working with the gamma applied, you need to correct your color textures. There are two different ways to do it.
  • First option: add a gamma correction node to each color texture node. In this case I'm using gamma 2.2, which means I need to use a value of 0.455 (1/2.2) on my gamma node. There is a scripted sketch of this after the list.
  • Second option: instead of using gamma correction nodes for each color texture node, you can select the texture node and add a V-Ray attribute to control this.
  • By default all the texture nodes are read as linear. Change your color textures to be read as sRGB.
  • Click on view as sRGB in the V-Ray frame buffer, otherwise you'll see your renders in the wrong color space.
  • This is the difference between rendering with the option "don't affect colors" enabled or disabled. As I said, no big deal.
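
Here is a small sketch of the first option using standard Maya nodes (gammaCorrect and file); the VRayMtl shader is only there as an example target. For the second option the V-Ray attribute name on the file node varies per version, so I leave it out.

```python
# Sketch of option one: de-gamma an sRGB color texture with a gammaCorrect node.
# 1 / 2.2 is roughly 0.455, which linearises the texture before it hits the shader.
import maya.cmds as cmds

diffuse = cmds.shadingNode('file', asTexture=True, name='diffuseFile')
shader = cmds.shadingNode('VRayMtl', asShader=True, name='testMtl')
gamma = cmds.shadingNode('gammaCorrect', asUtility=True, name='deGamma')

# Set 0.455 on all three channels of the gamma node.
for axis in ('X', 'Y', 'Z'):
    cmds.setAttr('%s.gamma%s' % (gamma, axis), 0.455)

cmds.connectAttr(diffuse + '.outColor', gamma + '.value', force=True)
cmds.connectAttr(gamma + '.outValue', shader + '.color', force=True)
```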

Inverse dirt maps by Xuan Prada

Sometimes it is very useful to generate inverse occlusion bakes as an interesting starting point for painting our dirt maps.
V-Ray's dirt texture is perfect for this goal, but if you don't work with V-Ray, it is very easy to do the same with Mental Ray and the Binary Alchemy shaders.

  • You need to install the Binary Alchemy shaders. Some packages are free and you will have to pay for others.
  • Apply a "surface shader" to the object and connect a "BA_color_raylenght" node to it.
  • Put this shader in "Inverted Normal" mode and play with its parameters.
  • We get an inverted "ambient occlusion".
  • Use a "blend colors", "layered shader" or similar to combine this inverted occlusion with a nice bitmap. There is a rough sketch of this network below.
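
A rough sketch of this network in Maya Python, assuming the Binary Alchemy plugin is loaded; the BA node name is taken from the post, and its output attribute as well as the blend values are assumptions to verify against the installed shaders.

```python
# Rough sketch: inverted occlusion combined with a dirt bitmap via blendColors.
# The Binary Alchemy node name and its output attribute are assumptions.
import maya.cmds as cmds

surface = cmds.shadingNode('surfaceShader', asShader=True, name='invOccShader')
ray_len = cmds.createNode('BA_color_raylenght', name='invOcc')  # name as used in the post
cmds.connectAttr(ray_len + '.outValue', surface + '.outColor', force=True)
# Set the node's "Inverted Normal" mode in the Attribute Editor before baking.

# Combine the baked inverted occlusion with a dirt bitmap as a painting start point.
occ_bake = cmds.shadingNode('file', asTexture=True, name='invOccBake')
dirt_map = cmds.shadingNode('file', asTexture=True, name='dirtBitmap')
blend = cmds.shadingNode('blendColors', asUtility=True, name='dirtMix')
cmds.connectAttr(occ_bake + '.outColor', blend + '.color1', force=True)
cmds.connectAttr(dirt_map + '.outColor', blend + '.color2', force=True)
cmds.setAttr(blend + '.blender', 0.5)
```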