Texturing

Mari to Maya by Xuan Prada

Yes I know, making your Mari textures work inside Maya can be a bit weird, especially if you have never worked with multiple UV spaces before.

I hope to give you some clues with this quick and dirty step-by-step tutorial.

I’m using the blacksmith guy from The Foundry, which has 40 textures at 4k resolution each.

  • First of all check your model and UVs.
  • Export all your textures from Mari. You know, right click on the desired channel and export.
  • Now you can type the naming convention that you want to use. I like to use COMPONENT_UDIM.tif, COL_1001.tif for example.
  • Check your output folder. All your textures should have been exported.
  • Import your model into Maya and check the UV mapping. You need to understand how the UV tiles are numbered inside Maya in order to offset your texture maps.
  • The default UV tile is 0-0 (UDIM 1001), the next one to the right is 1-0 (1002), the first tile of the row above is 0-1 (1011), and so on.
  • Open the first texture map called COL_1001.tif in the hypershade and rename the image node to COL_1001 and the 2D placement node to UDIM_1001.
  • Do the same with all the textures.
  • Select all the texture nodes and open the attribute spread sheet.
  • Set the default color RGB to 0.
  • Select all the 2D place texture nodes and open again the attribute spread sheet.
  • Switch off wrapU and wrapV.
  • Type the proper offsets into translate frameU and translate frameV (see the Python sketch after this list).
  • Create a layered texture node.
  • Select all the texture image nodes and drag them with the MMB from an empty space of the hypershade to the layered texture node's attributes tab. This will create one layer for each texture map.
  • Delete the default layer.
  • Set the blending mode of all the layers to lighten.
  • Connect the layered texture to the color input of a shader of your choice.
  • Repeat the whole process for all your channels (SPEC, BUMP, DISP, etc.).
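If you prefer to script this setup, below is a minimal Python (maya.cmds) sketch of the same idea for a few color tiles. It assumes textures named COL_1001.tif, COL_1002.tif, etc. sitting in the project's sourceimages folder; the node names, paths and tile list are placeholders, and the Lighten blend mode index is worth double-checking in your Maya version.

import maya.cmds as cmds

udims = [1001, 1002, 1003, 1004]  # the color tiles exported from Mari
layered = cmds.shadingNode('layeredTexture', asTexture=True, name='COL_layered')

for i, udim in enumerate(udims):
    # one file node and one 2d placement node per UDIM tile
    tex = cmds.shadingNode('file', asTexture=True, name='COL_%d' % udim)
    place = cmds.shadingNode('place2dTexture', asUtility=True, name='UDIM_%d' % udim)
    for attr in ('wrapU', 'wrapV', 'translateFrame'):
        cmds.connectAttr('%s.%s' % (place, attr), '%s.%s' % (tex, attr))
    cmds.connectAttr(place + '.outUV', tex + '.uvCoord')
    cmds.connectAttr(place + '.outUvFilterSize', tex + '.uvFilterSize')
    cmds.setAttr(tex + '.fileTextureName', 'sourceimages/COL_%d.tif' % udim, type='string')

    # black default color and no wrapping, so each tile only shows inside its own range
    cmds.setAttr(tex + '.defaultColor', 0, 0, 0, type='double3')
    cmds.setAttr(place + '.wrapU', 0)
    cmds.setAttr(place + '.wrapV', 0)

    # offset the frame to the tile position: 1001 -> (0,0), 1002 -> (1,0), 1011 -> (0,1)
    cmds.setAttr(place + '.translateFrameU', (udim - 1001) % 10)
    cmds.setAttr(place + '.translateFrameV', (udim - 1001) // 10)

    # add the texture as a layer; 8 should be the Lighten blend mode
    cmds.connectAttr(tex + '.outColor', '%s.inputs[%d].color' % (layered, i))
    cmds.setAttr('%s.inputs[%d].blendMode' % (layered, i), 8)

Then connect COL_layered.outColor to the color input of your shader, exactly as in the manual steps above.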

Mari to Softimage by Xuan Prada

Recently I was involved in a master class about texturing and shading for animation movies, and as promised I’m posting here the technical way to set up different UV sets inside Softimage.
Super simple process and a really efficient methodology.

  • I’m using this simple asset.
  • These are the UVs of the asset. I’m using different UV sets to increase the quality. In this particular asset you can find four 4k textures for each channel: Color, Specular and Bump.
  • You probably realized that I’m using my own background image in the texture editor. I think this one is clearer for UV mapping than the default one. If you want, you can download the image, convert it to .pic and replace the original one located in C:\Program Files\Autodesk\Softimage 2012\Application\rsrc
  • This is the render tree set-up. Four 4k textures for color, specular and bump. Each set of four textures is mixed with a mix8color node.
  • Once everything is connected, you still need to offset each image node to match the UV ranges.
  • I know that the UV coordinates in Softimage are a bit weird, so find below a nice chart which will be very helpful for further tasks.
  • Keep in mind that you should turn off wrap U and wrap V for each texture in the UV editor.
  • Really quick render set-up for testing purposes.

Faking SSS in Softimage by Xuan Prada

SSS is a very nice shader which works really well with a good lighting setup, but it can be quite an expensive shader when you’re using Mental Ray.
Find below a couple of techniques to deal better with SSS. Just keep in mind that these tricks can improve your render times a bit, but they will never reach the same quality as using SSS itself.

  • I’m using this simple scene, with one key light (left), one fill light (right) and one rim light.
  • An SSS compound is connected to the material surface input, and the SSS_lightmap (you can find that node in the render tree -> user tools) is connected to the lightmap input of the SimpleSSS. Then the SimpleSSS lightmap is connected to the material lightmap input.
  • Set the output path and resolution of your lightmap.
  • Hit a render and check the render time.
  • Disconnect the lightmap.
  • Render again and check the render times as well. We have improved the times.
  • If you really need to fake the SSS and render very fast, you can bake the SSS to a texture using RenderMap, but keep in mind that the result will be much worse than using SSS. Anyway, you can do that for background assets or similar.
  • Now you can use another cheaper shader like blinn, phong or even constant with your baked SSS.
  • As you can see the render is now much faster.

Dealing with normal maps in Softimage by Xuan Prada

Yes I know, working with normal maps in Softimage is a bit weird sometimes, especially if you have worked before with the 3D Max normal+bump preset.

I’ve been using the same method over the years and it has suited me fine; maybe it will also be useful for you.
I prefer to generate the normal maps inside Softimage rather than in Mudbox or Zbrush; it usually works much better according to my tests with different assets.

  • So, you should import both geometries, high and low, into the same scene. Don’t be afraid of high poly meshes, Softimage allows you to import meshes with millions of polygons directly from Mudbox or Zbrush.
  • With both meshes in your scene, make sure they are perfectly aligned.
  • Check the UV mapping of the low resolution mesh.
  • Select the low resolution mesh and open the ultimapper tool.

- The most important options are:

  • Source: You have to click on your high resolution mesh.
  • Path: Where your normal map texture will be placed.
  • Prefix: A prefix for your texture.
  • Type: You can choose between different image formats.
  • Normal in tangent space: The most common normal map type.
  • Resolution: Speaks for itself.
  • Quality: Medium is fine. If you choose high, the baking time will increase a lot.
  • Distance to surface: Click the Compute button to generate this parameter.
  • Click Generate and Softimage will take some time to create the normal map.
  • The normal map is ready.
  • Hide your high resolution mesh.
  • Grab one of the MR shaders and drag it to your mesh.

- Use a normal map node connected to the bump map input of the shader.

  • Choose the normal map generated before.
  • Select the correct UVs.
  • Select tangents mode.
  • Uncheck unbiased tangents.
  • Hit a render and you’ll see your normal map in action.
  • Cool. But now one of the most common procedures is combining a normal map with a bump map.
  • I’m using the image above.
  • If you use a bump map generator connected into the bump map input you will have a nice bump map effect.
  • Find below the final render tree combining both maps, normal and bump.
  • The first bump map generator has two inputs: the color matte, which is a plain white color, and the normal map with the options I already commented on before. Be sure to select relative to input normal in the base normal option of the bump map generator.
  • The second bump map generator is your bump texture where you can control the intensity increasing or decreasing the factor value.
  • The vector math node allows you to combine both bump map generators.
  • Connect the first bump map generator to the first input and the second one to the second input.
  • In the operation option select vector input1 + vector input2.
  • Final render.

Baking between UV sets in Maya by Xuan Prada

One of the most useful workflows when you are texturing is baking your textures from one UV set to another. You will need to do this for different reasons; one of them, for example, could be using different resolution models with different UV mapping, or using a different UV mapping for grooming, etc.

The first time I tried to do this in Maya I realized that the Mental Ray Batch Bake tool doesn't work properly; I don't know why, but I couldn't use it.

I solved the problem using the Transfer Maps tool in Maya and decided to write it down for future reference.

  • Check the different UV sets in Maya.
  • Apply your textures and shaders.
  • I'm using six different shaders with six different texture maps.
  • If you use the Mental Ray Batch Bake tool (commonly used for baking purposes) and configure all the parameters, you'll realize that the baked maps are completely black. Something is wrong related to the UV sets. A bug? I don't know.
  • You need to use the Maya Transfer Maps tool. Lighting/Shading -> Transfer Maps.
  • Duplicate the mesh and rename the copies to source and target.
  • Select the target and its UV set.
  • Select source.
  • Select desired map to bake. (probably diffuse)
  • Select the path.
  • Select resolution.
  • Bake.
  • Your baked texture is ready.
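As far as I know, the Transfer Maps UI is driven by the surfaceSampler command, so the same bake can be scripted. A rough Python sketch, where the mesh names, the UV set and the output path are placeholders for your own scene:

import maya.cmds as cmds

# 'source' carries the shaders and textures, 'target' is the duplicate whose UV set we bake into
cmds.surfaceSampler(
    source='source',
    target='target',
    uvSet='uvSet2',            # UV set on the target mesh to bake into
    mapOutput='diffuseRGB',    # bake the diffuse color
    filename='sourceimages/baked_diffuse',
    fileFormat='tif',          # or any format your Maya build supports
    mapWidth=4096,
    mapHeight=4096)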

Mudbox and UDIMs by Xuan Prada

When you’re going to texture an asset which already has a displacement map, you’ll probably want to apply that displacement to your mesh before starting the painting process.

In my pipeline, I usually apply the displacement map in Mudbox and then I export the high resolution mesh to Mari.

The problem here is that Mudbox doesn’t allow you to work with displacement maps and multiple UV shells.

Below I try to find a solution for this problem.

  • Check your UV mapping in Maya.
  • I’m using these simple displacement maps here.
  • One map for each UV shell.
  • Export as .Obj
  • Open in Mudbox and subdivide.
  • Go to maps -> sculpt model using displacement map.
  • Select your mesh and your displacement map.

As you’ll realize, Mudbox doesn’t allow you to choose different maps for each UV shell, which means that Mudbox will only be able to sculpt using the displacement map for the U0-1 V0-1 coordinates. Big problem.

The way I’ve found to solve this problem is:

  • Go back to Maya.
  • Select your mesh and open the UV Texture Editor.
  • Select one of the UV shells which is outside of the default U0-1 V0-1 range.
  • Open the script editor and type -> polyEditUV -u -1 -v 0 ;
  • You’ll notice that the second UV shell is now placed in the default UV range, moved by exactly one unit, so your displacement texture will match perfectly.
  • Export again as .obj
  • Now you can use your displacement map in Mudbox without problems.
  • Repeat the process for each UV shell.
  • Commands to move UV shells by exactly one unit:

Move left -> polyEditUV -u -1 -v 0 ;

Move right -> polyEditUV -u 1 -v 0 ;

Move up -> polyEditUV -u 0 -v 1 ;

Move down -> polyEditUV -u 0 -v -1 ;
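If you have several shells to shuffle in and out of the default range, the same moves can be scripted with Python (maya.cmds); a minimal sketch, where the helper name is my own and it operates on whatever UVs are currently selected:

import maya.cmds as cmds

def move_selected_uvs(du, dv):
    # relative move of the selected UVs, in whole UV-tile steps
    cmds.polyEditUV(u=du, v=dv)

move_selected_uvs(-1, 0)   # bring a shell one tile to the left, into the default range
# ...export the .obj and sculpt in Mudbox, then move the shell back if needed:
move_selected_uvs(1, 0)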

Selection groups in Mari by Xuan Prada

When you are working with huge assets it is very useful to keep everything organized.
One of the best ways to do it inside Mari is using selection groups.

  • Go to view -> palettes -> selection groups.
  • Select faces, elements or objects.
  • Click on the plus icon to create a new selection group based on your current selection.
  • You can create different selection groups based on different parts of your asset.
  • Now you can be focused on just one specific area of your asset.

Save as in Mari by Xuan Prada

If you are going crazy trying to find the save as button in Mari, don’t worry, it's not there.
The best way to save as in Mari is using the snapshots tool.
It is not exactly the same as a save as option, but it is quite similar.

  • In this example I have a version of the robot with some flat colors as texture maps.
  • Open the snapshot window, under view -> palettes -> snapshots.
  • Create a new snapshot and name it as you want, for example v001.
  • Keep working on your textures, channels, shaders, etc.
  • When you want to save as, instead of going to file -> save as (the traditional way), go to your snapshots window and create a new snapshot.
  • If you want to switch between versions just select the thumbnail and click on revert.

UDIMs in BodyPaint by Xuan Prada

Step by step tutorial.

  • Export your object from Maya with multiple UDIMs.
  • You can start your texture work from scratch or using any kind of baked stuff.
  • Import your .obj geometry in BodyPaint and create two different materials, one for each UV layout created before in Maya.
  • Create a new texture for the color channel of each material, or connect your textures if you baked previously.
  • Drag both materials over the geometry.
  • You can only see the last material dragged in the viewport, because BodyPaint doesn’t handle multi-material jobs at the same time.
  • Click on the objects tab, select the material for the UV layout 1 channel and check that the X and Y offsets are both 0.
  • If you click on the texture tab you’ll realize that the UVs and texture match perfectly.
  • Select the material for UV layout 2 and switch the texture used for this material. You’ll realize that something is wrong: the UV mapping and texture don’t match.
  • You will need to change the X offset to 100 and then it will work fine.
  • You can change viewport visualization switching from one shader ball to another one.

BodyPaint only allows you to work with one material at a time, so you will need to switch between both materials to paint properly.

UDIMs workflow, Maya to Mari by Xuan Prada

Sometimes it is very useful to work with different UV ranges, especially when you are working with huge assets and high detail is needed.
Find below my workflow dealing with this kind of stuff.

  • Unfold the UVs in different ranges.
  • If you need to bake procedurals, dirtmaps or whatever, keep in mind to change the UV range in the baking options.
  • I always use the same naming convention: UxxVxx.tif, for example 0101.tif or 2301.tif (see the small helper after this list).
  • Create a new project in Mari.
  • Check if the UVs are placed correctly.
  • Create a new channel called “base” and import your baked textures into it.
  • Ready to keep working.
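For reference, those UxxVxx names and UDIM tile numbers are related by simple arithmetic. A tiny Python sketch (the function names are mine, and I’m assuming the first two digits are the U index and the last two the V index):

def tile_name(u, v, ext='tif'):
    # UxxVxx-style file name for 1-based tile indices, e.g. (1, 1) -> '0101.tif'
    return '%02d%02d.%s' % (u, v, ext)

def udim_from_tile(u, v):
    # matching UDIM number; only valid for u <= 10, since a UDIM row is 10 tiles wide
    return 1000 + u + (v - 1) * 10

print(tile_name(1, 1), udim_from_tile(1, 1))   # 0101.tif 1001
print(tile_name(2, 3), udim_from_tile(2, 3))   # 0203.tif 1022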

Introduction to channels in Mari by Xuan Prada

  • Create a new project in Mari.
  • Create a new channel called “base”.
  • Adjust size, color space and color.
  • Right click on “base channel” to import a texture bitmap as base color.
  • We already have our cube with the blue base color.
  • Create a new shader called “blueCube” and choose as texture the “base channel” created before.
  • Create a new channel called “underPaint”.
  • Adjust size, color space and color.
  • Right click on the “underPaint” channel to import a nice under paint texture map.
  • You can see it in the viewport.
  • Create another channel called “underPaintMask”.
  • Adjust size, color space and color.
  • Import the under paint mask texture into the “underPaintMask” channel.

Looks awesome in the viewport.

  • Select the “blueCube shader” and add a “new shader module”.
  • Select “masked diffuse” from the list.
  • As base texture select “underPaint channel”.
  • As mask texture select “underPaintMask channel”.
  • Invert the mask.
  • Adding more layers is very simple; you just need to add more channels.
  • Create a new channel called “rust”.
  • Adjust the size, color space and color.
  • Import a rust map into the channel.
  • Check it in the viewport.
  • Add a new channel called “rustMask”.
  • Adjust size, color space and color.
  • Go back to the shaders tab and select the “blueCube shader”.
  • Add a new shader module and select “masked diffuse” from the list.
  • Select the “rust channel” as “base texture” and “rustMask channel” as “mask texture”.
  • Invert the mask.
  • Select the “rustMask channel” and paint with black color to create rust in desired areas.

Inverted occlusion in 3D Max by Xuan Prada

People asked me for a step-by-step installation and usage guide for the Binary Alchemy Color Ray Length shader in 3D Max.
Here we go.

Installation

  • Download BA Shaders for 3D Max.
  • Copy .dll files here -> “3ds Max 2010\mentalray\shaders_3rdparty\shaders”
  • Copy .mi files here -> “3ds Max 2010\mentalray\shaders_3rdparty\include”
  • Edit “3rdparty.mi” located here -> 3ds Max 2010\mentalray\shaders_3rdparty
  • Your “3rdparty.mi” must be something like this.

Usage

  • Create a matte/shadow shader and uncheck “receive shadows” and “use ambient occlusion”.
  • In the “camera mapped background” input, connect a “BA_color_raylength” shader.
  • Play with the “spread” to control the behaviour of the occlusion.
  • Once rendered you’ll have something similar to this.
  • Mix the “BA_color_raylength” with procedural maps or bitmaps to improve the result.

Edit: The most important parameters to play with are “spread” and “far output”.

Black holes with final gather contribution by Xuan Prada

Black holes are a key feature in 3D lighting and compositing, but black holes with bounced information are super!

  • Apply a Mental Ray Production Shader called “mip_rayswitch_advanced” to your black hole object.
  • In the “eye” channel, connect a “surface shader” with the “out_matte_opacity” parameter set to pure black.
  • In the Final Gather input, connect the original shader of your object. (a blinn shader for example).

Beauty channel.

Alpha channel

Inverse dirt maps by Xuan Prada

Sometimes it is very useful to generate inverse occlusion bakes to get an interesting starting point for painting our dirt maps.
The Vray dirt material is perfect for this goal, but if you don’t work with Vray, it is very easy to do the same with Mental Ray and the Binary Alchemy shaders.

  • You need to install the Binary Alchemy shaders. Some packages are free and you will have to pay for others.
  • Apply a “surface shader” to the object and connect a “BA_color_raylength” to it.
  • Put this shader in “Inverted Normal” mode and play with its parameters.
  • We get an inverted “ambient occlusion”.
  • Use a “blend colors”, “layered shader” or similar to combine this inverted occlusion with a nice bitmap.

Worn edges by Xuan Prada

This technique is based on “worn edges techniques” by Neil Blevins.

Requirements

  • 3D Max Scanline Render
  • Soulburn Scripts
  • Warp Texture Script
  • All the objects must have a correct UV mapping

Procedure

  • We must complete the UV mapping of the objects perfectly, without overlapping and similar common issues.
  • To reach better results, we need more geometry information, especially in the corners.
  • For that purpose, duplicate the objects, rename them and apply some bevels to the corners and one or two turbosmooths if necessary (but first try adding only bevels).
  • Note: All the object meshes must be “Editable Poly”.
  • Select the object and execute “Corner edge to vertex map” script.
  • We will have to play with the low and high angle parameters, especially decreasing the intensity of the low angle the more complex the object’s geometry is.
  • The next step is to distort this mask created with vertex color, to give it a more chaotic shape and, therefore, a more realistic look.
  • We need to download the “warp texture” plugin.
  • In a standard material, connect the warp texture to the diffuse channel.
  • In the target input connect a vertex color map. By default 3D Max uses the vertex information we generated previously with the corner edge to vertex script.
  • In the warp input connect a procedural noise, whose parameters will vary depending on scene scale and object size.
  • If we hit a render we get pretty decent results, but we need to define our mask better.
  • If we put an output in the vertex color channel, we can play with the curve to emphasize the results.
  • In the noise we can also play with its output.
  • To finish, we can bake this mask to paint it in more appropriate software.