ljilekor
Hi

When I use the tensionBlendshape tool I can 'see' that compressing and stretching polys are colored green and red.

I'd like to connect those red and green color channels to a texture's alpha channel (or to anything else, as long as I can 'WORK FREELY' with the given data).

It sounds very easy, but it seems impossible in native Maya. (I've done similar things in other software without any problem.) It's a shame that Maya tends to overcomplicate things at a certain point. That's very unintuitive.


I think it should be possible with SOuP. But how?

http://soup-dev.websitetoolbox.com/post/Tension-Map-to-drive-displacement-5117916?highlight=vertex
sheffdog wrote:

I am using the tension map to drive vert colors on my model. I can drive bump maps with this setup but not displacement.


Driving the bump with the tension map? How do you do that?

Thanks in advance

ljilekor
Should I use the arrayToTexture2D node?

I tried to connect the tensionMap node's compression (stretch) output array to the arrayToTexture2D node...

It seems impossible to connect the tensionMap's output arrays to any array on the arrayToTexture2D. Or am I just missing the point?

I really can't see how... (F#*.. frustrating)

Where can I find some 'STEP-BY-STEP / STRAIGHTFORWARD / WYSIWYG' documentation/examples for the arrayToTexture2D node?
pshipkov
Take a look at the arrayToTexture node.
ljilekor
Thank you for helping me with my miserable plight...

I have been looking at the 'arrayToTexture' (I guess it's the arrayToTexture2D node) for about 4 hours now and am starting to hit the boiling point...
I guess I've tried almost every connection between my 'tensionMap' and the 'arrayToTexture2D' (whether they make sense or not).

What must be connected to what?
Consider a tensionMap and an arrayToTexture2D; there's also the geometry shape (with a UV set for the arrayToTexture2D).

btw:
Our animation happens in Maya; rendering happens in Modo.
I need to drive Modo displacement/bump/normal maps with auto-generated alphas. Those alphas should be generated by tensionMaps in Maya.
The animators in Maya need a visual reference of the displacement/bump, so we use simplified normal maps as proxies.
The pipe is simple. Generating the alphas from tensionMaps is a real HELL for me. I don't see the logic (compared to how straightforward this operation is in other software)...

Thanks again for helping me out

pshipkov
Actually, did you try the vertexColorsToTexture tool under the "wrench" shelf button?
pshipkov
OK, sorry for the cryptic messages earlier. I didn't have time to focus on your problem.
You have to do this, I think:
yourMesh.outMesh -> point.inGeometry
Turn on the processing of colors, weights and maps (UVs) inside the point node. In the weights expression, type: $W=$CR.
point.outGeometry -> displayComponents.inGeometry
point.outWeightPP -> displayComponents.inData
Turn off uniform colors inside the displayComponents node. From that point on you should see in the viewport whether you are getting the right data.
point.outWeightPP -> arrayToTexture2D.inWeightPP
point.outMapPP -> arrayToTexture2D.inMapPP
Connect the arrayToTexture2D to the diffuse channel of a shader node and apply the shader to the geometry. You should be able to render this with Maya's software renderer.
I am working on the MR (mental ray) counterpart.
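
In case it helps, here is a rough Python (maya.cmds) sketch of those connections. It is only a sketch: it assumes the SOuP plug-in is loaded and that the node types are registered under the names used above (point, displayComponents, arrayToTexture2D), and it uses a placeholder mesh shape name. The color/weight/map toggles and the $W=$CR weights expression still have to be set inside the point node as described.

import maya.cmds as cmds

mesh = 'myMeshShape'  # placeholder - use your own mesh shape name

# Assumes the SOuP plug-in registers these node types.
point = cmds.createNode('point')
display = cmds.createNode('displayComponents')
a2t = cmds.createNode('arrayToTexture2D')

# mesh -> point
cmds.connectAttr(mesh + '.outMesh', point + '.inGeometry')

# point -> displayComponents, to check the per-point weights in the viewport
cmds.connectAttr(point + '.outGeometry', display + '.inGeometry')
cmds.connectAttr(point + '.outWeightPP', display + '.inData')

# point -> arrayToTexture2D (weights plus UVs)
cmds.connectAttr(point + '.outWeightPP', a2t + '.inWeightPP')
cmds.connectAttr(point + '.outMapPP', a2t + '.inMapPP')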

I am not familiar with the renderer you use, but usually the third-party render plugins for Maya provide only basic support for the default texture nodes, which means that you will most likely not be able to render your stuff anyway. That's why I suggest you bake the vertex colors to texture maps and use those to control your displacement shaders.

pshipkov
I am going to tweak the rgbaToColorAndAlpha, or maybe create a new node from its code base that will split the RGBA stream into independent R, G, B and A channels. That way we can easily use any one of them.
ljilekor
Thank you very much! Nice to see SOuP has top-notch support!

As you can see, the displayComponents works correctly. There's a problem with the arrayToTexture2D though... The result looks funky but is far from what we need...


The UVs look like this:



When I bake the texture using the convertSolidTx command:
(Note that I must bake the texture in order to export the sequences to Modo.)



So... Guess I'm kinda stuck... again... (I made some progress though! ;) )

Applying vertexColorsToTexture (in the SOuP wrench menu) produces a usable texture. But because the animators need interactive feedback (alphas applied to proxy normal maps), this solution won't do for us.

For your information:
Modo is not a Maya render plug-in but a standalone app.
In my opinion it is 10x better, 100x faster and 10,000x more artist-friendly than any renderer in Maya. We use Maya for animation and VFX.
You can check some of our work here

Thx

pshipkov
Oh, in this case you only have to export an image sequence containing the vertex colors.
You never clarified that you had tried to export the vertex colors to a texture using the SOuP tool. I think that should work.
Tonight I will take a look at the arrayToTexture2D node.

ljilekor
Great!
A properly working arrayToTexture2D node would make things perfect.

I was not planning to export the maps with vertexColorsToTexture though. I just wanted to mention that the tool worked fine in this particular case (where we need vertex colors).
To cover all types of dynamic alphas (not necessarily driven by vertex maps) I need a uniform procedure to export them. The convertSolidTx command would work fine in all cases; it provides a more general/streamlined way to export the dynamic alphas.
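
For reference, a minimal Python (maya.cmds) sketch of that kind of per-frame bake, using hypothetical names for the texture node ('tensionAlpha') and the mesh shape ('myMeshShape') and an example frame range/resolution:

import maya.cmds as cmds

texture = 'tensionAlpha.outColor'  # hypothetical texture attribute to bake
mesh = 'myMeshShape'               # hypothetical mesh shape

start, end = 1, 100                # example frame range
for frame in range(start, end + 1):
    cmds.currentTime(frame)
    # Bake the current state of the texture into a numbered image file.
    cmds.convertSolidTx(texture, mesh,
                        resolutionX=1024, resolutionY=1024,
                        antiAlias=True, force=True,
                        fileFormat='tga',
                        fileImageName='tensionAlpha.%04d.tga' % frame)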

Animators need a real-time visual reference of the displacement maps that will be generated at render time, so a rock-solid arrayToTexture2D node is the key. (The same dynamic alphas are used in Maya to drive proxy normal maps.)

Thanks again for looking into this! It's very much appreciated!
ljilekor
Hi, not that I want to push or anything...
Do you have an idea of when we could expect the arrayToTexture2D update?

I'd like to know in order to decide whether we can add this particular functionality to our pipeline for current and future productions (series, features, commercials, ...).

pshipkov
Actually, the node works just fine. The only trick right now is to split the RGB or RGBA stream into single channels and pipe only one of them into the arrayToTexture2D.

http://www.petershipkov.com/temp/textureToArray2D.ma.zip

ljilekor
Any progress on the rgbaPP => rPP, gPP, bPP, aPP node?
pshipkov
It is done already. Will be in the upcoming update.
DennisJ
Any progress on the upcoming update?

I'm really looking forward to testing it out.