arichter
#31
Following your example, I have created something that looks like global directions.
Not exactly what I wanted, but we are getting there.
What would be the final touch to get object-based tension directions?

By the way: which format does vertexColorsToTexture write? XnView is the only program that can open and convert the output (not Nuke, Adobe Bridge, Photoshop, etc.).

Attached Images
global_direction.PNG (449.43 KB)

arichter
#32
This looks better. With your explanation and the example I came to this result.
Thanks! I will have a look at how much of the data I need is in there.

Attached Images
global_direction.PNG (445.06 KB)

pshipkov
#33
Seems like the right data, but I would display the point normals to see whether they transform in "uv" space.

The vertexColorToTexture node can output jpg, png, tiff and iff formats, as an image sequence.
Pizzaman
#34

With Arnold, you just have to enable the "exportVertexColors" option on the objectShape -> Arnold tab. Once done, you can create an aiUserDataVector node, which can read the values per point and use them as a texture in your shading graph: no need to bake to textures. You just need to fill the "Color Attr Name" slot of the node with the name of the colorSet containing the values.

Works like a charm with progressive rendering...

... but I think the hardest part awaits you there. You'll need to convert your vector direction to an angle rasterized in UV space to drive the angle attribute of your shader. Can't help you there, so good luck.
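The angle conversion mentioned above can be sketched in plain Python, outside Maya. This assumes the stretch direction has already been projected into UV space as a 2D vector (du, dv), and that the shader's angle attribute expects degrees; both are assumptions, not something Arnold dictates.

```python
import math

# Convert a 2D direction in UV space into a polar angle measured from the
# +U axis. Hypothetical helper, assuming the shader wants degrees.
def uv_direction_to_angle(du, dv):
    angle = math.degrees(math.atan2(dv, du))  # quadrant-aware, -180..180
    return angle % 360.0                      # wrap to 0..360

uv_direction_to_angle(1.0, 0.0)  # along +U -> 0.0
uv_direction_to_angle(0.0, 1.0)  # along +V -> 90.0
```

If the shader expects a normalized 0..1 rotation instead, divide the result by 360.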

arichter
#35

Thanks for the answers!

Since I would like to process the direction vector further, it would be very helpful to have the (exact) mathematical definition of the vectors output by "...out variable...". Could you share this definition, or the procedure you use for the computation?

So far, I can only guess that the principal deformation axes are derived from the singular value decomposition of the matrix that maps the triangles between their static and deformed tangent-space positions. These directions (or the matrix itself), computed per triangle, are then averaged to obtain the per-vertex direction vector.
Is this true? Or do you use a different approach?
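For what it's worth, the guess above can be sketched in plain Python. This only illustrates the hypothesis (a 2x2 deformation gradient per triangle, whose largest singular value is the principal stretch), not SOuP's actual implementation; every name here is made up.

```python
import math

def principal_stretch(rest, deformed, eps=1e-12):
    """rest, deformed: two 2D triangle edge vectors each, as ((x, y), (x, y))."""
    (r1x, r1y), (r2x, r2y) = rest
    (d1x, d1y), (d2x, d2y) = deformed
    det = r1x * r2y - r2x * r1y
    # Deformation gradient F = D @ R^-1, with edge vectors as matrix columns
    f11 = (d1x * r2y - d2x * r1y) / det
    f12 = (-d1x * r2x + d2x * r1x) / det
    f21 = (d1y * r2y - d2y * r1y) / det
    f22 = (-d1y * r2x + d2y * r1x) / det
    # Right Cauchy-Green tensor C = F^T F; its largest eigenvalue is the
    # squared principal stretch (i.e. the squared largest singular value of F).
    c11, c22 = f11 * f11 + f21 * f21, f12 * f12 + f22 * f22
    c12 = f11 * f12 + f21 * f22
    lam = 0.5 * (c11 + c22) + math.hypot(0.5 * (c11 - c22), c12)
    # Eigenvector of C for lam: the rest-space direction of maximum stretch
    if abs(c12) > eps:
        vx, vy = lam - c22, c12
    else:
        vx, vy = (1.0, 0.0) if c11 >= c22 else (0.0, 1.0)
    # Push the direction into deformed space and normalize
    wx, wy = f11 * vx + f12 * vy, f21 * vx + f22 * vy
    n = math.hypot(wx, wy)
    return (wx / n, wy / n), math.sqrt(lam)

# Example: a triangle stretched 2x along the x axis
direction, stretch = principal_stretch(((1, 0), (0, 1)), ((2, 0), (0, 1)))
```

Averaging these per-triangle directions over the triangles around each vertex would then give the per-vertex vector described above.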

Best, Alex

pshipkov
#36
There are no triangles used in the computation.
Everything is derived from existing point attributes.

arrayExpression - calculates the overall squashAndStretch vector
pointsOnMeshInfo - converts the 3D deformation to UV space (as a vector per point)
point - calculates the angle between the squashAndStretch and UV offset vectors in 2D (UV) space
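Taken out of the node network, the last step is just the signed angle between two 2D vectors. A minimal plain-Python sketch (not the actual point node expression):

```python
import math

# Signed angle from vector a to vector b, both 2D tuples, in degrees 0..360.
def angle_between_2d(a, b):
    ang = math.atan2(b[1], b[0]) - math.atan2(a[1], a[0])
    return math.degrees(ang) % 360.0

angle_between_2d((1.0, 0.0), (0.0, 1.0))  # -> 90.0
angle_between_2d((0.0, 1.0), (1.0, 0.0))  # -> 270.0
```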

This scene can be simplified quite a bit, but for now we use it the way it is.

Let me know if you need an even more detailed explanation.
I have the feeling you are struggling with the nodal network and the overall concept.
Don't hesitate to ask more questions - we are here to help.
arichter
#37

Do you know an "easy" way to convert the 3D direction data into a 2D direction map in UV space?

If I used vertexToTexture it would create a 3D direction texture, but I need it in 2D so the image processing works with the tension map data.
Does SOuP maybe already have a function for that?

pshipkov
#38
This topic here is exactly about that.
The example scene you were looking at does exactly that.
rolfcoppter
#39
Hey guys, 

I just read through this entire thread and there is some awesome information in here! My question is: what exactly is this technique used for? I understand the process, but I can't figure out what problem the output fixes. Sorry for the noob question!

Thank You,

pshipkov
#40
One possible application is mentioned in the very first message of this thread.
Pizzaman
#41

One possible solution (thanks for the challenge, by the way!):

- Compute the "stretching" vector (length and direction)
- Displace the mesh along the stretch direction, with the points projected onto the tangent/binormal plane
- Use pointsOnMeshInfo to grab the UV values of the old mesh at the positions of the new one
- Calculate the difference between the new UVs and the old ones; that gives you the direction in UV space :)
- Create a mesh in world space from UV space (mapToMesh)
- Calculate the angle between the X direction and the mesh vertices
- Export the directions and amount of stretch on the vertices... after a little blur to smooth out the artifacts I can't explain.

Although the theory seems to work, the way Arnold deals with anisotropy makes it a nightmare to setup.
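Stripped of the Maya nodes, the UV-difference and angle steps above reduce to a simple per-point computation. A rough plain-Python sketch, assuming the old and new UVs have already been sampled into parallel lists (the final blur step is not modeled here):

```python
import math

# Per point: direction in UV space = new UV - old UV; the angle is taken
# against the X (U) axis, and the stretch amount is the vector length.
def uv_stretch_directions(old_uvs, new_uvs):
    result = []
    for (u0, v0), (u1, v1) in zip(old_uvs, new_uvs):
        du, dv = u1 - u0, v1 - v0
        angle = math.degrees(math.atan2(dv, du)) % 360.0  # vs the +U axis
        amount = math.hypot(du, dv)                       # stretch in UV units
        result.append((angle, amount))
    return result

uv_stretch_directions([(0.5, 0.5)], [(0.6, 0.5)])  # one point moved along +U
```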

 
Attached Files
testDirStretch.mb (136.20 KB)

rolfcoppter
#42
When I clicked the link to the first post of the thread it didn't seem to work for me, but I managed to get it working by removing part of the end of the URL. This is a lot clearer now. Thanks for that explanation, Pizzaman!