    SOuP 2017-03-21
    pshipkov wrote: Run gcc -v and see if you are not on 4.8.2 (the highest supported one for CentOS).
    If that's the case, you can try upgrading your development environment.

    It sounds like this is a studio environment and you probably don't have admin rights to do it, but running the commands below may get you in a better place:

    yum install devtoolset-2
    scl enable devtoolset-2 bash

    The second command is per console. You can put it in your .bashrc - but that's per user.

    If the solution works, then the system guys can push this to the team and make sure the devtoolset-2 environment is available right off the bat for everybody.


    Hey Peter, I tried this, though with devtoolset-3 (the latest one we have access to) since we're on CentOS 7, and it still doesn't work.
    I don't understand why this has to happen in the first place. We'd much rather not have to install over 75 extra packages and run extra setup on an environment we already manage heavily our own way, just to get this plugin working.

    Is there a way you can just compile a clean version that doesn't need any of this hackery, using the default gcc 4.8.5 (or 4.8.3), which is the VFX Platform default for Linux?

    Please let me know why this devtoolset stuff is even necessary at all.

    thanks

    -johnc
    (redpawfx)


    Bounding object questions
    I have two questions. First, is it possible to set up a bounding object (a sphere, for example) to be exclusive instead of inclusive? Basically, return the vertices/weights in an inverted way, so that it affects everything outside of the bounding sphere.
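
    To illustrate what I mean, something like this (made-up pseudo-Python, not SOuP's actual falloff):

    Code:

    import math

    def exclusive_weight(point, center, radius):
        d = math.sqrt(sum((p - c) ** 2 for p, c in zip(point, center)))
        inclusive = max(0.0, 1.0 - d / radius)   # what the bounding sphere gives now
        return 1.0 - inclusive                   # what I'm after: full weight outside the sphere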


    Second, I have a deformer acting on two separate objects, and I want to use one bounding object
    to control the area of effect on both of them. This doesn't seem to work; it only affects one of them (the first?). Is there a proper way to make this work, or do I need a separate multiAttr node for each piece of geometry?


    any  help would be awesome.  thanks!

    -johnc

    NParticle force update in API
    I think I may have just run across a bug, either in my code or in the API, regarding nParticles.

    I'm working on updating the code for partio4Maya, and something seems to have changed in 2015 or 2016. For the life of me, I now cannot force an update to nParticles any other way except stepping forward frame by frame via MEL and exporting each frame. Slightly complicating things, this only seems to be a problem with Viewport 2.0, or if the particles already have an nCache.

    I end up getting all the particles, and the counts are correct, but they all end up at the origin.
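
    The per-frame workaround I mentioned looks roughly like this (Python rather than MEL, same idea; the export call is a placeholder for my plugin's command):

    Code:

    import maya.cmds as cmds

    start, end = 1, 100
    for f in range(start, end + 1):
        cmds.currentTime(f, edit=True)      # stepping the frame is the only thing that forces an update
        cmds.myPartioExportCmd(frame=f)     # placeholder for whatever writes out that frame's cache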

    To be clear (this is writing out a cache via the API using my partio plugin):

    live playback + VP 1.0 = success!
    nCache + VP 1.0 = failure (particles at the origin)
    live playback + VP 2.0 = failure (particles at the origin)
    nCache + VP 2.0 = failure (particles at the origin)

    Legacy particles  are not affected by this problem at all.

    Previously I was using the MFnParticleSystem function "evaluateDynamics", and it was working
    with both particles and nParticles. Now, for whatever reason, I can't use it anymore, because in cases where nParticles use a lifespan of any kind, it simply forgets to kill the particles whose age exceeds their lifespan. If I don't use that evaluateDynamics call, it now works correctly in Viewport 1.0, but in Viewport 2.0 it fails to dirty something and all the particle positions end up at the origin...
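
    For reference, this is roughly the read path in Python API 1.0 terms (the real exporter is C++; "nParticleShape1" is just a placeholder):

    Code:

    import maya.OpenMaya as om
    import maya.OpenMayaFX as omfx

    sel = om.MSelectionList()
    sel.add("nParticleShape1")
    dag = om.MDagPath()
    sel.getDagPath(0, dag)

    fnParticle = omfx.MFnParticleSystem(dag)

    # the call I used to rely on; with nParticles that have a lifespan it now
    # forgets to kill expired particles, so it's commented out here
    # fnParticle.evaluateDynamics(om.MTime(frame, om.MTime.uiUnit()), False)

    positions = om.MVectorArray()
    fnParticle.position(positions)   # with VP 2.0 or an nCache these all come back (0,0,0)
    print("%d particles" % positions.length())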


    Not sure where to point my finger at this point. It seems like a bug to me, but does anyone else know of other functions or ways to force an update to Nucleus? I need to look at the nCache command to see whether it does a frame step in Python/MEL or not; that's my only other idea.

    any ideas  would be welcome on this  

    thanks
    -johnc
    write a particle cache node
    Hi, so the closest thing in the partio tools to what you are looking to do is the partioEmitter,

    so you'll want to have a look at that one. It was not written to be a caching system exclusively,
    but specifically as an emitter, so that you could have the option to take an existing sim and "re-sim" it
    to add more data or do something else with it after the cache ran out, for example.

    There's some stuff in there for remapping cache attributes to new particle system attributes;
    it gets a bit confused with lifespan issues if you try to kill the particles after the emitter runs...

    I would not go this way if you're just trying to make a "reader only" type of node. I would probably go with a more simplistic approach like the one you and Peter have been discussing.
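
    For example, something as bare-bones as this would do it; just a rough sketch (made-up node and attribute names, the actual file reading left as a stub, plugin registration omitted):

    Code:

    import maya.OpenMaya as om
    import maya.OpenMayaMPx as ompx

    class simpleCacheReader(ompx.MPxNode):
        kNodeName = "simpleCacheReader"
        kNodeId = om.MTypeId(0x00000001)   # placeholder id
        aTime = om.MObject()
        aCachePath = om.MObject()
        aOutPositions = om.MObject()

        def compute(self, plug, data):
            if plug == simpleCacheReader.aOutPositions:
                frame = data.inputValue(simpleCacheReader.aTime).asTime().value()
                path = data.inputValue(simpleCacheReader.aCachePath).asString()

                positions = om.MVectorArray()
                # ...open the cache file for this frame/path and fill "positions" here...

                outHandle = data.outputValue(simpleCacheReader.aOutPositions)
                outHandle.setMObject(om.MFnVectorArrayData().create(positions))
                data.setClean(plug)
            else:
                return om.kUnknownParameter

    def nodeCreator():
        return ompx.asMPxPtr(simpleCacheReader())

    def nodeInitializer():
        uAttr = om.MFnUnitAttribute()
        tAttr = om.MFnTypedAttribute()
        simpleCacheReader.aTime = uAttr.create("inTime", "it", om.MFnUnitAttribute.kTime, 0.0)
        simpleCacheReader.aCachePath = tAttr.create("cachePath", "cp", om.MFnData.kString)
        simpleCacheReader.aOutPositions = tAttr.create("outPositions", "op", om.MFnData.kVectorArray)
        tAttr.setWritable(False)
        for attr in (simpleCacheReader.aTime, simpleCacheReader.aCachePath, simpleCacheReader.aOutPositions):
            simpleCacheReader.addAttribute(attr)
        simpleCacheReader.attributeAffects(simpleCacheReader.aTime, simpleCacheReader.aOutPositions)
        simpleCacheReader.attributeAffects(simpleCacheReader.aCachePath, simpleCacheReader.aOutPositions)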

    hope this helps a little..

    -redpawfx
    OpenVDB for Maya and Arnold
    Thanks Besha,  I look forward to it, if possible.

    -johnc.

    OpenVDB for Maya and Arnold
    Hi Besha!  nice to meet you.

    I'm mainly interested in knowing which version of MTOA the nodes and translators are being compiled against, and in being able to keep the versions matched if we were to use your version of the plugin.

    Where I work, we compile our own development version of MTOA and exclusively use that in production, so we'd need to always make sure your plugin was compatible with our current build, based on whichever version of Arnold we're currently using.

    Since you're not charging for the software, I was wondering if you would be able to put the source up somewhere like GitHub so people can compile and contribute their own versions where needed.
    At the very least, we'd need your releases to list the version of Arnold and the version of MTOA they were compiled with, so that we can match them up and see whether they will work alongside our custom build.

    Please let me know, as I'd like to give your shader a try.

    thanks

    -johnc.
    redpawfx
    OpenVDB for Maya and Arnold
    Hey Ruslan, and other folks. 

    I'm aware of the original DreamWorks OpenVDB stuff. That only contains the actual DreamWorks OpenVDB library code, their Houdini plugin nodes, and the original version of the Maya OpenVDB plugin. What I'm specifically asking about is the version of the MTOA translation nodes, shader, and procedural that work together with Arnold. They seem to be custom written by the author, and to my knowledge those are not available anywhere as source.

    There are simple versions of the MTOA volume translation within the source for MTOA, but this package seems to include a very nice custom shader and "visualization" node. I was planning on writing something like this myself, but I only saw this package this week, so I'm asking first whether the source is available, because I don't want to bother if it's already out there somewhere, or if there's a possibility it could be released. MTOA specifically is what I'm worried about,
    since its versioning against Arnold releases is fairly rapid, and when new versions are released this will stop working unless the author keeps compiling new versions all the time.

    If anyone can ask the author, or knows how to get in contact with him, please let me know.

    Thanks.  

    -johnc
    redpawfx
     
    OpenVDB for Maya and Arnold
    Is the source available for this anywhere? Arnold changes rapidly, and as soon as a new version comes out, this download will potentially no longer work with it.

    We'd like to be able to compile it ourselves whenever we switch to a new version of Arnold and MTOA.


    -johnc

    Epic Sequence – Calling All FX Talent! (Luma Pictures)
    Luma Pictures is developing some truly original FX.  Join us in creating something really special.

    Job Description:
     
    Responsibilities
     
    - Light and render FX passes for specific shots using developed FX rigs
    - Develop rigs/techniques to implement and optimize approved looks
    - Produce look tests based on provided reference materials
    - Create tools and scripts to facilitate workflow
     
    Qualifications
     
    Artists should be familiar with more than one of the following applications:
    - Maya  + SOuP  :-)
    - Houdini
    - Realflow
    - Krakatoa
    - Fume
    - Python scripting experience is a plus
    - Understanding of multi-pass 3D compositing is a plus
    - Artists must be able to take direction and work in a team-focused environment; good communication skills are a must
    - Strong time management skills and extremely organized
    - Solid work ethic and a positive attitude in the face of challenging situations
     
    Positions available in both our Santa Monica and Melbourne studios.
     
    Apply Now!  http://www.lumapictures.com/jobs/

    (please don't respond to this post directly)
    Context tool undo
    Is it possible? I'm OK with building my own undo stack and such, but how do I hook it into
    ctrl-z? I've tried a couple of things so far, but no luck.

    I've got a brush-like tool that is designed as a tool context. Once Maya enters the context, ctrl-z
    just pops it back out of the context. Is there a way to capture, or add to, the undo stack from within a tool context, like the Artisan tools do, for example?

    I'd like to be able to undo context operations one by one, one for each press->release of the mouse,
    like a painting operation.
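
    From the devkit examples, it looks like the Artisan-style tools pair the context with an MPxToolCommand and push one command per stroke, so this is the skeleton I've been poking at (untested Python API 1.0 sketch, names made up):

    Code:

    import maya.OpenMaya as om
    import maya.OpenMayaMPx as ompx

    class brushStrokeCmd(ompx.MPxToolCommand):
        kCmdName = "brushStrokeCmd"      # made-up name

        def __init__(self):
            ompx.MPxToolCommand.__init__(self)
            self.setCommandString(brushStrokeCmd.kCmdName)
            self.strokeData = None       # whatever one press->release produced

        def doIt(self, args):
            return self.redoIt()

        def redoIt(self):
            pass                         # re-apply the stroke here

        def undoIt(self):
            pass                         # revert the stroke here

        def isUndoable(self):
            return True                  # this is what lets ctrl-z step back one stroke

        def finalize(self):
            # called once on mouse release; journals this command as a single
            # entry on Maya's undo queue
            command = om.MArgList()
            command.addArg(self.commandString())
            return self._doFinalize(command)

    # The context would create one of these per stroke, stash the edits into it
    # during doDrag, and call its finalize() in doRelease.  Registration goes
    # through mplugin.registerContextCommand(ctxCmdName, ctxCmdCreator,
    #                                        brushStrokeCmd.kCmdName, cmdCreator).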

    any tips would be greatly  appreciated.

    thx
    -johnc

    updating python plugin from 2014->2015
    Figured this out with the help of a co-worker. Another Python plugin was patching some functions of the API weirdly and blocking it from working correctly.
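
    For anyone who hits something similar: a quick way to sanity-check whether an API class has been monkey-patched is to look at where its methods actually live (just a debugging sketch):

    Code:

    import inspect
    import maya.OpenMaya as om

    # an unpatched method resolves back to maya.OpenMaya; a patched one will
    # point at whichever module replaced it
    fn = om.MFnIntArrayData.__init__
    print("%r -> %r" % (fn, inspect.getmodule(fn)))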

    thanks

    -johnc

    updating python plugin from 2014->2015
    Hi, I'm trying to debug what's probably a simple issue, but I'm not very good with the Python API;
    I usually stick to C++.

    This code previously worked in Maya 2014 and broke in 2015.
    It's from the (now) open source Nimble tools unInstancer tool that I took over from the original author:
    https://github.com/redpawfx/uninstancer/blob/master/python/ns/maya/ParticleUtil.py

    I have a couple of lines of code as follows:

    Code:

    def fromParticle( self, oParticle, deepCopy=False ):
        fParticle = MFnParticleSystem( oParticle )
        pIdMapping = nsm.DG.getPlug( node=oParticle, attrName="idMapping" )
        pSortedId = pIdMapping.child( fParticle.attribute( "sortedId" ) )
        pIdIndex = pIdMapping.child( fParticle.attribute( "idIndex" ) )
        pParticleId = nsm.DG.getPlug( node=oParticle, attrName="particleId" )
        fSortedId = MFnIntArrayData( pSortedId.asMObject() )
        fIdIndex = MFnIntArrayData( pIdIndex.asMObject() )


    and I get an error on the second-to-last line, as follows:

    RuntimeError:  kInvalidParameter:  object is incompatible with this method

    I've put in some debug prints to confirm that pSortedId does indeed hold an MIntArray, and so on.
    I'm assuming this is some slight change in the API, or in Python.

    any suggestions?  I'd appreciate it :-)

    thanks in advance

    johnc


    particleGoals

    Try not using the goalWeight0PP stuff and just use goalPP.

    I've come across something similar where using both seems to break things; it might be the same sort of error going on.
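
    Roughly what I mean, assuming a regular goal setup (node names are placeholders):

    Code:

    import maya.cmds as cmds

    # goal with a constant overall weight
    cmds.goal('particleShape1', g='pSphere1', w=1.0)

    # give the shape per-particle goal weight attributes...
    cmds.addAttr('particleShape1', ln='goalPP0', dt='doubleArray')
    cmds.addAttr('particleShape1', ln='goalPP', dt='doubleArray')

    # ...and drive goalPP only, leaving goalWeight0PP alone
    cmds.dynExpression('particleShape1', creation=True,
                       string='goalPP = rand(0.2, 0.8);')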

    -johnc

    Partio4Maya
    Hi Ben, thanks! Glad you like the tools so far. I've got a lot more up my sleeve yet with partio, so stay tuned!

    As for the BGEO stuff, there have been a few spurts of development on that front, from
    the main Disney branch and from another outside branch, if I remember correctly; both stopped short of finishing for some reason.

    I've been using BGEO lately and have just had to put a rename step of
    .bhclassic -> .bgeo into my workflow after each cache write from Houdini. It's definitely a bit of a pain.
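
    That rename step is trivial to script, though; something along these lines after each cache write (paths are placeholders):

    Code:

    import glob, os

    for f in glob.glob('/path/to/cache/*.bhclassic'):
        os.rename(f, os.path.splitext(f)[0] + '.bgeo')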

    What I might do in the immediate future is support the file extensions .bhclassic and .hclassic, pointing them to the right format readers for now, and then try to gather all the info I can on the new BGEO format issue and find out if there's something that can be done to finish it.

    If you have any other suggestions  or   feature requests, feel free to post them  here,  but preferably  on   github as an "issue"  here:

    https://github.com/redpawfx/partio/issues?state=open

    this is the central place I'm keeping track of things.

    Thanks again for using partio4Maya and Partio!

    If there are any projects you guys know of where partio was used, let me know!

    -johnc
    redpawfx

    Point Cloud import using SOUP
    Hi guys, just catching up on this thread.  

    A couple of things right off the bat: I don't think the partioEmitter currently has the right outputs to work with SOuP, although it's something I would really like to implement very soon, Peter. So could you hit me up with the details of what your point cloud nodes are expecting, API-wise? :-)

    My suggestion was to use a program like MeshLab to mesh the points; the main problem is converting your point formats into one that MeshLab supports.

    One thing to keep in mind with PTS files is that they are extremely slow, because they are all ASCII.

    If you were to pre-convert your PTS files to another format using the partio command line tool partconv, you'd be able to load far more points into Maya much faster.

    The command line to do this is: partconv infile.pts outfile.<your format here>

    And yes, the other trick with lidar points in the main toolkit for Maya is that it really prefers
    filenames with padded numbers...

    ex. filename.0001.pts, not filename.pts

    These are some limitations I'm hoping to solve in the near future.
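
    In the meantime, you can handle both in one quick batch pass: pre-convert the PTS files and give them padded numbering at the same time (sketch; paths and target format are up to you):

    Code:

    import glob, subprocess

    for i, src in enumerate(sorted(glob.glob('/scans/*.pts')), start=1):
        dst = '/scans/converted/points.%04d.bgeo' % i
        subprocess.call(['partconv', src, dst])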

    There are tons of ways to make geometry from lidar points. In my opinion, Maya is not the place to start, but the place to finish. If you use a free external program like MeshLab or CloudCompare to generate a fairly low-to-medium-res mesh you can snap to, then bring in the partioVisualizer as reference (or partioEmitter points if you really want to snap to them), that should get you where you want to go. No matter what you do, snapping to a massive point cloud is going to be a bit sluggish.

    I have a couple of ideas for better tools for this workflow, as I do it a lot in my job as well.
    Hopefully I'll have some time soon to hack together something faster! :-)

    hope this clears things up. 

    johnc
    (redpawfx)

