Hi,
I'm working on an animation that includes an 8,300-frame Houdini-generated fluid simulation.
Until recently I've exported the meshes from Houdini to Alembic, then referenced them in Maya as V-Ray Proxy objects.
This works fine for network rendering, but because the three main Alembic files are quite large (between 0.5 and 1.5 TB), it doesn't seem to work for distributed rendering. The render nodes don't return their buckets (they either do nothing or return black; I can't remember which, and can't check right now as the workstation is unavailable).
Is this because the nodes are trying to transfer the large Alembic assets locally but can't due to their size? They don't have enough local storage to copy over a terabyte of data.
I've tried turning off 'Transfer missing assets', hoping the assets would be read from the file server, as all are referenced in the Maya scene file via UNC paths using the file server's IP, but this doesn't work.
Hopefully someone can advise.
To try to get around the issue, I've looked at referencing as .vrmesh instead, since .vrmesh supports file sequences.
I tried converting the Houdini .bgeo.sc compressed cache files to .vrmesh but received errors; I thought perhaps the .bgeo.sc files, being already compressed, couldn't be re-compressed into the .vrmesh format.
However, I received errors again when trying to convert uncompressed .bgeo files.
The errors were 'missing magic number' and another, shorter error I've forgotten.
It turns out the more recent Houdini .bgeo cache format is not supported by ply2vrmesh.
The format has been around for a while; is support planned at some point?
So, with a bit of fiddling in Houdini, I exported to the old .bhclassic format, renamed the files to .bgeo, then converted them to .vrmesh.
It's frustrating having to export to an old file format, especially an uncompressed one. Per-frame file sizes:
.bgeo.sc 274MB
.abc 886MB
.bhclassic 959MB
.vrmesh 340MB (converted from .abc)
.vrmesh 337MB (converted from .bhclassic renamed to .bgeo)
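For anyone hitting the same wall, the rename-and-convert step can be scripted rather than done file by file. A minimal Python sketch; the directory names and cache names are placeholders, and it assumes ply2vrmesh is on the PATH:

```python
import os
import shutil
import subprocess

def bhclassic_to_bgeo(src_dir, dst_dir):
    """Copy every .bhclassic cache into dst_dir under a .bgeo name,
    so ply2vrmesh will accept it. Returns the new file names."""
    os.makedirs(dst_dir, exist_ok=True)
    renamed = []
    for name in sorted(os.listdir(src_dir)):
        if name.endswith(".bhclassic"):
            new_name = name[: -len(".bhclassic")] + ".bgeo"
            shutil.copy2(os.path.join(src_dir, name),
                         os.path.join(dst_dir, new_name))
            renamed.append(new_name)
    return renamed

def convert_to_vrmesh(bgeo_path):
    """Convert one renamed cache with ply2vrmesh (must be on the PATH)."""
    out_path = bgeo_path[: -len(".bgeo")] + ".vrmesh"
    subprocess.check_call(["ply2vrmesh", bgeo_path, out_path])
    return out_path
```

Copying (rather than renaming in place) keeps the original .bhclassic caches intact in case the conversion needs re-running.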
To get around this, I've tried exporting an Alembic from the .bgeo.sc cache in Houdini, then converting the Alembic to .vrmesh. This works, but seems a bit crazy.
The final conversion takes a very long time because ply2vrmesh is single-threaded. Is it planned, or even possible, for it to be multithreaded? Or is the only option to run multiple instances of the process against different sections of the cache?
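In case it helps others, running multiple instances over different frame ranges can itself be scripted. A rough sketch, assuming one ply2vrmesh call per frame and a hypothetical `cache.0001.bgeo` → `cache.0001.vrmesh` naming pattern (adjust to the real file names):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def build_commands(first, last, in_pattern, out_pattern):
    """One ply2vrmesh command per frame; patterns use {frame:04d}."""
    return [
        ["ply2vrmesh",
         in_pattern.format(frame=f),
         out_pattern.format(frame=f)]
        for f in range(first, last + 1)
    ]

def run_parallel(commands, workers=8):
    """Threads are enough here: each worker just waits on a subprocess,
    so the parallelism comes from the ply2vrmesh processes themselves."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(subprocess.check_call, commands))
```

For example: `run_parallel(build_commands(1, 8300, "cache.{frame:04d}.bgeo", "cache.{frame:04d}.vrmesh"))`, with `workers` set to however many cores the conversion machine can spare.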
On bringing the .vrmesh files into Maya, I've encountered a couple of issues...
The V-Ray Proxy object's Animation Preferences 'Start offset' option doesn't seem to work with .vrmesh sequences. I can work around it by renumbering the sequence to match the Maya frame numbers, but am I missing something, or does 'Start offset' just not work with .vrmesh sequences?
Also, the .vrmesh sequences containing the fluid meshes reference well, with the viewport updating as expected. Another .vrmesh sequence containing a particle simulation (a Houdini-generated whitewater simulation exported to Alembic, then converted to .vrmesh) doesn't: on test renders the sequence updates, just not in the viewport, unless I switch 'Geometry to load' from GPU Mesh to Maya Mesh (or vice versa), after which it updates in the viewport.
Any advice would be much appreciated!