
All Activity


  1. Today
  2. Yesterday
  3. The UV Layout node lines up the books. Just disable the axis alignment and pack into UDIM tiles. book_shelf.hip books_shelf_3d_od.hip
  4. If you're still looking for a solution, I can post something; click the Like reaction on this message. It's actually quite simple, but you have to be proficient with the For-Loop with Feedback SOP. None of the distribute/align tools that ship with Houdini compute the collisions on the full geometry, AFAIK; you have to do it yourself with rays. There are a few different cases to test, though, depending on how complex you want your book stack to be.
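     Not the poster's exact setup, just a minimal sketch of the ray idea for one of the simple cases: with the stack built so far (the feedback geometry) wired into the second input, a Point Wrangle running on the new book's base points could cast rays straight down and measure how far the book may drop before it touches the stack. The attribute name "drop" and the ray length are assumptions.

        // Point Wrangle on the new book's base points; stack-so-far in input 2.
        // The ray length (10 units here) just needs to exceed any possible drop.
        vector hitpos, hituv;
        int hitprim = intersect(1, @P, {0, -10, 0}, hitpos, hituv);
        f@drop = (hitprim >= 0) ? distance(@P, hitpos) : 1e6;
        // afterwards, promote f@drop to detail with "minimum" and translate
        // the book down by that amount before merging it into the feedback geo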
  5. Hello everyone. I have three spheres with different animation timelines. I want to read per-piece attributes from a detail attribute based on each wedge, to feed them into the ROP Cache node (or something else), but I don't know how. I've attached a simple setup of my problem. Thanks in advance for any help. TOP Cache - Base on Per Piece Frame Range.hip
  6. Last week
  7. The interesting thing in the video is indeed the fact that adding ramp points adds geometry. I found a quick way of doing this; maybe a bit dirty, but it looks like it's working! chrampedgeloop.hipnc
  8. Standard guided ocean setup. I have a big cargo ship going through some waves, and everything works fine. As soon as I lower the particle separation on the simulation, I get artifacts on the extended area of the ocean, visible both in the viewport and in the render. I've tried a lot of different stuff, like a bigger flattening distance, but nothing seems to help. Does anyone have a solution?
  9. I have a geometry sequence that has substeps (saved as x number of substeps per frame). Would you have a standalone LOPs network graph, or even a standalone hip file, just for that animated geometry with "playback at fractional frames" enabled? Or would you just mix it with other assets and simulation caches that are not saved with substeps? And if you mix them, do you then enable playback at fractional frames? And if you do enable fractional frames, would that impact the other non-substep elements during rendering?
  10. I found some funny stuff, more inspired by Floraform and the Blender add-on for differential growth. It's mesh-based and can create a variety of shapes, depending on the "weight" attribute used, and the direction. This method is not really the same as the one described in the paper I sent above. In the second hip file, I try to replicate it 1:1. If you don't want to read it all: the main idea is using what they call an "intermediate" representation of the mesh to calculate the forces (e.g., a point cloud). I'm still trying to figure out the different variations and methods used. I would love to hear any ideas, advice or anything! TL_growth_mesh.hipnc unified_approach_grown_structures.hipnc test_edgedist.mp4 test_orientedvolume.mp4 testcurvature.mp4
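     Not the hip file's exact method, just a minimal sketch of the point-cloud "intermediate representation" idea (forces computed on a cloud of points rather than on mesh topology), as a Point Wrangle inside a Solver SOP. The "radius" and "strength" channels are assumed controls; the per-point "weight" attribute is taken from the description above.

        // Point Wrangle in a Solver SOP: push each point away from its
        // point-cloud neighbours, scaled by the per-point weight attribute.
        float radius   = chf("radius");
        float strength = chf("strength");
        vector push = 0;
        int pc = pcopen(0, "P", @P, radius, 16);
        while (pciterate(pc)) {
            int    nearpt;
            vector nearpos;
            pcimport(pc, "point.number", nearpt);
            pcimport(pc, "P", nearpos);
            if (nearpt == @ptnum) continue;
            vector d   = @P - nearpos;
            float  len = length(d);
            if (len > 1e-6)
                push += normalize(d) * (radius - len) / radius;
        }
        @P += push * strength * f@weight;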
  11. This is amazing, Antoine, thank you. Would you be open to creating an asset that could automate this for me and handle different situations? I'll shoot you a private message.
  12. Totally possible using this neat trick. Basically, PolySlice -> clean up -> UVLayout on some input grids -> export as OBJ or DXF. I attached the resulting OBJ and DXF files, which are a collection of curves laid out on 0.61 m x 1.2 m rectangles (like a standard plywood sheet). The tricky part is to prepare the result of Labs PolySlice for UVLayout consumption; other than that it's straightforward. UVLayout is not a world-class bin packer, but it should do the job. You can always repack the curves afterwards once you've left Houdini and you're in your CNC software. cutList.dxf cutList.obj Also, OBJ supports groups. Whatever groups are in the SOP you export, you will have them in the OBJ as groups, or as layers in the DXF (I know less about that format though). That should be more than enough if you need to split them further in a shell (a script in a terminal), or maybe the CNC software can deal with that.
  13. Hi, I have a pretty simple setup: a File node into a PolySlice. I am slicing the file into N curves (200 in this case, but it could be many more). I need each curve to be saved out as a single-curve DXF file (which I will use in a laser cutter to cut the shape). If I try to just save the geo as-is, it exports all the curves from the camera viewport; I need a single curve on a specific axis (top down). I can't figure out a way to do this; clearly I'm no expert, as there must be an easy solution. Thanks for the help!
  14. Earlier
  15. Manually I can make a 4x4 flipbook of the viewport in the flipbook settings, but I can't see any option in the OpenGL ROP (in a ropnet) to make a flipbook of the viewport; I only see a camera parameter, and it picks just one. Any idea how to make a flipbook with a 4x4 grid in an automated way? (I trigger the OpenGL ROP with a TOPs ROP Fetch.)
  16. Nothing I am aware of, and it doesn't really make sense. What you can do is use the Edit Material Network LOP if you see the material in Solaris, but I recommend editing the material in one place only.
  17. Thank you so much, but this method doesn't solve my problem.
  18. First up, apologies for digging into a very old post here. I've recently been looking for something similar, so it's great to find this, thanks @Atom! This somewhat works in the viewport comment by displaying the current time, but for me it does not update per frame when flipbooking or dragging through the timeline. @nikosgr00, your solver method is working well and does update per frame when dragging the timeline, as well as when flipbooking. Between these two versions, I'd love to know whether, instead of showing the current time of day, it could show elapsed time starting from zero seconds. Any ideas on the math behind that?
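     A minimal sketch of that math, assuming a Detail Wrangle (Run Over: Detail) feeding whatever text display you already have; the "fps" and "start_frame" channels and the "elapsed_text" attribute name are assumptions, not part of either setup above.

        // elapsed time measured from an arbitrary start frame, formatted mm:ss.ss
        float fps   = chf("fps");          // your scene FPS, e.g. 24
        float start = chf("start_frame");  // frame where the counter should read zero
        float elapsed = max(0.0, (@Frame - start) / fps);
        int   mins = int(elapsed / 60.0);
        float secs = elapsed - mins * 60.0;
        s@elapsed_text = sprintf("%02d:%05.2f", mins, secs);

     The viewport comment or a Font SOP can then read that detail string instead of the time of day.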
  19. Search for houdini wingtip vortices https://www.sidefx.com/forum/topic/93890/?page=1#post-410577
  20. I'm trying to create a fighter jet downwash similar to that in Top Gun: Maverick, but I'm not satisfied with the swirls compared to my reference. I am attaching the MOV of my output, reference, and HIP files. https://www.youtube.com/watch?v=MkIx7mG8oos https://youtu.be/ql2oHbGsG1Y?si=U-t-BObuF8B9-44p&t=156 downwash.mov downwash_turbulence_forHelp.hip
  21. This seems strange: if it is a dense field, density and temperature would use the same amount of memory, but velocity is built from three floats and should be about three times bigger. So I suppose your workflow for saving the files is somehow incorrect. E.g., in my test, density is about 57 MB, temp is 74 MB and vel is 584 MB.
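     For reference, the rough per-voxel arithmetic behind that "three times bigger" remark, assuming a dense grid of N active voxels stored as 32-bit floats (compression and sparsity change the absolute numbers, not the ratio):

        density      (1 float per voxel):  4 * N bytes
        temperature  (1 float per voxel):  4 * N bytes
        velocity     (3 floats per voxel): 12 * N bytes  -> roughly 3x a scalar field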
  22. I'm trying to render Karma XPU in Solaris on a scene with 2500 USD instances, and I repeatedly get this error message: "KarmaXPU: device Type:Optix ID:0 Name:"NVIDIA GeForce RTX 5090" has registered a critical error "Unable to create CUDA context "CUDA_ERROR_ILLEGAL_ADDRESS" (maybe old driver? requires 570+)", so will now stop functioning. Future error messages will be suppressed". Sometimes it renders OK in the viewport, but it often fails and only renders on the CPU, and it won't reset unless I reopen the file. If I try to render through the USD Render ROP, the GPU sits idle and the CPU works at 100%. I've tried updating drivers and going back to old drivers (this is the only solution I could see people posting online), but it doesn't seem to help. Removing textures doesn't seem to help, and reducing the instance count doesn't seem to help. I thought it might be a VRAM issue, but VRAM is only at around 30% while rendering. Any suggestions would be greatly appreciated; I'm at a bit of a loss as to what to try next.
  23. I don't know how you save velocity, but usually you need a post-process adjustment for velocity, and that saves you space.
  24. Oooo yes that is interesting. Thank you. I will give this a go
  25. Here is why: these are a few tests for a tiny sim (DTV: density, temperature and velocity). For the same simulation, same frame, if you save: density alone -> 55 MB; temperature alone -> 55 MB; velocity alone -> 55 MB. Theoretically I should be able to get away with 165 MB total storage. If you mix floats you're still good: density and temperature: 55 MB + 55 MB = 110 MB (exactly as simulated and saved, an exact sum). Up till now, I should be able to use only 110 MB (density & temperature together) + 55 MB (separate velocity) = 165 MB total storage. Once you start mixing apples and oranges, funny things happen: density (float) + temperature (float) + velocity (vector) = 425 MB total storage (I repeated the test 3 times because I couldn't believe my eyes). Can you explain the difference between 165 MB and 425 MB (425/165 = 2.5)? Try repeating the test and see for yourself. Can you explain to me why I should buy 2.5 times the storage instead of 1, just because floats and vectors are being mixed in one output? In the case of a large simulation, instead of paying for only 1 drive, you need to pay for 2.5 drives. Scale that cost up to more servers / a farm, and your numbers start having a louder voice: instead of 20 drives you need 50 drives, for the same bloody sim; it gets funny real quick. No one has paid attention so far (I wonder why) to the fact that the dopimportfields is mixing apples and oranges at a very high dollar cost for no reason, plus a ridiculously 3x longer geometry saving time. It needs to be fixed by having 1 simulation where the dopimportfields offers 2 outputs from the very same simulation (no re-sim): one for all the floats (density, temperature, ...) and one for all the vectors (velocity, ...). And just for argument's sake, I did re-sim, and it was way faster to re-sim twice (floats separated from vectors) than to sim once (mixing apples and oranges), because saving geometry takes forever when you mix apples and oranges. But this is not what I want; re-simming is not the way to go, I need to save separate floats and vectors from the same simulation. Is it possible to do it with PDG (2 simultaneous dopimportfields for the same frame from the same sim)? Or to change the dopimportfields node?
  26. This is not a good way, because in this case you will have 3 simulations. You can cache everything together and then take the field you need and work with it. If you still need separate files (I don't know what special case you have), you can save them from the cached VDBs to save a lot of time.
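     A minimal sketch of that "cache once, split afterwards" idea, assuming the cached volumes carry the field name in the name primitive attribute (as a DOP Import Fields / File Cache setup normally produces): branch the cached sequence and drop a Primitive Wrangle per branch, keeping one field in each before the per-field file caches.

        // Primitive Wrangle: keep only the density field in this branch.
        // Duplicate the branch with "vel" / "temperature" for the other caches.
        if (s@name != "density")
            removeprim(0, @primnum, 1);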
  27. Hi rev, you could paint on the large terrain and transfer its mask to the smaller tiles using a volume wrangle. write_mult_volumes.hip
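     A minimal sketch of that volume wrangle, assuming the tile heightfield is the first input, the large painted terrain is wired into the second input, and the painted layer is literally named "mask":

        // Volume Wrangle on the tile, large painted terrain in input 2.
        // Samples the painted mask layer at this voxel's world position.
        @mask = volumesample(1, "mask", @P);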
  28. I have a smoke sim in a DOP network and a DOP Import Fields node initialised for smoke, bringing in density, vel and temperature. Is there a way to run the simulation only once and save the density, vel and temperature separately, each in its own File Cache? Is it possible to do it with several DOP Import Fields nodes? Or to somehow have as many outputs on the DOP Import Fields node as there are fields, e.g. density has its own output, and the same for the others, each with its own output? And then a File Cache per output, each saving one of the fields separately.