Showing results for tags 'workflow'.




Found 17 results

  1. Hello magicians, I'm working on a large-scale FLIP scene and need some tips regarding workflow. The main scene has two FLIP sims: one for a waterfall and one for a river. In my current setup I made the river using narrow band and the waterfall using an emitter; I'm not sure if this is the best workflow in terms of speed and direction. The current setup looks like this (blue = waterfall / red = river), and here is a viewport sample. Questions:

     1) Is this the best approach for mixing a waterfall with a river, i.e. emitter + narrow band?
     2) I saw in another post that some people break the geometry into equal modules and then put them together. Should I break the river into three equal parts to save time and gain quality?
     3) I read that when you export particles, it is useful to delete attributes that won't be used. I did a quick test: with all attributes one frame was 800 MB; after deleting some, it went down to 200 MB. Is this the right approach?
     4) For final meshing, should I create a VDB / polygon soup and export passes in order to add detail such as foam?
     5) Should I export particles in wedges?
     6) Is there a DOP workflow to up-res particles/FLIP meshes, like the gas upres for Pyro?

     I would love to hear any tips regarding large-scale FLIP; I will keep reading the forum in the meantime. Thanks!
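    A plain-Python sketch of the idea behind question 3 (stripping unneeded attributes before caching): keep only the attributes the meshing/whitewater step actually needs and drop the rest. The attribute names and per-particle byte sizes below are hypothetical examples, not values from the scene in question.

    ```python
    # Illustration of attribute stripping before writing a particle cache:
    # whitelist the attributes downstream meshing needs, drop everything else.
    # Names and byte sizes here are made-up examples.

    def strip_attributes(attribs, keep=("P", "v", "pscale", "id")):
        """Return only the attributes listed in `keep`."""
        return {name: data for name, data in attribs.items() if name in keep}

    # Hypothetical per-particle attribute payloads (bytes per particle).
    frame = {
        "P": 12, "v": 12, "pscale": 4, "id": 4,         # needed for meshing/foam
        "vorticity": 12, "density": 4, "viscosity": 4,  # often safe to drop
    }

    slim = strip_attributes(frame)
    print(sorted(slim))                                    # kept attributes
    print(sum(frame.values()), "->", sum(slim.values()), "bytes per particle")
    ```

    In Houdini itself, the same whitelist is typically expressed with an Attribute Delete SOP (keep pattern) just before the cache node.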
  2. Hey all, I started using Redshift a few months ago, and so far I like it a lot! I currently have one 1080 Ti 11 GB GPU in the machine I am using it on. I'd like to get either another one of those cards, since GPU prices seem to have dropped back to something more reasonable lately, or two 1070 Ti 8 GB cards. I have heard that the performance difference is only about a 20% gain for the 1080 over the 1070s, so I might be better off with two 1070s. The main question, though: what happens when you hit the render button on your Redshift node for a sequence inside Houdini if you have more than one GPU? If I had three GPUs, would it automatically render frame 1 on the first GPU, frame 2 on the second, frame 3 on the third, and so on? Would it render the first frame across all three cards simultaneously by splitting the frame across the different cards; is that even possible? Do the extra GPUs only get used if you use distributed-rendering software like Deadline, or by launching RS renders from the command line? It seems like you have to launch from the command line to get the other GPUs to render, but I have never worked with a machine with more than one GPU installed. If I were to submit a 100-frame render from within Houdini by clicking the Render to Disk button on my RS node, would it only use the main GPU even with other GPUs installed? Any info from artists using multi-GPU systems would be great. I didn't find much about this on the Redshift site, but I may not have looked deep enough. The end goal is to be able to kick off, say, two sequences to render on two of the GPUs while leaving my main GPU free to keep working in Houdini, and, at the end of the work day, let the main GPU render another sequence so all three cards are running at the same time. I will most likely need a bigger PSU for the machine, but that is fine.
     I am running Windows 10 Pro on an i7-6850K hex-core 3.6 GHz CPU with 64 GB RAM in an ASUS X99-Deluxe II motherboard, if that helps evaluate the potential of the setup. One last question: SLI bridges? Do you need them when setting up two additional GPUs that will only be used for RS rendering? I do not wish to combine the extra GPUs to increase power in, say, a video game. I don't know much about SLI and when/why it's needed in multi-GPU setups, but I was under the impression it combines the power of the cards for gaming performance. Thanks for any info, E
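    One common way to get per-GPU sequences without a render manager is to launch one command-line render per device and pin each process to a single GPU. The sketch below only builds the commands; the environment-variable name and the hython invocation are assumptions to check against the Redshift and Houdini documentation, not a confirmed recipe.

    ```python
    import subprocess  # would be used to actually launch the processes

    def gpu_render_commands(hip_file, rop_path, gpu_ids):
        """Build one hython render command per GPU. Each process would be
        pinned to a single device via an environment hint; the variable name
        REDSHIFT_GPUDEVICES is hypothetical -- consult the Redshift docs for
        the real device-selection mechanism."""
        cmds = []
        for gpu in gpu_ids:
            env = {"REDSHIFT_GPUDEVICES": str(gpu)}  # hypothetical variable
            cmd = ["hython", "-c",
                   f"import hou; hou.hipFile.load('{hip_file}'); "
                   f"hou.node('{rop_path}').render()"]
            cmds.append((env, cmd))
        return cmds

    # Dry run: print what would be launched on GPUs 1 and 2,
    # leaving GPU 0 free for interactive work.
    for env, cmd in gpu_render_commands("shot.hip", "/out/Redshift_ROP1", [1, 2]):
        print(env, " ".join(cmd))
        # subprocess.Popen(cmd, env={**os.environ, **env})  # uncomment to launch
    ```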
  3. Hi there! I would like to render my water sim and whitewater in C4D to integrate them with an animated scene. I have two questions. 1. What is a good workflow for getting the simulation over to C4D for rendering? Are Alembics still the best way to do this, or is Houdini Engine faster, or are there other ways? My 7 million points seem to be very, very heavy for C4D as Alembic... 2. How do I render the whitewater correctly in C4D (Octane render)? Does it need to be geometry or particles for rendering, and how do I get that nice lifespan on it too? Thank you for taking the time to help me out! Cheers!
  4. Hello, I am currently working on my final student project, where I make a plane crash with Houdini. I based the workflow of this project on the RIGIDS III tutorial by Steven Knipping. Everything works fine until I bring in a simple glue constraint: when the simulation gets activated at a given frame, the simulation simply does not happen. It works when I disconnect/disable the constraint. I already tried unpacking the Alembics on import with the Alembic node, but that didn't change anything. What is weird is that everything worked fine when I used the model provided with the course. I haven't really found anything online that helped, so hopefully one of you can help me with this issue; I would appreciate it A LOT. I will attach the files below. Thank you! plane_sim.rar
  5. Hey, I am looking for advice on a good workflow for rendering and compositing a pyro sim with both fire and smoke. I know that fire has no density, only smoke does. So I turned on two image planes, one for smoke_mask and one for fire_mask. What exactly do I do once I am inside Nuke, and what are the intended purposes of the fire and smoke masks? How are they to be used together? Thanks, Evan. Edit: do I need to render fire and smoke separately, or is one pass with both the fire and smoke masks enough?
  6. Hi, I consider myself a new user of Houdini, other than using it sporadically a few years ago. I've been a Maya power user for the last 12 years, so I've been set in my ways, I guess. Anyway, I'm embarking on learning Houdini afresh for procedural modelling, and I'd like a few pointers and suggestions on what I should focus on. Sorry for the long post. Initially I'd like to get to grips with managing loop iterations and creating multiple variations of assets which can be scattered or accessed using IDs. I'd normally write a Python Maya UI and offer the user a 'step by step, now click this' type of approach, which is... because Maya, I guess. A simple exercise would be a field of cabbages:

     • Import an Alembic containing several leaves
     • Duplicate n randomly selected leaves around a pivot
     • Repeat to make layers
     • Warp them to make them look a bit organic
     • Do this n times and store the results in a cabbage array
     • Take a random cabbage and place it somewhere on a plane
     • Assign a random range of rotation/scale, etc.

     From the above example I can see that I'd need some basic functions, such as: gathering the IDs of n leaf assets from an imported file, or splitting imported geometry into different assets (primitives); selecting n of them; setting their pivot to their minimum Y; radially rotating them around a pivot; declaring the result as a new cabbage asset ID; making n of these; scattering them across a plane; and avoiding intersections. Without Python it's pretty crazy to do this in Maya; with Python it's easy. So what sort of nodes and workflow should I adopt in Houdini? Can I avoid scripting? I'm guessing nodes like Copy to Points, Group, and Transform are ones to use, but what about for-loops and ifs? Can I represent those sorts of queries in a node-based manner?
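    The cabbage exercise maps fairly directly onto a For-Each block feeding a Copy to Points SOP, with the variant index stored as a point attribute. As a plain-Python sketch of the same logic (all names, counts and ranges are hypothetical, just to show the loop structure the nodes would replace):

    ```python
    import random

    def make_cabbage(leaf_ids, leaves_per_layer=8, layers=3, rng=None):
        """Build one cabbage 'asset' as a list of (leaf_id, layer, angle_deg):
        pick random leaves, rotate them radially around a shared pivot,
        repeat per layer -- the For-Each equivalent."""
        rng = rng or random.Random()
        cabbage = []
        for layer in range(layers):
            for i in range(leaves_per_layer):
                leaf = rng.choice(leaf_ids)
                angle = 360.0 * i / leaves_per_layer + rng.uniform(-10, 10)
                cabbage.append((leaf, layer, angle))
        return cabbage

    def scatter(cabbages, n, extent=10.0, rng=None):
        """Place n randomly chosen cabbage variants on a plane with a random
        position, rotation and scale -- the Copy to Points equivalent."""
        rng = rng or random.Random()
        return [(rng.randrange(len(cabbages)),                       # variant id
                 (rng.uniform(-extent, extent), rng.uniform(-extent, extent)),
                 rng.uniform(0, 360), rng.uniform(0.8, 1.2))         # rot / scale
                for _ in range(n)]

    # Five seeded variants, scattered 100 times.
    variants = [make_cabbage(leaf_ids=[0, 1, 2, 3], rng=random.Random(s))
                for s in range(5)]
    field = scatter(variants, 100, rng=random.Random(42))
    print(len(field), "cabbages placed from", len(variants), "variants")
    ```

    In node terms, the outer list comprehension is a For-Each with the iteration number driving the seed, and `scatter` is Scatter + Copy to Points reading the variant id from a point attribute, so no scripting is strictly required.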
  7. Fairly new to Houdini... Is there a way to keep working on your scene while generating a flipbook? Having to wait every time really slows the workflow. Coming from Cinema 4D, which has an external preview viewer that runs in the background and lets you keep working: is there any such thing in Houdini?
  8. No time for SIGGRAPH this year? Here's another GridMarkets presentation for you to watch from home: www.gridmarkets.com/siggraph2016-jeff-ranasinghe VFX Supervisor Jeff Ranasinghe talks about how Deeps, Dependencies and Automated Movie Generation can help speed your workflow. We also had some time to do a brief interview with Jeff! We hope you like the material.
  9. FLIP Cache - Best Workflow

    Hey guys, I assume that the best workflow would be: 1st, write the particle simulation to disk; 2nd, read the simulation files back to generate the mesh (and cache the geo files). Now, what is the best way to save the particle simulation? Should I do it inside the "particle_fluid" node using a "ROP Output Driver" or a "File" node? Should it be connected to the "import_particle" node? Something like this: and how should I read those particle files back? Please let me know if there's a better workflow. Thank you, Alvaro
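    Whichever node writes the cache, the two-stage workflow above mostly comes down to consistent, frame-padded paths that both the particle writer and the mesh reader agree on. A small sketch of that convention (the directory layout and padding are just common practice, not requirements):

    ```python
    import os

    def cache_path(base, pass_name, frame, ext="bgeo.sc", padding=4):
        """Build a per-frame cache path like
        base/particles/particles.0001.bgeo.sc, matching Houdini's
        $F4-style zero padding."""
        return os.path.join(base, pass_name,
                            f"{pass_name}.{frame:0{padding}d}.{ext}")

    # Stage 1: the ROP Output Driver / File node writes particles to these paths.
    # Stage 2: a File SOP reads the same paths back and feeds the surfacing step.
    for f in (1, 2, 240):
        print(cache_path("cache", "particles", f))
    ```

    Using the same expression (e.g. `$HIP/cache/particles/particles.$F4.bgeo.sc`) on both the writer and the reader is what keeps the two stages decoupled.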
  10. Hi everyone, The new GridMarkets Houdini artist feature is out! Johnny Farmfield, Houdini artist from Sweden, shows us how working with attributes is key to understanding the Houdini workflow. Don't miss his entertaining and extensive tutorial. Have a great week, Patricia and the GridMarkets team
  11. Let's say I have an asset with a number of dependencies on the file system, from read files, to caches, to generated point clouds, and a non-trivial (i.e. non-linear) approach in which the asset, based on certain conditions or user selection, does one thing or another. If I want to publish this asset and all its dependencies to a repository, I will have to traverse this potentially huge network, with subnetworks inside (think CHOPs inside SOPs, for example), rewire all these paths, move the files to the repository, and then export the asset. My feeling is that this particular task is more or less what the Render Pre-Flight does, but I wonder whether it is accessible and, more importantly, whether this is the best way to approach a more compartmentalised pipeline. How are you approaching this? Any suggestions? Thanks in advance; I hope this thread becomes a source of inspiration for everybody.
  12. NEW TUTORIAL: Alembic Vector Motion Blur Tutorial: a Houdini, Maya, Arnold and Fusion or Nuke workflow. Thanks for watching and sharing! If you have any suggestions, please send them in! Thanks
  13. Hello everyone, I have imported an animated file (a tree in this case) into Houdini using Alembic, and I was wondering how to isolate primitives so I can shade each part separately (trunk, leaves and so on). Each part is already stored in a different group in the Alembic file. I am also a Max/V-Ray user, where you have the option of turning visibility on and off for each primitive group of the Alembic, so I am guessing there must be a way to do exactly that in Houdini? I have looked on the forum and cannot seem to find proper information about this. If someone can help me with that or share any information, that would be great!
  14. You might have already seen this when it was published back in January, but here's a Gamasutra article about procedural workflows for games written by Side Effects Software President and CEO Kim Davidson. http://www.gamasutra.com/view/news/233899/Sponsored_Go_Procedural__A_Better_Way_to_Make_Better_Games.php
  15. Hello, I am working on a school project and we decided to try a deep workflow, but I am encountering many problems, as I have nobody to explain the correct process to me. Today I managed to solve a very weird problem. When I imported a deep sequence rendered with HtoA into Nuke, I wanted to composite a textured card into it. Because the sequence has a moving camera, the imported deep data seemed to float in space (I understand why: Nuke aligned the deep data to the origin, so it floats because the distance changes). The next step should be setting the camera input on the DeepToPoints node, which should transform the point cloud to its static place (the geometry is static). But it did not. When I set the camera on the DeepToPoints node, the point cloud changed a little but remained floating in space. I cannot understand why, because in theory it should stick to its place. So after hopeless attempts at various modifications (changing camera units, importing the camera from Maya...), we came up with a solution. It works now, but I don't understand why. The solution is really strange: it consists of using a DeepExpression node and dividing the deep data (not the colour) by the number 68. It sounds silly, but it solved the problem: the points are transformed to their correct positions. Any ideas why such a strange thing happened, and why 68?
  16. Ocean Float Workflow

    I am trying to figure out the correct workflow for having an object float on an ocean while also interacting with that ocean, creating splashes etc. The ocean I am creating is quite rough, so there is a lot of low-frequency displacement/movement generated by the ocean spectrum (big rolling waves). Things I have tried so far, with flipsolver > Volume Motion > Solver > Feedback Scale set to 1.0, object density set to 400, and object bounce set to 0:

     • Splash Tank: the object floats correctly, with good motion, but there isn't much real interaction with the ocean and it doesn't really create splashes. Particles get generated within the object due to the nature of the Splash Tank.
     • Wave Layer Tank: unstable behaviour. The object seems to generate a large splash, which then causes it to bounce around erratically and eventually sink (sounds cool, but not what I'm after).
     • Wave Tank: doesn't keep up with the ocean deformation very well at all.
     • Beach Tank / Flat Tank: unsuitable.

     From my tests, the Splash Tank result is best, but it doesn't really allow proper interaction between the object and the ocean. The Wave Layer Tank might work if I could get past that instability. Am I going about this the wrong way? Any suggestions welcome.
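    As a sanity check on those settings: with feedback scale 1.0 and an object density of 400 against FLIP water at roughly 1000 kg/m³, Archimedes' principle predicts the equilibrium depth the Splash Tank result shows. A quick pure-physics check (no Houdini involved; the 1000 kg/m³ fluid density is an assumption):

    ```python
    def submerged_fraction(object_density, fluid_density=1000.0):
        """Fraction of the object's volume below the surface at equilibrium
        (Archimedes: displaced fluid mass equals object mass). Returns 1.0
        when the object is dense enough to sink outright."""
        return min(object_density / fluid_density, 1.0)

    print(submerged_fraction(400))   # floats with 40% below the waterline
    print(submerged_fraction(1200))  # denser than the fluid: sinks
    ```

    If the sim floats the object noticeably deeper or shallower than this predicts, that usually points at the feedback scale or at particles being generated inside the object, rather than at the density setting itself.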
  17. Hey guys, I'd like to hear about your workflows regarding Alembic assets, material assignment and organization. My main concern is applying shaders inside the Alembic root node for pieces: when I rebuild/update the asset, I lose all my shader connections. How do you handle that? What's the suitable approach on a day-to-day production basis, when assets are still changing or being refined? Thanks