
lukeiamyourfather

Members
  • Content count: 801
  • Donations: 0.00 CAD
  • Joined
  • Last visited
  • Days Won: 8

lukeiamyourfather last won the day on June 13 2017

lukeiamyourfather had the most liked content!

Community Reputation

131 Excellent

4 Followers

About lukeiamyourfather

  • Rank
    Houdini Master

Personal Information

  • Name
    Luke
  • Location
    Dallas
  1. Creating HDA node in python "very" slow

    You might be able to save the Houdini file as text and insert your stuff where it needs to go that way.
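    A minimal sketch of that approach in plain Python, assuming the scene has already been saved in a text form; the file path, marker string, and inserted block are all hypothetical placeholders:

      # Hypothetical example: splice a block of text into a scene file saved as text.
      # The path, marker, and inserted block below are placeholders.
      marker = "# --- insert generated nodes here ---"
      block = "# generated content goes here"

      with open("/path/to/scene_as_text.hip", "r") as f:
          contents = f.read()

      # Drop the block in right after the marker line.
      contents = contents.replace(marker, marker + "\n" + block, 1)

      with open("/path/to/scene_as_text.hip", "w") as f:
          f.write(contents)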
  2. Creating HDA node in python "very" slow

    I would use the Performance Monitor window to pin down exactly what is taking so long. There might be low-hanging fruit in the HDA itself or in how the HDA gets deployed. I've seen something as simple as an if statement in an expression cause an order of magnitude slowdown because it was unknowingly forcing other networks to cook.
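    As a crude complement to the Performance Monitor, you can also time the suspect step directly from Houdini's Python shell; the HDA type name below is a placeholder for whatever is slow to instantiate:

      import time
      import hou  # available inside Houdini's Python shell

      start = time.perf_counter()
      # "my_hda" is a hypothetical HDA type name; substitute the slow one.
      node = hou.node("/obj").createNode("my_hda")
      print("createNode took %.3f seconds" % (time.perf_counter() - start))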
  3. pyro shader issue

    The Pyro shader has changed. Use the new one. If you can't use the new one, open the scene in the version of Houdini it was created in. Major versions of Houdini will break backward compatibility, which is something to keep in mind (and a reason to hang onto old installer files).
  4. When to care about topology?

    It depends on how portable the model needs to be and what else is required, like sane UV maps for artists to work with. The Boolean of today is vastly different from the Cookie node of the past (significantly more reliable). An asset that can be subdivided is always going to be more valuable and portable, but it's not worth the effort in some cases.
  5. Douglas-Peucker polygon simplification

    I don't know what algorithm they're using, but the results look similar to Douglas-Peucker (a reference version is sketched below). http://www.sidefx.com/docs/houdini/nodes/sop/polyreduce
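    For reference, a minimal pure-Python sketch of Douglas-Peucker on a 2D polyline (this is just the algorithm named in the thread title, not how PolyReduce is implemented):

      import math

      def point_line_distance(p, a, b):
          # Perpendicular distance from point p to the line through a and b (2D).
          if a == b:
              return math.hypot(p[0] - a[0], p[1] - a[1])
          num = abs((b[0] - a[0]) * (a[1] - p[1]) - (a[0] - p[0]) * (b[1] - a[1]))
          return num / math.hypot(b[0] - a[0], b[1] - a[1])

      def douglas_peucker(points, tolerance):
          # Keep the point farthest from the chord if it exceeds the tolerance,
          # then recurse on both halves; otherwise collapse to the endpoints.
          if len(points) < 3:
              return list(points)
          max_dist, index = 0.0, 0
          for i in range(1, len(points) - 1):
              d = point_line_distance(points[i], points[0], points[-1])
              if d > max_dist:
                  max_dist, index = d, i
          if max_dist > tolerance:
              left = douglas_peucker(points[:index + 1], tolerance)
              right = douglas_peucker(points[index:], tolerance)
              return left[:-1] + right
          return [points[0], points[-1]]

      # Nearly collinear middle points get culled at this tolerance.
      print(douglas_peucker([(0, 0), (1, 0.05), (2, -0.02), (3, 0)], 0.1))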
  6. Please allow me...

    The people who come here are professionals and they just want to get stuff done, not argue pointlessly. I don't know what you're referring to, but I'm guessing a thread got closed. If you're after "discussion" with less moderation, there are lots of other sites that provide that.
  7. Hardware problems

    Random resets are usually related to heat, power, damaged hardware, or driver issues. Make sure the computer is clean and free of dust buildup and kept in a well-ventilated place (not in a closed cabinet or a closet).
  8. Cache Optimization

    In the big picture, 100MB per frame for a simulation cache is not that much. Drives are cheap and 10GBASE-T networking hardware is now affordable. If storing a 50GB cache is an issue (see the rough math below), I'd look at upgrading hardware.
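    Rough math behind that figure, with the frame count as an example value:

      # Back-of-the-envelope cache size estimate.
      mb_per_frame = 100            # per-frame cache size from the thread
      frames = 500                  # example shot length in frames
      total_gb = mb_per_frame * frames / 1024.0
      print("~%.1f GB for the whole cache" % total_gb)  # roughly 48.8 GB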
  9. Bullet: small objects not stopping to move

    I think they're good. This thread is from 2014...
  10. NTFS vs ext4 file system for houdini swap

    FAT isn't suitable for a caching partition because it has a 4GB file size limit. I also wouldn't use exFAT or NTFS on Linux because they're implemented via FUSE, a user space file system framework (significantly slower and less reliable than a kernel-level implementation). There's a commercial product for reading ext3 in Windows, but it's not something I'd rely on given the choice. There really isn't a good option that will serve both Linux and Windows. I would set up a script that formats the drive for you when you need it (one for Windows and one for Linux, see the sketch below). Either that or just use the drive in one operating system rather than both.

    If you need the same data in multiple operating systems, consider network storage rather than local storage. Network file sharing protocols like SMB hide the underlying file system, so it doesn't matter what operating system the client is running. A file server with 10Gb Ethernet and many drives (or a few fast drives) can be just as fast as local storage, if not faster.
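    A minimal sketch of such a script for the Linux side; the device node is an assumption, and formatting is destructive, so double-check it:

      import platform
      import subprocess

      # Hypothetical device node for the cache partition; verify with lsblk first.
      CACHE_DEVICE = "/dev/sdb1"

      if platform.system() == "Linux":
          # Format the cache partition as ext4 with a recognizable label.
          subprocess.run(["mkfs.ext4", "-L", "houdini_cache", CACHE_DEVICE], check=True)
      else:
          raise SystemExit("On Windows, use the built-in format command (e.g. /FS:NTFS) instead.")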
  11. 2 x 512GB M.2 SSD's - set up as RAID 0 or...

    If the M.2 SSDs are SATA, it might be worthwhile to deal with RAID to get the extra performance. Most M.2 drives on sale now are SATA, not NVMe. If they're NVMe, it's not worthwhile because they're already faster than an application can fully utilize anyway. If you're running Linux, you could use ZFS on Linux to create a pool of them and bypass the motherboard RAID (which would probably be slower than ZFS on Linux anyway); a sketch of that follows.
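    A sketch of the ZFS route, assuming ZFS on Linux is installed; the NVMe device names are placeholders, and with no mirror or raidz keyword zpool stripes the drives like RAID 0:

      import subprocess

      # Hypothetical NVMe device names; adjust to what lsblk reports.
      drives = ["/dev/nvme0n1", "/dev/nvme1n1"]

      # Create a striped pool named "scratch" (no redundancy keyword = RAID 0 style).
      subprocess.run(["zpool", "create", "scratch"] + drives, check=True)

      # The pool mounts at /scratch by default; confirm the layout.
      subprocess.run(["zpool", "status", "scratch"], check=True)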
  12. How much RAM is enough

    This requires overclocking, so all bets are off. It might work for one person but not another. With four single-rank DIMMs, the maximum speed the memory controller on that CPU supports without overclocking is 2133 MHz, which is what you're seeing now. This thread might be helpful. https://linustechtips.com/main/topic/788240-buying-ryzen-confused-about-ram-read-here/
  13. mantra: Error writing data to image device

    Please don't revive four year old threads.
  14. How much RAM is enough

    What processor do you have? The memory controller is on the processor, and it affects the maximum speed of the memory. Most higher end memory is marketing bullshit and doesn't affect performance for most users; overclocking involving more than just the processor multiplier is the exception. In the "real world" the differences between one memory speed and another will be minimal. It's also a moot point to compare anything to 3200 MHz memory because almost nothing out there supports that without overclocking.