pezetko

Members

  • Posts: 263
  • Joined
  • Last visited
  • Days Won: 12

pezetko last won the day on August 7

pezetko had the most liked content!

2 Followers

Personal Information

  • Name: Petr Zloty

Reputation: 139

  1. The best way to get these fixed is to record (video) the exact steps to reproduce the issue and submit a bug report for each one (RFEs are for new features). IMHO "instability" is a bit exaggerated, as this one doesn't crash Houdini.
  2. Hi, I haven't seen this HDA, but that error sounds like it was written for Python 2.7. In Python 3 reload is no longer a built-in, so for Python 3.7 you have to use: from importlib import reload. The reload function is usually used during development to quickly modify module code without restarting the application, so its usage can probably just be omitted (delete or comment out code like this): reload(something). See the compatibility sketch after this list.
  3. Hi, you forgot to call the method. print(my_geo.children) prints the method object itself instead of calling it. You need: print(my_geo.children())
  4. Hi, it should be the same. I just tested both 456.cmd and 456.py on 18.5.514 (Apprentice) on Windows 10 and it is working fine (the 456.cmd script runs on scene open/new file). I would prefer 456.cmd over 456.py, because the Python version modifies the session module and could end up inserting this script multiple times if you don't add a guard against that (see the guarded sketch after this list). Btw: the 123.py script now works only in Houdini FX; Core uses houdinicore.py instead. https://www.sidefx.com/docs/houdini/hom/locations.html#startup Alternatively, you can use pythonrc.py, which runs only on startup (same as the 123 scripts). But for always-on autosave, I would go with the 456.cmd variant.
  5. Hi, in Mantra you can sidestep the problem with stylesheets and CVEX for primitives, e.g. Data Binding to intrinsic:indexorder. In this example I'm targeting the packed primitives themselves; if I added the subtarget Primitive, I could target individual polygons on the rubber toy. You can find some examples in the crowds section of the documentation. packed_target.hipnc
  6. https://www.sidefx.com/forum/topic/77115/?page=1#post-329299
  7. On Windows, you can use https://www.dependencywalker.com/
  8. Cascadeur is not bad, but the general UX was not good in the previous version. Rigging is also lacking a lot, and some animators hate the AI that adjusts the poses for them. I hope they improved the UX in the rewrite. The license policy for the open beta is also very nice. For another potential Maya rival, there is also https://rumba-animation.com/ It all depends on support, speed, stability, and pricing policy. Maya has a huge advantage in its API and prevalence.
  9. Just realized that there is a difference between inFile.x (x position with scale and offset applied) and inFile.X (raw X position without any scale or offset), so the Lidar Import SOP produces the scaled and offset positions from the .las file (see the sketch after this list). For .laz files there is the laszip executable from https://rapidlasso.com/laszip/ which you have to have somewhere on your PATH. There is also the C++-based laz-perf: https://pypi.org/project/lazperf/
  10. Requests for Enhancement: https://www.sidefx.com/bugs/submit/ I was able to download the file from Google Drive, but my old K4000 can't display it in Houdini.
  11. Just a few more optimizations using native Python types, and testing on a bigger .las file. The Python laspy SOP is 1.2 - 1.6x slower than the Lidar Import SOP (1.6x if I apply scale and offset and add the classification attribute, which the Lidar Import SOP does not do), not bad. This is the code:

      from laspy.file import File
      import numpy as np

      node = hou.pwd()
      geo = node.geometry()
      file_path = geo.attribValue("file_path")

      def load_color(inFile):
          missing_color = ["red", "green", "blue"]
          for spec in inFile.point_format:
              if spec.name in missing_color:
                  missing_color.remove(spec.name)
          if missing_color:
              return None
          color = np.vstack((inFile.red, inFile.green, inFile.blue)).transpose()
          return (color / 255.0).reshape(-1)  # transform color values to a 0.0-1.0 range

      with File(file_path, mode='r') as inFile:
          # --- load point positions (Y and Z are swapped for Houdini's Y-up axis)
          coords = np.vstack((inFile.X, inFile.Z, inFile.Y)).transpose()  # 632 ms
          # reorder scale and offset to match the X, Z, Y column order above
          scale = np.array(inFile.header.scale)[[0, 2, 1]]
          offset = np.array(inFile.header.offset)[[0, 2, 1]]  # there is no offset in the simple.las example from the laspy library
          # --- compute scaled and offset positions and flatten into a 1d array
          pos = (coords * scale + offset).reshape(-1)  # 300 ms
          geo.setPointFloatAttribValues("P", pos.tolist())  # 2203 ms
          # --- add classification attribute
          geo.addAttrib(hou.attribType.Point, "classification", 0, False, False)
          geo.setPointFloatAttribValues("classification", inFile.Classification.tolist())  # 450 ms
          # --- load color
          colors = load_color(inFile)
          if colors is not None:
              geo.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0), False, False)  # add color attribute
              geo.setPointFloatAttribValues("Cd", colors.tolist())
          # --- load intensity
          geo.addAttrib(hou.attribType.Point, "intensity", 0.0, False, False)  # add intensity attribute
          geo.setPointFloatAttribValues("intensity", (inFile.intensity / 512.0).tolist())  # scale intensity roughly into a 0.0-1.0 range
  12. Nice, I think a fast SSD is much more important nowadays. I didn't try *.laz with laspy, it may be worth a try. I know I submitted a few RFEs for the Lidar Import SOP, one for *.laz support as well as one for supporting a newer version of the *.las format. If you find some of those features important, submit yours; more "votes" (RFEs) don't hurt. Btw: if you replace np.concatenate(array) with np.ravel(array), the latter is about 10x faster, and array.reshape(-1) is even faster than that (but only by a few ms). See the comparison sketch after this list.
  13. Julien means the same thing I already mentioned: loading and building attributes in Python is much slower than doing the same thing in C++ most of the time. The Lidar Import SOP is a fast C++ node compared to pylas. Loading your *.las example takes 20 ms with the Lidar Import SOP on my machine, but the same file takes almost 6 seconds with a Python SOP using pylas (3 seconds just for transforming the numpy ndarray into serialized form and another 2.5 seconds for the setPointFloatAttribValues method). If you have enough memory and end up reading most of the file anyway, it's faster to use the Lidar Import SOP to load all points quickly and then use a Python SOP to add only the additional data that the Lidar Import SOP cannot read (like classification) and blast what you don't need, like the classification sketch after this list.
  14. Hi, no, this is just the default comment from the Python SOP. In the latest version, https://forums.odforce.net/applications/core/interface/file/attachment.php?id=56259 classification is just an integer attribute on the points.
  15. If you have any memory issues, just check the latest updated version, which uses a try/except block: https://forums.odforce.net/applications/core/interface/file/attachment.php?id=56259 I wasn't sure whether laspy implements the context management protocol (so the "with File() as file_pointer:" statement might not close the file at exit), but I just verified it and the with File() as fp: statement works as it should (see the file-handling sketch after this list).
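
A compatibility sketch for #2 above: a minimal pattern that keeps reload() working in both Python 2.7 and Python 3 builds of Houdini. The module name some_module is just a placeholder for whatever the HDA actually reloads.

    # In Python 3, reload() moved from the built-ins into importlib;
    # this import keeps the same call working under both interpreters.
    try:
        from importlib import reload  # Python 3.4+
    except ImportError:
        pass  # Python 2.7: reload() is still a built-in

    import some_module  # placeholder for the module the HDA reloads
    reload(some_module)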
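
A guarded 456.py sketch for #4 above, assuming the script's only job is to turn autosave on; the guard attribute name is made up for this example, and hou.hscript("autosave on") mirrors what a 456.cmd one-liner would do.

    # 456.py runs on every scene open / new scene, so guard against
    # doing the work more than once per Houdini session.
    import hou

    if not getattr(hou.session, "_autosave_enabled", False):
        hou.session._autosave_enabled = True  # made-up guard attribute on the session module
        hou.hscript("autosave on")            # enable autosave, as the 456.cmd variant would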
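
A scale/offset sketch for #9 above, using laspy 1.x: the lowercase accessors return the scaled and offset coordinates that the Lidar Import SOP produces, while the uppercase ones return the raw integers. The file name sample.las is a placeholder.

    from laspy.file import File

    with File("sample.las", mode="r") as f:       # placeholder path
        raw_x = f.X                               # raw integer X, no scale or offset
        scaled_x = f.x                            # X with header scale and offset applied
        sx, ox = f.header.scale[0], f.header.offset[0]
        # the two forms are related by: scaled = raw * scale + offset
        assert abs(scaled_x[0] - (raw_x[0] * sx + ox)) < 1e-6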
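
A flattening comparison for #12 above: a small timing sketch of the three ways to flatten an (N, 3) array into 1-D.

    import numpy as np
    import timeit

    a = np.random.rand(1000000, 3)

    # np.concatenate treats the 2D array as a sequence of row arrays and copies them
    print(timeit.timeit(lambda: np.concatenate(a), number=10))
    # np.ravel returns a view when the memory layout allows it, so no copy is made
    print(timeit.timeit(lambda: np.ravel(a), number=10))
    # reshape(-1) is also a view and skips a little of ravel's overhead
    print(timeit.timeit(lambda: a.reshape(-1), number=10))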
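
A classification sketch for #13 above: a minimal Python SOP body that adds only the classification attribute on top of points already loaded by a Lidar Import SOP, using laspy 1.x as in #11. It assumes the upstream SOP preserves the point order of the file and that the .las path is stored in a file_path attribute, as in the earlier code.

    from laspy.file import File

    node = hou.pwd()
    geo = node.geometry()
    file_path = geo.attribValue("file_path")  # attribute holding the .las path (assumed)

    # classification attribute with one value per point, default 0
    geo.addAttrib(hou.attribType.Point, "classification", 0, False, False)
    with File(file_path, mode="r") as f:
        # relies on the Lidar Import SOP keeping the points in file order
        geo.setPointFloatAttribValues("classification", f.Classification.tolist())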
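
A file-handling sketch for #15 above: the two equivalent ways of making sure the laspy file handle is closed; sample.las is a placeholder.

    from laspy.file import File

    # context-manager form (verified to close the file on exit)
    with File("sample.las", mode="r") as fp:
        point_count = len(fp.points)

    # explicit form with the same effect, for code that avoids the with statement
    fp = File("sample.las", mode="r")
    try:
        point_count = len(fp.points)
    finally:
        fp.close()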