
HPC Linux machine with 48 GB of RAM


nuaga chomp


I think I may know the answer to this question but here goes....

I am running the Apprentice version of Houdini on a 48 GB, 8-core Linux machine and it's incredibly slow. For some reason Houdini will only use 1 GB of that 48 GB (it's acting as if it were running on a 32-bit machine).

The catch here is that the GPU is integrated on the motherboard, i.e. this is a blade from a server. Is my lack of a dedicated graphics card the cause of Houdini's poor performance? And is there a permission issue limiting Houdini's access to the 48 GB? Are these two problems together causing the poor performance, or is it mainly a GPU issue?

Any help you could give me would be greatly appreciated.

thanks


Well, dumb question: did you install the 64-bit version of Houdini? :) Besides, how do you know it will only use 1 GB of memory? Have you gotten other apps to use more? Also, how are you measuring that memory size? Note that resident memory size and VM size are two different things. Are you running the memory hscript command in Houdini?
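For what it's worth, here is one way to see both numbers for the running Houdini process from outside the app. This is just a rough sketch that reads Linux's /proc filesystem (pass Houdini's PID on the command line; you can find it with ps or top):

```python
#!/usr/bin/env python
# Sketch: compare the virtual (VmSize) and resident (VmRSS) memory of a
# running process by reading /proc/<pid>/status on Linux.
import sys

def mem_fields(pid):
    """Return the Vm* lines from /proc/<pid>/status as a dict (values in kB)."""
    fields = {}
    with open("/proc/%d/status" % pid) as f:
        for line in f:
            if line.startswith("Vm"):
                key, value = line.split(":", 1)
                fields[key] = int(value.split()[0])  # value is "<n> kB"
    return fields

if __name__ == "__main__":
    pid = int(sys.argv[1])          # Houdini's process id
    fields = mem_fields(pid)
    print("VmSize (virtual):  %.1f MB" % (fields["VmSize"] / 1024.0))
    print("VmRSS  (resident): %.1f MB" % (fields["VmRSS"] / 1024.0))
```

If VmSize keeps growing while VmRSS stalls around 1 GB and everything slows down, the process is being pushed out to swap rather than using the physical RAM.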



Also: this is the first app running on this machine (we will be running some Finite Element Analysis software on it soon as well), so no other applications are running on it at this point. We have a Linux/computer scientist handling the benchmarking, so I'm not sure what software he is using to check the memory size, but this is 48 GB of resident (physical) memory. For some reason Houdini stops at 1 GB and kicks into virtual memory. The whole point of having 48 GB is to never go into virtual memory; the hard drives (we have a terabyte) are just too slow...
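One way to confirm the box really is dipping into swap while Houdini cooks is to watch the system-wide numbers in /proc/meminfo. A minimal sketch, assuming a standard Linux /proc layout:

```python
#!/usr/bin/env python
# Quick check of system-wide memory and swap usage from /proc/meminfo.
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key.strip()] = int(value.split()[0])  # values are in kB
    return info

if __name__ == "__main__":
    m = meminfo()
    for key in ("MemTotal", "MemFree", "SwapTotal", "SwapFree"):
        print("%-10s %10.1f MB" % (key, m[key] / 1024.0))
```

Run it a few times during a heavy cook: if MemFree stays huge while SwapFree drops, something is forcing the Houdini process out to swap long before physical memory is exhausted.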


The virtual memory use after 1 GB would suggest that it's not Houdini-specific; rather, it's an operating-system-level problem.

-Drew
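If it is an OS-level cap, the usual suspect is a per-process resource limit (ulimit) inherited from the login shell. Here's a quick sketch that prints the limits that could pin a process at around 1 GB; run it from the same shell and user you launch Houdini from (assumes Python is installed on the machine):

```python
#!/usr/bin/env python
# Sketch: print the per-process resource limits that could cap memory use.
# Limits are inherited from the parent shell, so run this where Houdini starts.
import resource

LIMITS = [
    ("RLIMIT_AS   (virtual address space)", resource.RLIMIT_AS),
    ("RLIMIT_DATA (data segment)",          resource.RLIMIT_DATA),
    ("RLIMIT_RSS  (resident set size)",     resource.RLIMIT_RSS),
]

def fmt(value):
    if value == resource.RLIM_INFINITY:
        return "unlimited"
    return "%.1f MB" % (value / 2.0 ** 20)

if __name__ == "__main__":
    for name, limit in LIMITS:
        soft, hard = resource.getrlimit(limit)
        print("%-38s soft=%s  hard=%s" % (name, fmt(soft), fmt(hard)))
```

If RLIMIT_AS or RLIMIT_DATA comes back at roughly 1 GB instead of unlimited, that would explain Houdini hitting a wall there no matter how much physical RAM the blade has.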

