Everything posted by rzh0013

  1. Houdini and Infiniband

    Hello Albin, Wow, I can't believe it's been almost 2 years since we had this discussion. Some things have changed: I now have a networked NVMe RAID 0 array of two Samsung 960s, and my secondary machine has been upgraded to a quad-socket Dell R820, which is where the NVMe drives are housed. Testing with CrystalDiskMark, my remote connection from my main PC is getting 2802MB/s reads and 3044MB/s writes. Locally on the R820 I'm getting 5244MB/s and 3785MB/s respectively. I should note that this is on Windows 10 for the desktop and Server 2012 R2 for the R820. So while the speeds are great, I would like to get to the point where it feels as if the disks were local. To that end, I've seen some information about NVMe-oF, so I'll be looking into it. Have you had any experience with it? Cheers! Ryan
  2. Houdini and Infiniband

    Hello all, I was wondering if anyone has any experience working with Infiniband and Houdini. I recently purchased two Voltaire HCA700ex2-q Infiniband cards (40Gb/s) and have them connected via a Mellanox 40Gb-capable QSFP+ fiber cable. I've already upgraded the firmware on both cards to the newest available version, both cards can see each other, and I'm able to set up shared drives and transfer files between them in Windows 7 Professional x64 (I'll be trying this next in CentOS 7). My question is: how do I get the two computers to see each other in HQueue so that I can use the faster Infiniband connection rather than my much slower gigabit Ethernet connection, which is getting fully saturated during distributed fluid sims? Do I need to edit the client information in some way to use the IP from the Infiniband card (the cards are capable of 10Gb/s IPoIB)? (A quick check for this is sketched below, after these posts.) Thanks, Ryan
  3. Houdini and Infiniband

    Hello Albin, I'll have to see about getting some OEM licenses for Windows 10 then to check out SMB3; I'm wondering if this could also be enabled in CentOS 7. The plan is to get several PCIe NVMe drives in RAID 0 and have them be the drive that the project files get copied to when a job is submitted. I'm currently looking at the Intel 750, but am open to suggestions. The test machines I have on either end: Machine 1: Intel 4930K, 64GB Corsair RAM (can't remember the speed), a single OCZ 480GB SSD, GTX 770 Classified, Quadro K5000, MSI GD45-A (the model might not be exactly right, but it's a GD45). Machine 2: dual-socket HP Z600 (each socket has a 4-core Xeon), 24GB RAM, a pair of 240GB SSDs in RAID 0 (one is a Seagate, the other a Kingston now), Quadro K5000, whatever the standard HP motherboard is. Both machines also have a Mellanox ConnectX-2 QDR 40Gb/s Infiniband card (also capable of 10Gb/s IPoIB), and both cards have been flashed with the latest firmware available for them. They are directly linked with a Mellanox 15m QSFP+ fiber optic cable. Thanks for the links to the series, there's a lot of good information there for future projects as well. Cheers! Ryan
  4. Houdini and Infiniband

    Thanks Albin, I actually did figure out the hosts thing a little while back and got it up and running. I meant to update the post but haven't had time, as I'm currently setting up a similar system at work to handle our Octane render nodes. The reason I need the 40Gb pipes is to handle the data going between the GPUs. At work we're getting some PCIe NVMe drives in soon, and I plan on upgrading my home workstations as well sometime in the next month or two. I've been keeping an eye out for the Seagate Nytro-branded drive that's supposedly their response to the HP and Dell PCIe 3.0 x16 drives capable of 10GB/s. As far as ramdisks go, I haven't had a lot of success getting a good transfer rate in Win7, but that could be some sort of weird bottleneck. In fact I'm seeing better transfer and read speeds between my two SSDs, which strikes me as odd: the ramdisk was topping out at around 200MB/s on writes, whereas the SSDs were hitting 700MB/s sustained (a rough way to reproduce this kind of test is sketched below). I'll update this thread with my findings from time to time. I'd love to chat with you about your experiences with this if you have the time. Cheers! Ryan
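
For the HQueue-over-IPoIB question in post 2 (and the hosts-file fix mentioned in post 4), here is a minimal sanity check, assuming hypothetical node names and an assumed 10.10.10.0/24 IPoIB subnet, none of which come from the thread; it just confirms that each HQueue client hostname resolves to an address on the Infiniband subnet rather than the gigabit one:

    # check_ipoib.py - a sketch, not part of the original posts.
    # Hostnames and the subnet below are placeholders; substitute your own.
    import socket
    import ipaddress

    IPOIB_SUBNET = ipaddress.ip_network("10.10.10.0/24")   # assumed IPoIB range
    HQUEUE_NODES = ["workstation1", "z600"]                 # assumed client hostnames

    for host in HQUEUE_NODES:
        addr = ipaddress.ip_address(socket.gethostbyname(host))
        status = "on IPoIB" if addr in IPOIB_SUBNET else "NOT on IPoIB - check the hosts file"
        print(f"{host} -> {addr} ({status})")

If a node still resolves to its Ethernet address, pointing its hostname at the IPoIB IP in each machine's hosts file (as post 4 describes doing) is one way to steer the sim traffic onto the 40Gb link.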
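
For the ramdisk-versus-SSD numbers in post 4, a rough sequential-write test along these lines can reproduce that kind of comparison; the paths, file size, and block size are arbitrary assumptions, not values from the thread:

    # write_bench.py - a sketch of a simple sequential-write test
    import os
    import time

    def write_speed(path, size_mb=1024, block_mb=4):
        """Write size_mb of data in block_mb chunks and return MB/s."""
        block = os.urandom(block_mb * 1024 * 1024)
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(size_mb // block_mb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())   # make sure the data actually reaches the device
        elapsed = time.perf_counter() - start
        os.remove(path)
        return size_mb / elapsed

    # e.g. R: is a ramdisk and S: is an SSD volume (placeholder drive letters)
    for target in [r"R:\bench.tmp", r"S:\bench.tmp"]:
        print(target, f"{write_speed(target):.0f} MB/s")

This only measures large sequential writes, which is roughly what copying cached sim files looks like; CrystalDiskMark (used in post 1) covers the random-access cases as well.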