GPUs- dual 1070 vs dual 1080 vs...


art3mis


Hi

Plan on building a power-user Houdini machine to replace my 4-year-old iMac. My understanding is that GPU-based renderers like Octane and Redshift work very well with dual GPUs, so after a bit of research my first choice is to go with dual NVIDIA 1070s.

Can anyone familiar with GPUs suggest whether 1080s or higher are worth the extra cost?

My budget will be around $2500 US for everything and it has to last me for the next 5 years.

Here is a starting point I am using (ugly case recommendation, but otherwise...):

http://techbuyersguru.com/2500-extreme-4k-gaming-pc-build

 

Edited by eco_bach

One thing to remember about GPU renderers is that they can't combine video texture memory at this time, so two 8 GB 1080s do not give you 16 GB of texture memory. Redshift does employ a technology called out-of-core (OOC) rendering, which falls back to slower system RAM when a single video card's memory runs out, so it can still complete the frame; other GPU render systems simply fail when they run out of VRAM.

I am investigating Redshift and it does look promising, but I have not completed my entire evaluation. I have run into some features which exist in Mantra but are not yet developed in Redshift. The Redshift team has been very helpful on their forum. I recommend trying the demo if you already have Houdini Indie.
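To make the pooling point concrete, here is a toy Python sketch (not Redshift's actual API; the function names and behavior strings are made up) of why a multi-GPU renderer cannot treat two cards' VRAM as one pool:

```python
# Toy model: each GPU must hold the full scene's textures/geometry on its
# own card, so adding cards adds speed, not memory capacity.
def fits_in_vram(scene_gb, cards_gb):
    """True only if EVERY card can hold the whole scene by itself."""
    return all(scene_gb <= card for card in cards_gb)

def render_strategy(scene_gb, cards_gb, out_of_core=False):
    if fits_in_vram(scene_gb, cards_gb):
        return "render fully on GPU"
    if out_of_core:
        # Redshift-style OOC: page data through slower system RAM.
        return "render, paging textures through slower system RAM"
    # Renderers without out-of-core support simply abort.
    return "render fails: scene exceeds VRAM"

print(render_strategy(12, [8, 8]))                    # → render fails: scene exceeds VRAM
print(render_strategy(12, [8, 8], out_of_core=True))  # → render, paging textures through slower system RAM
print(render_strategy(6, [8, 8]))                     # → render fully on GPU
```

Note the 12 GB scene fails on two 8 GB cards even though they total 16 GB — that is the whole point.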

A better use of money might be to put that extra video card money toward an i7-6800K instead of the i7-6700K and bump the system RAM up to 64 GB instead of 32 GB. Having the extra RAM for sims will be nice, and you can always add another video card later if you need to.

There are a lot of 1070 vs. 1080 reviews out there, but basically the 1080 is about 20% faster than the 1070. Compared to my pitiful GTX 660, the 1070 is about 235% faster, so I am eyeing the 1070 as a possible upgrade.

 

But even on my GTX 660 I can render thousands of frames in hours with Redshift, compared to what would take Mantra days.
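The percentages above translate into render times roughly like this (back-of-the-envelope Python; the 60 s/frame baseline is hypothetical, and the speedup figures are just the review numbers quoted above):

```python
# Relative speeds: render time scales inversely with speed.
base = 1.0                     # GTX 660 as the baseline
gtx1070 = base * (1 + 2.35)    # "235% faster" than the 660 -> 3.35x
gtx1080 = gtx1070 * 1.20       # "20% faster" than the 1070 -> 4.02x

frame_time_660 = 60.0          # hypothetical: 60 s per frame on the 660
print(f"1070: {frame_time_660 / gtx1070:.1f} s/frame")  # → 1070: 17.9 s/frame
print(f"1080: {frame_time_660 / gtx1080:.1f} s/frame")  # → 1080: 14.9 s/frame
```

So on these assumed numbers the 1080 saves roughly 3 seconds per frame over the 1070 — noticeable over thousands of frames, but worth weighing against the price difference.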

Edited by Atom

It sounds like you haven't tried a GPU renderer at this point. I wouldn't buy hardware for a GPU renderer unless you're 100% certain it will meet your needs. I'd start with a single card and go from there because you can add another card anytime if you find a GPU renderer is good for your workflow.

Edited by lukeiamyourfather

2 hours ago, eco_bach said:

My budget will be around $2500 US for everything and it has to last me for the next 5 years.

 

I think 5 years is pushing it; after the Pascal range comes Volta:

[Image: NVIDIA 2015–2018 GPU roadmap showing Volta following Pascal]


When doing GPU rendering you will probably want a dedicated GPU driving your viewport, so you can keep using your machine even while rendering.

This is why a lot of people get a smaller card (maybe a 750 Ti) in addition to their rendering cards.

So make sure you can fit at least three cards in your motherboard/case. As Luke said, get only one (expensive) card and take it from there.

Also, plenty of deals are to be had on the 980 Ti; if you are just getting started, it will be sufficient for a few years at least. In particular the hybrid model seems a popular choice, to keep temps down. If you get a 1080, you will be paying top dollar today.

 

 

