
GeForce vs Quadro viewport performance


Vivo3d


How does viewport performance with a GeForce (something like a GTX 980 Ti or a Titan X) compare to a Quadro/FirePro? I saw that Houdini now officially supports certain GeForce cards - does that mean a Titan X is going to perform like a Quadro M6000, and a GTX 980 Ti like an M5000, even with high-res models or animation? Or would something like an M4000 still outperform a Titan X/980 Ti?


I saw that Houdini now officially supports certain GeForce cards - does that mean a Titan X is going to perform like a Quadro M6000, and a GTX 980 Ti like an M5000, even with high-res models or animation?

 

Pretty much, yes. The exception is when the Quadro has a lot more VRAM than the equivalent GeForce and the scene is so large that the GeForce's VRAM can't hold it all - in that case, the GeForce would bog down due to PCI Express traffic while the Quadro wouldn't. But that's probably a rather rare case.
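If you want a rough way to check whether a heavy scene is approaching that VRAM limit, you can poll the card while working. A minimal sketch, assuming an NVIDIA GPU with the standard `nvidia-smi` tool on the PATH (the query flags shown are regular `nvidia-smi` options; the helper names are just for illustration):

```python
import shutil
import subprocess

def parse_memory_line(line):
    """Parse one CSV line from nvidia-smi, e.g. '6144 MiB, 2048 MiB'."""
    total, used = (field.strip().split()[0] for field in line.split(","))
    return {"total_mib": int(total), "used_mib": int(used)}

def query_vram():
    """Return total/used VRAM per GPU, or None if nvidia-smi is unavailable."""
    if shutil.which("nvidia-smi") is None:
        return None
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.total,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_memory_line(line) for line in out.strip().splitlines()]

if __name__ == "__main__":
    print(query_vram())
```

If used VRAM sits near the total while the viewport stutters, the scene is likely being paged over PCI Express, which is the slowdown described above.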

 

The Quadros also offer features that GeForce cards do not, such as 10-bit color, quad-buffered stereo, and ECC. There may also be other driver-level factors that affect performance slightly (fp64, dual DMA engines). But generally most users don't need these, and with the viewport rewrite in H12 and the GL3-only restriction in H14, we've moved away from most Quadro-only accelerated features in favour of modern GL techniques.


Thanks, sounds really good! I'm using a FirePro V7900 with 2GB, so I'm hoping for a big jump. And my Eizos are only FlexScan models, so I won't be using 10-bit anyway.

 

OpenGL is the only thing worrying me a bit - Nvidia cripples driver performance on GeForce to push the Quadros for professional use. But I hope you guys know how to work around that and get us the best performance on GeForce cards as well.


I can't speak for other pro apps that use OpenGL, but at least Houdini shouldn't have any issues with GeForce cards.

 

The only thing to be aware of is that GeForce driver development is driven by games, so new drivers sometimes introduce regressions. You need to be ready to roll back a driver if it breaks your pro apps. That's worth weighing against the savings.
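Since rolling back means knowing which driver you were on, it's worth recording the version that works for you. A small sketch that reads the installed driver version, again assuming `nvidia-smi` is available (`--query-gpu=driver_version` is a standard flag; the function names are illustrative):

```python
import shutil
import subprocess

def parse_driver_version(output):
    """nvidia-smi prints one version string per line, e.g. '355.82'."""
    return output.strip().splitlines()[0].strip()

def current_driver_version():
    """Return the installed NVIDIA driver version, or None if unavailable."""
    if shutil.which("nvidia-smi") is None:
        return None
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_driver_version(out)
```

Logging this once per session makes it easy to see which version a regression arrived with.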


Well, once a driver is working fine, I usually stick with it.

Now that you guys support GeForce cards, is there a particular driver you test against to certify them? If so, any chance for us mortals to find out which version you tested the hardware with?

