Vivo3d, posted November 28, 2015:

What is the viewport performance of a GeForce card (something like a GTX 980 Ti or a Titan X) compared to a Quadro/FirePro? I saw that Houdini now officially supports certain GeForce cards. Does that mean a Titan X will perform like a Quadro M6000, and an M5000 like a GTX 980 Ti, even with high-res models or animation? Or would something like an M4000 still outperform a Titan X/980 Ti?
malexander, posted November 30, 2015:

Quoting Vivo3d: "Does that mean a Titan X is going to perform like a Quadro M6000, and an M5000 like a GTX 980 Ti, even with high-res models or animation?"

Pretty much, yes. Unless the Quadro has a lot more VRAM than the equivalent GeForce and the scene is so large that the GeForce's VRAM isn't enough to hold it all; in that case, the GeForce would bog down due to PCI Express traffic while the Quadro wouldn't. But that's probably a rather rare case.

The Quadros also offer features that GeForces do not, such as 10-bit color, quad-buffered stereo, and ECC memory. There may also be other driver-level factors that affect performance slightly (fp64 throughput, dual DMA engines). But generally most users don't need these, and with the viewport rewrite in H12 and the GL3-only restriction in H14, we've moved away from most Quadro-only accelerated features in favour of modern GL techniques.
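The "does the scene fit in VRAM" question above can be sanity-checked with simple arithmetic. Here is a minimal sketch; the per-point attribute sizes and the 25% headroom figure are illustrative assumptions, not Houdini's actual upload layout:

```python
# Back-of-envelope estimate of whether a scene's geometry fits in GPU VRAM.
# Assumed layout (illustrative only): position (3) + normal (3) + UV (2)
# 32-bit floats per point, i.e. 32 bytes/point.

def geometry_vram_bytes(num_points, floats_per_point=8):
    """Estimated upload size in bytes for num_points points."""
    return num_points * floats_per_point * 4  # 4 bytes per float32

def fits_in_vram(num_points, vram_gb, headroom=0.75):
    """Leave ~25% of VRAM for framebuffers, textures and driver overhead."""
    budget = vram_gb * 1024**3 * headroom
    return geometry_vram_bytes(num_points) <= budget

# Example: 60 million points vs. a 2 GB card and a 6 GB card.
points = 60_000_000
print(fits_in_vram(points, 2))  # False: ~1.92 GB geometry vs ~1.5 GB budget
print(fits_in_vram(points, 6))  # True:  plenty of room on a 6 GB card
```

Once the geometry spills past the budget, every frame pays PCIe transfer costs, which is the bog-down scenario described above.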
Vivo3d, posted November 30, 2015:

Thanks, that sounds really good! I'm using a FirePro V7900 with 2 GB, so I'm hoping for a big jump. And my Eizos are only FlexScan models, so I won't be using 10-bit color anyway. OpenGL is the only thing that worries me a bit: Nvidia cripples driver performance for professional features on GeForce to push the Quadros for professional use. But I hope you guys know how to work around that and get the best performance out of GeForce cards too.
malexander, posted November 30, 2015:

I can't speak for other pro apps that use OpenGL, but Houdini at least shouldn't have any issues with a GeForce. The only thing to be aware of is that GeForce driver development is driven by games, so new drivers sometimes introduce regressions. You do need to be ready to roll back a driver if it completely messes up your pro apps. That's a point to weigh against the savings.
Vivo3d, posted December 1, 2015:

Well, when a driver is working fine, I usually stick with it. Now that you support GeForce cards, is there any particular driver version you test against to certify them? If so, any chance for us mortals to find out which version you tested the hardware with?
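The "stick with a known-good driver" habit above can be automated as a quick check: compare the installed driver version against the one you validated. This is a hypothetical sketch; the pinned version string is a made-up example, not a SideFX-certified number, though the `nvidia-smi --query-gpu=driver_version` query itself is a real NVIDIA tool option:

```python
# Sketch: warn when the installed NVIDIA driver differs from a pinned,
# known-good version. The pinned version here is purely illustrative.
import subprocess

def parse_version(text):
    """Turn a dotted version string like '355.11' into a comparable tuple."""
    return tuple(int(part) for part in text.strip().split("."))

def driver_matches(installed, pinned):
    """True if the installed driver is exactly the validated one."""
    return parse_version(installed) == parse_version(pinned)

def installed_nvidia_driver():
    """Query the running NVIDIA driver (requires nvidia-smi on PATH)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    return out.stdout.splitlines()[0].strip()

# Example with hard-coded strings (no GPU needed):
print(driver_matches("355.11", "355.11"))  # True
print(driver_matches("358.16", "355.11"))  # False
```

Running such a check at login makes an unplanned driver upgrade visible before it surprises you mid-project.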