LaidlawFX Posted January 4, 2011

Just as a warning to all: do not buy a laptop with an integrated Intel graphics card... it will not run Houdini. It may look nice, be lightweight, and all those other wonderful things, but it will not run the best software on the planet.
malexander Posted January 4, 2011

Unfortunately, Intel's support of OpenGL is rather poor (to put it politely). If you take a look here, you can see that the sole OpenGL game in the benchmark, Chronicles of Riddick: Dark Athena, is unplayable even at low detail settings (2.7 fps), and that's with the new and improved Core i7-2000 series HD graphics. Intel's OpenGL driver also does not work well with multiple GL contexts, which Houdini requires.

If you're on Windows, you can set the environment variable HOUDINI_OGL_SOFTWARE = 1 and Houdini will run (somewhat) with Microsoft's aging software renderer. But you're far better off with an Nvidia or ATI card in the laptop. This is essentially why Houdini's system requirements explicitly single out Intel GPUs as unsupported.
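For anyone who lands on this thread later, a minimal sketch of setting that variable from a Windows command prompt (assuming Houdini's bin directory is on your PATH):

    rem Set the override, then launch Houdini from this same prompt
    set HOUDINI_OGL_SOFTWARE=1
    houdini

To make it stick across sessions, you can instead add the line HOUDINI_OGL_SOFTWARE = 1 to your houdini.env file; on Windows that usually lives under %USERPROFILE%\Documents\houdiniX.Y\houdini.env, though the exact path depends on your version, so treat that location as an assumption to verify.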
LaidlawFX Posted January 4, 2011 (edited)

You are my hero... I owe you a beer... maybe a couple of beers. I don't want to use my laptop for any of the heavy lifting; I've got a workstation at home and one at work for that. The laptop is just for reference when I'm writing docs, or for when I get antsy while travelling. The OGL software renderer definitely does some weird dancing when I'm dropping nodes, but it got through a simple series of tests fine. Thanks.

Edited January 4, 2011 by LaidlawFX
kubabuk Posted January 4, 2011

I'm successfully running Houdini on a 4th-gen MacBook (2008). It's perhaps not the fastest machine on earth, but at least it's stable and good for tests. The shaded view does tend to lag a lot, but disabling material shaders in the viewport seems to fix that issue.
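For anyone hunting for that toggle: it should be in the viewport's Display Options dialog (press d with the mouse over the viewport). The exact tab and wording vary between Houdini versions, so take this as a pointer rather than an exact path.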
malexander Posted January 4, 2011

Not sure if you're running a MacBook with the Intel graphics, but ironically Apple's OpenGL drivers for Intel hardware are far better than Intel's own. Sadly, the same isn't true of Apple's drivers for Nvidia and ATI/AMD graphics.