Showing results for tags 'nvidia'.



Found 8 results

  1. Hey, I took advantage of some of the recent GPU price drops that came with the release of Nvidia's 20-series cards. I got an EVGA 1070 Ti FTW2 for $429 (it also has a $20 MIR, dropping the final cost to $409) and put it into the machine that has had an EVGA 1080 Ti FE in it since I built it a year and a half ago. I wanted to share some "real world" test results in case anyone else is wondering whether it's worth picking up another card.

    The PC runs Win10 Pro with 64GB DDR4 RAM, an Intel i7 6850K @ 3.6GHz, a Samsung 960 EVO M.2 SSD as the primary drive plus a secondary Crucial SSD, a 1000W EVGA Gold G3 PSU, Houdini 16.5.268 and Redshift 2.5.48 (I think), etc.

    I ran a Redshift test on a scene with a decent-size pyro sim, rendering 60 frames at full HD with fairly high-quality settings. With just the 1080 Ti in the PC, the render took 38min 17s. With the 1070 Ti added, the 60-frame sequence took 25min 26s, so the second card shaved almost 13 minutes off the sequence render time. I would say it is worth the roughly $400.

    Using the option to enable/disable GPUs in the Redshift plugin settings, I rendered a single frame with each configuration: with just the 1080 Ti - 26 seconds; with just the 1070 Ti - 34s; the 1070 Ti with a little boost to its GPU settings via the EVGA overclocking software - 32 seconds (not enough for me to want to keep it overclocked beyond how it arrived); with both GPUs enabled - 15 seconds.

    I think I would be willing to buy another 1070 Ti while the sale/rebate is going on if it would reduce the render time by a further 13 minutes. I'm assuming it would, but maybe I'm not adding something up right here: if adding one 1070 Ti to the machine cut 13 minutes off the render, wouldn't the addition of another 1070 Ti take another 13 minutes off the render time? It would be incredible to drop the test sequence render time from 38 minutes down to 12 for roughly $800 in hardware upgrades!

    I ran all the PC's parts through an online component checker, and even if I add a third card there should still be about 100 watts of headroom on the 1000W PSU. I would probably want to add some more/better case fans when increasing the GPU count from one to three. Anyway, that's what adding an extra card did for me. E
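A quick way to sanity-check the "another 13 minutes" assumption is to treat each card as adding rendering throughput, rather than subtracting a fixed number of minutes. A minimal sketch, assuming render time is inversely proportional to combined GPU rate (an assumption, not something Redshift guarantees, though it matches the single-frame timings quoted above):

```shell
# Sketch: estimate what a third card (a second 1070 Ti) would buy, assuming
# render time is inversely proportional to the combined GPU throughput.
# This additive-rate model is an assumption, but it fits the single-frame
# timings above: 1/(1/26 + 1/34) = 14.7 s, close to the measured 15 s.
t_1080=2297   # 38m17s: 1080 Ti alone, in seconds
t_both=1526   # 25m26s: 1080 Ti + 1070 Ti, in seconds

awk -v a="$t_1080" -v b="$t_both" 'BEGIN {
    r1070 = 1/b - 1/a            # implied 1070 Ti rate (sequences/second)
    t3    = 1/(1/b + r1070)      # add one more 1070 Ti worth of rate
    printf "three cards: ~%.0f s (%.1f min), saving %.1f min over two\n",
           t3, t3/60, (b - t3)/60
}'
```

Under this model the third card saves only about 6.4 minutes, not another 13: each extra card contributes the same rate, so the absolute minutes saved shrink as the total render gets faster.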
  2. My RTX 2080 experience

    Just a heads up for anyone with a hardware setup similar to mine (Asus Extreme X399 and AMD Threadripper) who also plans on mixing new RTX cards with older ones: you might be spending a lot of time fiddling with your hardware trying to make everything work properly! I have a Titan V which has been rock solid in my box for over a year in both Windows and Ubuntu (it's dual boot), but I spent half the weekend trying to get an RTX 2080 running nicely as a second card in my setup.

    First I kept getting a Code OE error without posting, which I finally resolved by swapping PCIe slots. Then I was getting a repeated "Bad TLP" followed by "Bad DLLP" error after getting past Ubuntu's grub menu; I grabbed the latest RTX-compatible Nvidia driver (411.xx) and booted into Windows to resolve this. Then, while benchmarking, I experienced random power shutdowns more than once. I was able to reboot successfully after waiting a minute or so, but it just added to my frustration.

    I am guessing these issues are related to my BIOS; the X399 is fairly new and may still have a few bugs and compatibility issues with new hardware. But I have decided to return the card and save my pennies towards a second Titan V. YMMV. Also consider watercooling these guys or having really good fans, they get quite warm! (The baseplate on my Titan V feels just warm to the touch under load; the baseplate on the 2080 feels, well... hot!!) I will probably revisit RTX technology again in a year when the next-gen cards come out.
  3. Hi, has anyone seen much information about the latest Nvidia cards coming out at the end of September, in relation to GPU rendering? I have been waiting for the specs on these cards for months, and they have finally been released, but of course all the articles I have seen so far are still somewhat speculative about performance, with "leaked" specs that may or may not be real, and all geared towards gaming.

    I must say, some of these leaked tests aren't too impressive, like 5% performance increases for the new 2080 Ti over the old 1080 Ti, but I would have to assume that's because the software doing the tests isn't taking advantage of the RT and tensor cores. I am disappointed that a $1200 card still only has the same 11GB of RAM as the 1080 Ti; it is faster/newer RAM, but I was hoping for more.

    Have there been any statements made by Redshift or OTOY about what speed improvements will come from having a card with RT and tensor cores? Just wondering, because I will be needing another two graphics cards in the next month, and the 10-series cards have had great price drops recently, with some 1070 Tis as low as $399. If these new flashy RT cores are going to be a huge performance gain, then I will probably hold out for at least the 2070s. Any info would be great. Thanks. E
  4. I'm looking into graphics cards and I'm finding it hard to nail down exactly what to go for in terms of balancing budget with performance. Obviously the 1080 Ti would be better than the 1070, but there is, of course, the massive difference in price. Even within the 1070 model, limited to EVGA, there are still multiple versions to decide between. Is there much wrong with going for the cheapest one? https://ca.pcpartpicker.com/products/video-card/#c=369&m=14&sort=price Thanks everyone.
  5. If anyone has issues after manually installing the Nvidia driver on Linux, the solution (at least for me) is to run the installer again with the --uninstall flag, and then use Software & Updates to reinstall the proper driver.
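For anyone following along, those steps might look like this on the command line. A sketch only: the .run filename here is illustrative (use whichever version you originally installed), and `ubuntu-drivers autoinstall` is one way to restore the distro-packaged driver if you'd rather skip the Software & Updates GUI:

```shell
# Re-run the same .run installer you installed from, in uninstall mode
# (filename is a placeholder - use the version you actually ran):
sudo sh ./NVIDIA-Linux-x86_64-410.78.run --uninstall

# Then reinstall the distro-packaged driver, via Software & Updates or:
sudo ubuntu-drivers autoinstall
```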
  6. The Great Transmutator Contest

    Hello VFX community! We (PopcornFX & Real Time VFX) are organizing a VFX contest! Send in your submission by April 13th to be one of the 3 winners of our sponsored VFX challenge! Huge thanks to our partner #RealTimeVFX and our great sponsors: SideFX Houdini, Substance by Allegorithmic, and NVIDIA. Jury & prizes to be announced shortly, stay tuned! Follow the link and take your chance to win some cool prizes! https://realtimevfx.com/t/competition-the-great-transm…/1841
  7. I've gone ahead and added additional PPAs to get more recent versions of the Nvidia driver: http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu, which gives me version 375.20. Is there anything wrong with getting more recent drivers this way? The default PPA included with my ubuntu-mate install, http://ppa.launchpad.net/nemh/systemback/ubuntu, had a much older version.
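For reference, adding that PPA and installing from it typically looks like the following. A sketch, assuming a stock Ubuntu setup; the package name tracks whatever the PPA currently publishes (nvidia-375 matches the version mentioned above, but check what's available before installing):

```shell
# Add the graphics-drivers PPA and refresh the package lists:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update

# Install the driver package the PPA provides (nvidia-375 at the time of
# this post; `apt-cache search nvidia` lists the current options):
sudo apt-get install nvidia-375
```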
  8. Hi, I plan on building a power-user Houdini machine to replace my 4-year-old iMac. My understanding is that GPU-based renderers like Octane and Redshift work very well with dual GPUs, so after a bit of research my first choice is to go with dual NVIDIA 1070s. Can anyone familiar with GPUs suggest a reason the extra $ for 1080s or higher warrants the cost? My budget will be around $2500 US for everything, and it has to last me for the next 5 years. Here is a starting point I am using (ugly case recommendation, but otherwise...): http://techbuyersguru.com/2500-extreme-4k-gaming-pc-build