Search the Community

Showing results for tags 'gpu'.




Found 37 results

  1. Hey, I took advantage of some of the recent GPU price drops following the release of Nvidia's 20-series cards. I got an EVGA 1070 Ti FTW2 for $429 (with a $20 mail-in rebate bringing the final cost to $409) and added it to the machine that has run an EVGA 1080 Ti FE since I built it a year and a half ago. I wanted to share some real-world test results in case anyone else is wondering whether a second card is worth it. The PC runs Windows 10 Pro with 64 GB DDR4 RAM, an Intel i7-6850K at 3.6 GHz, a Samsung 960 EVO M.2 SSD as the primary drive plus a secondary Crucial SSD, a 1000 W EVGA Gold G3 PSU, Houdini 16.5.268, and Redshift 2.5.48 (I think). I ran a Redshift test on a decent-sized pyro sim, rendering 60 frames at full HD with fairly high-quality settings. With just the 1080 Ti, the render took 38 minutes 17 seconds; with the 1070 Ti added, it took 25 minutes 26 seconds. The second card shaved almost 13 minutes off the sequence render time, so I'd say it was worth the roughly $400. Using the option to enable/disable GPUs in the Redshift plugin settings, I also rendered a single frame with each configuration: the 1080 Ti alone took 26 seconds; the 1070 Ti alone took 34 seconds (32 seconds with a small boost from the EVGA overclocking software, not enough to make me want to keep it overclocked beyond how it arrived); and with both GPUs enabled, 15 seconds. I'd be willing to buy another 1070 Ti while the sale/rebate is on if it would cut the render time by a further 13 minutes. I'm assuming it would, but maybe I'm not adding something up right here: if adding one 1070 Ti cut 13 minutes off the render, wouldn't adding another 1070 Ti take another 13 minutes off? It would be incredible to drop the test sequence render time from 38 minutes to 12 for roughly $800 in hardware upgrades!
I ran all the PC's parts through an online component checker, and even if I add a third card there should still be about 100 W of headroom on the 1000 W PSU. I'd probably want to add some more (or better) case fans when going from one GPU to three. Anyway, that's what adding an extra card did for me. E
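A quick back-of-the-envelope check of the scaling question (my own sketch, using only the timings quoted above): if each card contributes independent throughput, frames per second add across cards and render time is the reciprocal of the summed rates, so each extra card saves less wall-clock time than the last.

```python
# Per-frame benchmark times quoted in the post (seconds per frame).
t_1080ti = 26.0
t_1070ti = 34.0

def combined_frame_time(times):
    # Idealized scaling: throughput (frames/sec) adds across cards,
    # so the combined frame time is the reciprocal of the summed rates.
    return 1.0 / sum(1.0 / t for t in times)

two_cards = combined_frame_time([t_1080ti, t_1070ti])              # ~14.7 s (measured: 15 s)
three_cards = combined_frame_time([t_1080ti, t_1070ti, t_1070ti])  # ~10.3 s

# Scale the measured two-card 60-frame sequence (25 min 26 s) by the
# ratio of the ideal frame times to estimate a three-card sequence.
two_card_seq = 25 * 60 + 26                                # 1526 s, measured
three_card_seq = two_card_seq * three_cards / two_cards    # ~1065 s

print(round(two_cards, 1), round(three_cards, 1), round(three_card_seq / 60, 1))
```

Under this ideal-scaling assumption, a second 1070 Ti would bring the two-card sequence from about 25.4 minutes down to roughly 17.7 minutes: a saving of about 7.7 minutes, not another 13, because the second 1070 Ti adds a fixed amount of throughput to an already faster system.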
  2. Can someone please give me an in-depth explanation of why CPU and GPU sims look different? I've noticed that when making a fire sim, the GPU gives a somewhat more streaky and jagged sim (better licks) than a CPU sim, which tends to be more billowy, with more 'shrooms' or rounder shapes in general. Sadly, the GPU crashes on any large-scale sim, yet I really like the look. How can I get CPU sims to look as streaky and jagged as GPU ones? Any tricks to stop a GPU sim from crashing? Cheers!
  3. Hi, has anyone seen much information about the latest Nvidia cards coming out at the end of September, in relation to GPU rendering? I've been waiting on the specs for these cards for months, and they have finally been released, but of course all the articles I've seen so far are still somewhat speculative about performance, with "leaked" specs that may or may not be real, and all geared toward gaming. I must say some of these leaked tests aren't too impressive, like 5% performance increases for the new RTX 2080 Ti over the old 1080 Ti, but I'd have to assume that's because the benchmark software isn't taking advantage of the RT and tensor cores. I'm disappointed that a $1200 card still has only the same 11 GB of RAM as the 1080 Ti; it's faster/newer RAM, but I was hoping for more. Have Redshift or Otoy made any statements about what speed improvements will come from having a card with RT and tensor cores? Just wondering, because I'll need another 2 graphics cards in the next month, and the 10-series cards have seen great price drops recently, with some 1070 Tis as low as $399. If these new flashy RT cores are going to be a huge performance gain, then I'll probably hold out for at least the 2070s. Any info would be great. Thanks. E
  4. Desktop spec for FX

    Hi everyone, I'm working with our IT department to build a desktop for heavy FX work (mostly water and pyro simulations). Below are three options we're considering; I'm looking for any comments or recommendations on the different configurations. Thank you
  5. Hey all, I started using Redshift a few months ago and so far I like it a lot! I currently have one 1080 Ti 11 GB GPU in the machine I'm using it on. I'd like to get either another one of those cards, since GPU prices seem to have dropped back to something more reasonable, or two 1070 Ti 8 GB cards. I've heard the performance difference is only about a 20% gain for the 1080 over the 1070, so I might be better off with two 1070s. The main question, though: what happens when you hit the render button on your Redshift node for a sequence inside Houdini if you have more than one GPU? If I had 3 GPUs, would it automatically render frame 1 on the first GPU, frame 2 on the second, frame 3 on the third, and so on? Would it render the first frame across all 3 cards simultaneously by breaking up the frame and submitting the pieces to the different cards? Is that even possible? Do the extra GPUs only get used with distributed-rendering software like Deadline, or by launching RS renders from the command line? It seems like you have to launch from the command line to get the other GPUs to render, but I've never worked with a machine with more than one GPU installed. If I were to submit a 100-frame render from within Houdini by clicking the render-to-disk button on my RS node, would it use only the main GPU even with other GPUs installed? Any info from artists using multi-GPU systems would be great. I didn't find a lot of info about this on the Redshift site, but I might not have looked deep enough. The end goal is the ability to kick off, say, 2 sequences to render on 2 of the GPUs while leaving my main GPU free to continue working in Houdini, and, at the end of the work day, to let the main GPU render another sequence so all 3 cards run at the same time. I'll most likely need a bigger PSU for the machine, but that's fine.
I'm running Windows 10 Pro on an i7-6850K hex-core 3.6 GHz CPU with 64 GB RAM in an Asus X99-Deluxe II motherboard, if that helps evaluate the potential of the setup. One last question: SLI bridges? Do you need them when setting up 2 additional GPUs that will only be used for RS rendering? I don't want the extra GPUs combined to increase power in a video game. I don't know much about SLI and when/why it's needed in multi-GPU setups, but I was under the impression it's for combining the cards' power for gaming performance. Thanks for any info. E
  6. OpenCL sim caching with multiple GPUs

    Hi, is there any way to cache my pyro sim in OpenCL across my 4 GPUs (4x GTX 1080)? I couldn't find any preference in Houdini or a topic in search. Thank you.
  7. Removing this post as we got the info needed. Thank you.
  8. I'm looking into graphics cards and finding it hard to nail down exactly what to go for in terms of balancing budget with performance. Obviously the 1080 Ti would be better than the 1070, but there is, of course, a massive difference in price. Even within the 1070 line, limited to EVGA, there are still multiple versions to decide between. Is there much wrong with going for the cheapest one? https://ca.pcpartpicker.com/products/video-card/#c=369&m=14&sort=price Thanks, everyone.
  9. Rendering slow

    I have a geometry with a clay material and a texture, animated over 45 thousand frames. My computer has 2x Xeon E560 at 2.4 GHz, 8x GTX 1080 Ti, and 16x 8 GB RAM. Mantra is not using the GPU for rendering, of course, but CPU usage is low while rendering. Why? What can I do to accelerate rendering? How can I use GPU rendering? Thanks, Martin
  10. I know GPU renderers take advantage of multiple GPUs, but does H16 OpenCL performance also benefit?
  11. Which NVIDIA GPU?

    Hi all, I would very much like to try out some GPU rendering, so I'm looking for an Nvidia GPU. I found an Asus 1080 Ti Poseidon second-hand, but now I want to check with you: what do you think is the best solution? Are any 1080 Ti models better than others? If so, which one would you recommend? Cheers :-)
  12. OpenCL simulations on AMD GPU

    Hi guys, sorry, I can't speak English well. I'm planning to buy new hardware for OpenCL simulations on the GPU. I've studied a little about pro-GPU computing performance and found this on https://en.wikipedia.org/wiki/FLOPS : "Built using commercially available parts. AMD Ryzen 7 1700 CPU combined with AMD Radeon Vega FE cards in CrossFire tops out at over 50 TFLOPS (OMGOD!!!) at just under US$3,000 for the complete system." Is that right? Can two AMD Radeon Vega FE cards in CrossFire be combined with the CPU like that? Does it help speed up simulations in Houdini? Which is more important in Houdini, single-precision or double-precision floating point? Does that depend on whether the software runs as a 32-bit or 64-bit process? https://pro.radeon.com/en-us/product/radeon-vega-frontier-edition/ says: "OpenCL™ 2.0 Support: Enables professionals to tap into the parallel computing power of modern GPUs and multi-core CPUs to accelerate compute-intensive tasks in leading CAD/CAM/CAE and Media & Entertainment applications that support OpenCL." Does Houdini support OpenCL 2.0?
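For context on the precision question: single vs double precision refers to the width of the float values themselves (32-bit vs 64-bit floats), not to whether the application runs as a 32-bit or 64-bit process, and Houdini's OpenCL simulations work in single precision by default, which is also the precision GPU vendors quote their headline TFLOPS figures for. A minimal Python sketch of the difference (Python floats are doubles, so `struct` is used here to emulate 32-bit rounding):

```python
import struct

def f32(x):
    # Round a Python float (double precision) to single precision
    # by packing it into 4 bytes and unpacking it again.
    return struct.unpack('f', struct.pack('f', x))[0]

# 1e-8 is below single-precision resolution near 1.0 (epsilon ~1.2e-7),
# so the increment vanishes in fp32 but survives in fp64.
one_plus_f32 = f32(f32(1.0) + f32(1e-8))
one_plus_f64 = 1.0 + 1e-8

print(one_plus_f32 == 1.0)   # True: the increment was lost in single precision
print(one_plus_f64 == 1.0)   # False: double precision keeps it
```

On consumer GPUs, double-precision throughput is typically a small fraction of the advertised single-precision figure, which is worth keeping in mind when reading TFLOPS comparisons like the one quoted above.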
  13. Hi, I have 8x GTX 1080 Ti and can simulate pyro quickly with OpenCL activated. However, when I render out in Mantra on Linux, it only uses a low percentage of my six dual-core i7 processors. Why? Is it possible to increase this CPU percentage, or to use the GPU for rendering?
  14. OK, so not a whole lot of Houdini love over on the Redshift forums; it's mostly Maya, 3ds Max, C4D (in that order): https://www.redshift3d.com/forums/viewthread/10860/ Hence my posting here. Anyone care to share any Redshift tips, tricks, and tutorials specific to Houdini users? Here is one great tip that Adrian from Redshift was kind enough to share for customizing RS spotlights: https://www.redshift3d.com/forums/viewthread/12283/#79060
  15. Laptop Quadro or GTX, pros & cons

    Hello people! I am buying a laptop workstation to use while traveling for a few months later this year and I don't know if I should buy a laptop with a Quadro or a GTX card in it. What are the pros & cons? (price excluded) Does Houdini run better in general on Quadro cards? Also, feel free to drop some laptop recommendations if you have any! I've got my eyes on a Dell Precision 7710. Best regards, //Simon
  16. R9 Fury X

    Hi everyone, I'm considering replacing my FirePro W8100 and was offered an R9 Fury X, which on paper looks better for single-precision floating-point calculations. However, I'm unsure whether that is what matters for Houdini, or if I should look for something that performs better in double-precision floating point. Budget is very limited, so unfortunately no money for a Titan X or 1080, etc. Would a Fury X be a viable option over the W8100? Cheers
  17. Hi all, is there a way to enable all my GPUs for OpenCL? I have 3x Titan X and a Quadro M4000, four cards in total. Thanks
  18. Hi, please let me know if there is a simple setting in the preferences to enable or switch to GPU/OpenCL. I'm not sure whether the viewport uses the GPU; I think it does. I'd also like to know whether a DOP node can be switched directly to use the GPU for its calculations. Can Mantra and Arnold use the GPU? Please let me know the specific settings. Thanks again.
  19. Hey, before messing around inside my PC, I thought I'd ask here: I have an old GTX 580 sitting around and a GTX 980 in my PC. Could I use the 580 to help with GPU rendering? (I'm testing Redshift.) Are there steps to follow to get it working as dual-GPU rendering? Is the 580 too old? Is it worth it?
  20. Hi everyone, when I simulate pyro, fluids, or grains in Houdini, I'm disappointed by the simulation and cache times. :/ For this reason I'm asking how to build a PC for Houdini dynamics, but I don't know the system requirements. The processor could be AMD or Intel. Max budget is $3000 to $3500. What would you suggest for the system: CPU, motherboard, minimum RAM, minimum GPU, minimum core count? I'd be grateful for your help.
  21. Hi, I just upgraded to Octane Render 3 for Houdini. I still need to buy a GTX 1070 for my GPU expansion board, and then I'm up and running with GPU rendering in Houdini. My question: I really like Octane Render for its image quality and simplicity. Why aren't there more character animations done with Octane Render, and why do the Redshift3D guys seem to push their software much more than Otoy does? Can someone with experience in both render engines tell me the pros and cons of Redshift3D and Octane Render? My humble assessment is that Octane Render is much simpler to set up and gives better, Maxwell Render-style image quality, while Redshift3D seems more like a V-Ray-on-GPU type solution, with a lot of tweaking required to get decent results. I haven't bought Redshift3D yet, but I'm planning to, perhaps, in the future; they seem to push their marketing better. It would be nice to have some preset Octane materials in Houdini, although from Rohan Dalvi's videos the shader system doesn't seem too difficult. Comments, please, on the comparison between Redshift3D and Octane, and the dos and don'ts of using high-res textures in VRAM. Why is Otoy not pushing their software more toward character animation? Thanks in advance.
  22. Hi, I plan on building a power-user Houdini machine to replace my 4-year-old iMac. My understanding is that GPU-based renderers like Octane and Redshift work very well with dual GPUs, so after a bit of research my first choice is dual Nvidia 1070s. Can anyone familiar with GPUs suggest a reason the extra money for 1080s or higher warrants the cost? My budget is around $2500 US for everything, and it has to last me the next 5 years. Here is the starting point I'm using (ugly case recommendation, but otherwise...): http://techbuyersguru.com/2500-extreme-4k-gaming-pc-build
  23. As much as I'm sold on the improved performance of Octane and Redshift (assuming you have a decent GPU), I'm highly doubtful they can match the 'high realism' of renders like this one from Maxwell: http://www.tonifresnedo.com/maxwell-render/rendl-rossy-lamp/#/gallery_2663/2 Any experienced Redshift or Octane users care to chime in?
  24. This operator allows you to run an OpenCL kernel as part of your SOP network. Depending on the GPU, some operators can be orders of magnitude faster than even VEX; in this case, the OpenCL code is 144 times faster than VEX on a GTX 970.
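For readers who haven't used the node, here is a minimal sketch of the kind of kernel the OpenCL SOP runs. The kernel name, the displacement math, and the assumption that the point position attribute P is bound as a writeable float array (with its length passed alongside) are illustrative; the actual parameter list is determined by the node's Bindings tab.

```c
// Sketch of an OpenCL SOP kernel: displace each point along Y by a
// sine wave of its X position. One work item runs per point.
kernel void displace(
    int P_length,       // number of points, supplied by the binding
    global float *P )   // point positions, packed as x,y,z triples
{
    int idx = get_global_id(0);
    if (idx >= P_length)
        return;                          // guard against padded launch sizes

    float3 pos = vload3(idx, P);         // read this point's position
    pos.y += 0.1f * sin(10.0f * pos.x);  // hypothetical displacement
    vstore3(pos, idx, P);                // write it back
}
```

Because the kernel body is plain OpenCL C, anything expressible as an independent per-point (or per-voxel) computation like this is a candidate for the speedups described above.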
  25. GDDR5 or DDR3 GPU?

    Hi folks, I need a suggestion: should I purchase the GDDR5 1 GB or the DDR3 2 GB Radeon R7 GPU card? Thanks in advance.