GTX Titan or Radeon r9 Graphics Cards?

My current setup is an 8-core FX-8150 overclocked to 5GHz, 16GB RAM, and an Nvidia GTX 570. I'm debating upgrading my graphics card to either the Nvidia GTX Titan or the Radeon R9 290X. I'm leaning toward the GTX, simply because it will also accelerate Blender's Cycles renderer. Will either of these cards work with HitFilm? I don't see them on the supported cards list.
Thanks :-)

Comments

  • MichaelJames
    MichaelJames Posts: 2,034 Enthusiast
Neither. I'd say get the GTX 780 SC; you get Titan performance at a lower cost. Or the 780 Ti. I'd rather SLI two 780 SCs. I can run 2.5K raw footage on my system at up to 29 fps in DaVinci Resolve, thanks to my GPU. If I got a second GTX 780 I'd probably be crushing raw. If you have the extra couple hundred, put it towards a second GPU.
  • Robin
    Robin Posts: 1,669 Enthusiast
I'd rather SLI two 780 SCs. I can run 2.5K raw footage on my system at up to 29 fps in DaVinci Resolve, thanks to my GPU. If I got a second GTX 780 I'd probably be crushing raw. If you have the extra couple hundred, put it towards a second GPU.

    I'd be careful with that. Judging from these three threads (whoa, alliteration...), an SLI setup with more than one graphics card will certainly boost performance in optimized games, but the difference for video editing would be quite negligible, as HitFilm can't take advantage of it. Not sure about other video applications, though.

  • LandonParks
    LandonParks Posts: 22
    edited November 2013
An SLI configuration is out, mainly because for my uses (rendering, playback acceleration, etc.) it has no advantage and can even crash some programs. The reason I'm leaning toward the GTX Titan is that it's the largest single-GPU card, with the most memory and over 2,000 CUDA cores. Many programs (including HitFilm) will not work with dual GPUs, so I'm also trying to avoid cards that put two GPUs on the same board. I know that a lot of performance comes down to the amount of VRAM on your GPU, and the Titan has 6GB of it. I just don't know if that GPU is supported by HitFilm, as it's not listed. I know some programs have issues with the newest Kepler GPUs, so I wanted to make sure that HitFilm can take advantage of the card. While some of the ATI/AMD cards are nice as well, I like the idea of sticking with Nvidia because of CUDA, which is used by After Effects' ray tracing and Blender's Cycles render engine.
  • MichaelJames
    MichaelJames Posts: 2,034 Enthusiast
@Robin, I never said HitFilm... and I did cite the program that would work with it. I do see your point about specifying that it won't do anything for HitFilm. DaVinci Resolve does not work with SLI, but it will utilize a second GPU for extra processing power.
I'm curious: can that be done for HitFilm? Not SLI, but multiple GPUs? I know AE doesn't support that, but isn't Resolve very similar to an NLE, just with more focus and tools for grading? I'll ask this in the correct section.

    From Resolve's configuration guide.
    "A certified WIndows system with a current model GPU and 3gb or more of GPU RAM can play UHD and 4k-DCI resolution images using a HD resolution timeline and displayed on a HD monitor.  After you have completed your grading simply change the timelinie resolution for a UHD or 4k-DCI Render."
    "Davinci Resolve with 2 gpus is better for work in sd, HD and 2k in realtime and provides twice the grading performance of the single dedicated GPU configuration.  This Resolve configuration also supports HD stereoscopic grading.  With the Decklink 4k extreme video card you can grade and monitor in UHD/4k but depending on your GPUs playback may not be real time.  WIth the faster GPUs listed below you can use UHD/4k timeline and monitor in UHD/4k in real time.
    Some of the image processing operations like temporal noise reduction need high gpu ram and so these operations might need high GPU RAM and so these operations might not be available or operations could be restricted on GPUs with limited RAM so consider this when selecting your GPU....
    Davinci Resolve 10 for Windows will use CUDA on Nvidia hardware and OpenCL on AMD/ATI GPUs.  However devices running both Nvidia and ATI hardware simultaneously is not supported on windows Resolve systems."
  • MichaelJames
    MichaelJames Posts: 2,034 Enthusiast

    A Titan has 2688 CUDA cores and the 780 has 2304. The 780 has 3GB of VRAM vs. the Titan's 6GB.
    According to Tom's Hardware, there is about a 10% performance difference between the two cards, but the 780 can easily be overclocked to nearly match a stock GTX Titan, for a $350 price difference.

    I went with the 780 SC because a 10% stock performance difference is not worth $350 to me.
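The spec-sheet gap and the measured gap differ, as the post notes. A quick calculation with the figures quoted above (core counts, the ~10% Tom's Hardware number, and the $350 price difference) makes the trade-off concrete; the numbers are purely illustrative:

```python
# Spec-sheet vs. real-world gap between the GTX Titan and GTX 780,
# using the figures quoted in the post above.

titan_cores, gtx780_cores = 2688, 2304

core_advantage = (titan_cores / gtx780_cores - 1) * 100  # percent more cores
measured_advantage = 10.0   # percent, per Tom's Hardware (stock clocks)
price_premium = 350.0       # USD difference at the time

print(f"Extra CUDA cores:     {core_advantage:.1f}%")
print(f"Measured performance: {measured_advantage:.1f}%")
print(f"Cost per measured %:  ${price_premium / measured_advantage:.0f}")
```

So roughly 17% more cores buys only about 10% more stock performance, at about $35 per percentage point, which is the value argument for the 780 made above.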

  • LandonParks
    LandonParks Posts: 22
    edited November 2013
Hi Michael. I never thought to look at the cards in the 570-580 range. I know they can be more affordable than the Titan. What really bugs me is that most programs that render can't use two GPUs; I wish that could be worked out. I know that Resolve can use two GPUs. I use Resolve to grade the BMCC footage from my camera, and it came with a full license. Frankly, though, I have never run into any hiccups in Resolve with my current configuration, so I'm not sure what situation would warrant dual GPUs for that program.
    *Edit: I just looked at the prices for the 780s. $450-$500 is not bad at all when compared to the performance difference. I'm beginning to think a $1,000 Titan may not be the best way to go.
  • MichaelJames
    MichaelJames Posts: 2,034 Enthusiast
    edited November 2013

    I got the 780 when it cost about $780 earlier this year. Sure, most programs may not need it, but if you can have two 780 SCs in your case for the price of one Titan, you can overclock both 780s to each be as fast as a Titan. That may not help you with HitFilm and most NLEs, but if you have a grand to burn, why not get the most out of it? That's cool that you have a BMCC too. I'm still trying to learn it and get the required accessories to film. Any tips for filming outside and possibly showing the sun? Down the line, in major purchases after paying off my BMCC, I plan to get a Thunderbolt-equipped laptop (PC, of course). I have a Thunderbolt-equipped PC, so for green screen projects at home or in a controlled studio setup I'd have no problem lugging my computer in, but it would be nice to have an affordable laptop to do everything on.
    I just got one Tiffen .9 ND filter, and I'm trying to see how many I'd need to shoot with a wide-open aperture and still have a reasonable depth of field outside.
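The ND question can be sanity-checked with simple exposure math: a 0.9 optical density cuts about 3 stops (10^0.9 ≈ 2^3 = 8x the light), and the Sunny 16 rule says full sun calls for roughly f/16 at a shutter speed of 1/ISO. A minimal sketch under those assumptions (it ignores shutter and ISO adjustments):

```python
import math

# How many 0.9 (3-stop) ND filters are needed to open up the aperture
# in full sun, assuming the Sunny 16 rule as the baseline exposure?

def stops_between(f_wide, f_sunny=16.0):
    """Exposure difference in stops between two apertures."""
    return 2 * math.log2(f_sunny / f_wide)

def nd_filters_needed(f_wide, stops_per_filter=3):
    """Number of 0.9 ND filters to stack (rounded up)."""
    return math.ceil(stops_between(f_wide) / stops_per_filter)

for f in (2.8, 4.0, 5.6):
    print(f"f/{f}: {stops_between(f):.1f} stops over Sunny 16 "
          f"-> {nd_filters_needed(f)} x 0.9 ND")
```

By this estimate, shooting near f/2.8 in full sun needs roughly 5 stops of ND, i.e. two stacked 0.9 filters (or a single 1.8).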

  • LandonParks
    LandonParks Posts: 22
    I'm still playing with the camera myself. I haven't had it that long, and I have yet to really shoot anything real with it. I did a test of the dynamic range, and I found that it holds up 10x better than my GH2 in bright sunlight. Being a novice myself, I can't really offer much advice, other than that you might still need some NDs if you're shooting in bright sunlight. A matte box helps loads too.
    I actually bought the camera to use in the film I'm shooting in July, and wanted to have plenty of time to play around with it before then. The film is 100% green screen studio, so many of the issues that have arisen with the camera will not affect the shoot (battery life, portability, etc.). For outside (general purpose shots), I'd probably still use my GH2 for its shallow DOF and more "filmic"-looking image. The BMCC is great for studio or green screen work; I just don't know how practical it'll be in the field yet.
  • MichaelJames
    MichaelJames Posts: 2,034 Enthusiast
    I've shot outside with the BMCC for my timelapse... it let me take the blown-out sky from white back to detailed color and clouds. It was just annoying that the sun had the black dot from clipping.
    I'm filming outside with no battery solution or secondary SSD (budgetary reasons), so I'll be running my extension cord, and I have something to hook my SSD up to a PC over USB 3.0 to offload the footage. (I'll also be filming on my property.) Not ideal, to say the least, but I actually just mapped out my shots and took photos, so that should help cut down on the need for endless takes and guessing how you want your shot composed.
  • Masqutti
    Masqutti Posts: 340
    This is a rather complex subject and depends heavily on which software you need your power in. As said, HitFilm and AE can't utilize dual GPUs, but on the other hand, much 3D software can.
    And then there's CUDA vs. OpenCL. If you use Blender a lot, OpenCL support is not fully implemented, so stick with Nvidia for a while. If not, some 3D software already supports OpenCL, and then AMD is your choice.
    Then, with Nvidia: compute performance has been held back since the 500 series. That means the GTX 570, 580 and 590 are your best price/performance ratios. With dual PCI-E slots, you can get dual GTX 590s in the 3GB versions, and those have dual chips on board, so it's basically quad-SLI. And that beats almost everything hands down, except the Titan, I believe!

    If you plan to use AMD, there is the R9 290X, which will kick the Titan's *** and costs only $600. I don't know when it's being released, though. If you REALLY need CUDA, go with the Titan or dual GTX 590s...
    I hope Blender supports OpenCL in the *near* future so the Nvidia noose loosens..
  • MichaelJames
    MichaelJames Posts: 2,034 Enthusiast
    As someone who transitioned from the GTX 580 3GB Classified to the GTX 780 SC... the fact that it draws significantly less power is very nice.
    Here is a breakdown of the GTX 780 vs. the GTX 590. I was always hearing that the 590 didn't work super well and was not really worth the money. The 590 has twice as many render output processors, a much wider memory bus and higher memory bandwidth.

    In a comparison of the 780 vs. the 780 Ti, it's a closer call, with the 780 Ti winning, but it's not a savage beating like with the 590.
    Here is my card, the 780 SC, vs. a Titan.

    In the quest for the best GPU, you need to decide where performance meets money. You have an 8-core processor overclocked to 5GHz, so your PC already has some money invested and some strong legs to walk around on. Some of the spec differences are nice, but when you look at real-world performance, it's very close. Pick your favorite/most-used program and find out the performance difference between cards.
    If it's a program that only works with one GPU, and you are most likely going to overclock your GPU anyway, what is the difference between the overclocked performance levels of the cards? If it's a 10% difference, is that worth $300-$500 more? A 90% on a test may not be as impressive as a 100%, but at the end of the day it's the same grade. If your favorite/most-used program allows for two cards, then that changes the ballpark. Not every program will use two cards, true, but if the program you use the most, or the one most critical to you, does, then why not gear for that? If you have a grand plus tax ready to go and your most beloved program will take two cards, why not go for that?
    When I got my 580 Classified I was satisfied and skipped the 600 series. I'll probably get another 780 SC and skip the next generation, and most of the following generation, until the GTX 980 Ti is out.
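The decision rule described above can be phrased as a tiny sketch: for a single-GPU program, compare value per dollar among single cards; for a program that scales to two GPUs (like Resolve), compare total performance within your budget. All performance and price numbers below are placeholders, not benchmarks:

```python
# Sketch of the GPU-buying decision described above.
options = [
    # (name, relative performance, price in USD) -- illustrative only
    ("GTX Titan (stock)",            1.10, 1000),
    ("GTX 780 SC (overclocked)",     1.08,  650),
    ("2x GTX 780 SC (dual-GPU app)", 1.90, 1300),
]

def best_value(opts):
    """Most relative performance per dollar."""
    return max(opts, key=lambda o: o[1] / o[2])

def most_performance(opts, budget):
    """Highest absolute performance within the budget."""
    return max((o for o in opts if o[2] <= budget), key=lambda o: o[1])

print("Best value:", best_value(options)[0])
print("Most performance under $1300:", most_performance(options, 1300)[0])
```

With these placeholder figures, the overclocked 780 wins on value for a single-GPU program, while the pair of 780s wins outright once the program can actually use both cards, which mirrors the reasoning in the post.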
  • Masqutti
    Masqutti Posts: 340
    Yeah, the Titan is overpriced, no doubt about that.
    Mantle (for AMD) and OpenCL (for AMD and Nvidia) are hopefully technologies that will shake Nvidia's and Intel's market domination. They should be. Compared to now, we'll get cheaper and more powerful computers in two years, I hope! :)