Daz: RTX 4060 Ti or RTX 4070?

rewrewrwe

Newbie
May 31, 2017
46
105
Hi all, new dev here and I'm wondering about these two cards. I know Nvidia is a cancer of a company, but we have to use them, so are these cards any good? How fast can they render? The 4070 is a lot more expensive in my country, so I'm wondering if the difference is really worth it. I'm talking about ordinary VN scenes set in one room with three people max and decent optimization. The only benchmarks I've found are for gaming and Blender, but literally nothing for Daz. I also have 16GB of RAM.
 

n00bi

Active Member
Nov 24, 2022
518
606
Hi.

Check out this link from Nvidia themselves.

Scroll down until you see the 4060 series in the list.

And I would say: don't buy the 4060 if you have a bit more money to spend.
The number of CUDA cores is what matters if you want to do GPU rendering.

The 4070 is a much better choice.
The 4070 Super, a better choice still.
The 4070 Ti Super, better yet.

Also, Nvidia is launching the 50xx series soon,
so you might want to wait and hope the 40xx series drops in price.
Or you might be unlucky and not find a 40xx at all, because everyone bought them up since the 50xx will cost your soul.

I recently bought a 4070 Ti Super; it set me back about $1,050. :cautious:
Damn Nvidia and their prices, it's making me depressed...

If you really need to go for a 4060, at least try to find a Ti version, as it has 1,000+ more CUDA cores.
You might also want to add at least another 16GB of RAM; 16GB is on the low side.
 
Last edited:
  • Like
Reactions: rewrewrwe

Vollezar

Gingers are love. Gingers are life.
Game Developer
Aug 14, 2020
1,339
7,411
The 4060 Ti, if it's the 16GB VRAM version. Memory is far more important. I used a 4060 Ti 16GB. It made rendering scenes with a lot of characters and/or assets much easier. I didn't have to render some scenes in two or three parts because of low memory, as I had to do with the 3060 Ti before I got the 4060 Ti.

The 4070 won't render a whole lot faster than the 4060 Ti, and if you go over its 12GB of memory it will effectively be slower, because now you might have to render in two parts and then combine those images. And setting up the lighting for those kinds of renders can be a massive pain in the ass.
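
For anyone new to that workflow: "rendering in parts" just means rendering the environment and the character batches as separate images (the character passes as transparent PNGs from the same camera) and stacking them afterwards. Here's a minimal sketch of the compositing step in Python with Pillow; the filenames are made-up placeholders:

```python
# Stack a transparent character render over a background render.
# Both images must come from the same camera at the same resolution.
# Filenames are hypothetical examples, not anything Daz produces by default.
from PIL import Image

background = Image.open("room_only.png").convert("RGBA")
characters = Image.open("characters_only.png").convert("RGBA")  # alpha PNG

# alpha_composite respects the character pass's transparency.
final = Image.alpha_composite(background, characters)
final.convert("RGB").save("final_scene.png")
```

In practice most people do this in Photoshop or GIMP instead; the pain mentioned above is getting the lighting to match across the partial renders, not the stacking itself.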
 

MissFortune

I Was Once, Possibly, Maybe, Perhaps… A Harem King
Respected User
Game Developer
Aug 17, 2019
5,828
9,299
Nvidia isn't a cancer company, imo. Yeah, their prices can suck. But the fact of the matter is that they're the only ones who actually care about the hobbyist/semi-professional crowd that won't or can't spend $5,000 on an A6000 or something of the sort. Ever notice how Nvidia has support in every single meaningful piece of software? Adobe, Blender, Daz, Maya/3DS, etc. Anything that uses the GPU prefers Nvidia. That kind of support isn't happening for AMD or Intel; those two seem convinced the only people buying GPUs are gamers, which just isn't true. I'd argue a majority of high-end card sales go to hobbyists, and Nvidia's wide support appeals to them.

VRAM is king with Daz, though, same as for any other rendering software like Blender. The aforementioned 4060 Ti with 16GB of VRAM is a solid option if you're set on buying new and don't have the income for more.

That said, if you're patient, can save a bit more money, and don't mind buying used, I'd very much recommend waiting until the 50 series comes out. Prices on modern cards will start dropping, especially the 30 series. You can already find used 3090s for $500-$700, probably even cheaper locally, and I imagine prices drop pretty heavily once the 50 series actually ships. That, or they go up because nobody can get their hands on a 50 series card. That's not gospel, especially since it sounds like you're outside the US; $700 USD might very well be $5,000 in your currency.

Something to think about is power in your current system. The aforementioned RAM (system memory, not VRAM) should be bumped up to 32GB, bare minimum; your system RAM should ideally be double your VRAM. Depending on the GPU you end up with, your PSU might also need a look. Check the max wattage of everything in your system and see if you have the headroom for an upgraded GPU. A used 750W or 1000W unit should be fairly cheap, though.
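
A quick back-of-the-envelope version of that headroom check. The part wattages below are illustrative placeholders only; pull the real numbers from your own components' spec sheets:

```python
# Rough PSU headroom estimate. All wattages are example values,
# not measurements; substitute your own parts' rated peak draw.
parts = {
    "CPU (peak)": 150,
    "RTX 4060 Ti (peak)": 165,
    "Board, RAM, SSDs, fans": 75,
}
total_draw = sum(parts.values())
psu_rating = 750

# Common rule of thumb: keep sustained load under ~80% of the PSU rating.
print(f"Estimated peak draw: {total_draw} W / {psu_rating} W "
      f"({total_draw / psu_rating:.0%} of capacity)")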
 

Turning Tricks

Rendering Fantasies
Game Developer
Apr 9, 2022
1,519
2,807
I agree with the others above: (1) VRAM is king, and (2) your system RAM needs to be at least 32GB.

I too have a 4060 Ti 16GB, and it's dropped to around $450 USD now for the MSI Ventus model. It works great and only draws 165 watts at max, which it will never reach using Daz alone. Daz actually doesn't stress GPUs much; it just eats VRAM like candy. Gaming is about the only thing I do that can max my card out.

The 16GB of VRAM will handle almost anything you might want to do right now. I've only maxed it out a couple of times, and today, when it happened during an image series of renders, I found out it was because of one of the weird memory leaks Daz has. I couldn't figure out why my image series kept terminating midway through the timeline, so I watched Task Manager and saw my GPU using over 15GB, which it shouldn't have for that scene. I deleted a few environment sections not visible in the renders, restarted Daz, and my usage dropped back down to just over 6GB. Daz is twitchy, and you have to remember to restart it at times to clear these leaks.
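
If you'd rather not babysit Task Manager, you can poll the driver's own tool instead. A rough sketch that watches VRAM use via nvidia-smi (the query flags are standard nvidia-smi options that ship with the driver; the loop itself is just an illustration):

```python
# Poll VRAM usage every few seconds while a render runs, to spot
# the kind of creeping memory leak described above.
import subprocess
import time

while True:
    used_mib = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    print(f"VRAM in use: {used_mib} MiB")
    time.sleep(5)
```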
 
  • Like
Reactions: Vollezar

n00bi

Active Member
Nov 24, 2022
518
606
I agree with the others above: (1) VRAM is king
If VRAM is the king, then CUDA cores are the queen.
Yes, VRAM is nice, but so is the number of cores as well, IF...

...you want to use other rendering engines besides Iray.
For example, Octane Render will use CUDA cores to calculate GI, AO, volumes, etc., speeding up the render a bit if you use those sorts of lighting effects.
Same with Redshift: it will also use CUDA cores for GI, AO, etc., and RT cores for path tracing and ray tracing, speeding up light calculations.
I don't think there is a Redshift kit for Daz3D, but I believe there is an Octane rendering kit for Daz3D.
But Octane is of course not free.
Also, with these rendering engines you need to change a whole lot of things in regard to the materials and lights you use.
Iray doesn't really use the CUDA cores efficiently.
 

MissFortune

I Was Once, Possibly, Maybe, Perhaps… A Harem King
Respected User
Game Developer
Aug 17, 2019
5,828
9,299
If VRAM is the king, then CUDA cores are the queen.
Yes, VRAM is nice, but so is the number of cores as well, IF...

...you want to use other rendering engines besides Iray.
For example, Octane Render will use CUDA cores to calculate GI, AO, volumes, etc., speeding up the render a bit if you use those sorts of lighting effects.
Same with Redshift: it will also use CUDA cores for GI, AO, etc., and RT cores for path tracing and ray tracing, speeding up light calculations.
I don't think there is a Redshift kit for Daz3D, but I believe there is an Octane rendering kit for Daz3D.
But Octane is of course not free.
Also, with these rendering engines you need to change a whole lot of things in regard to the materials and lights you use.
Iray doesn't really use the CUDA cores efficiently.
I don't think Octane supports Daz anymore either way, but your point still stands.

I think the biggest difference is that CUDA cores are meaningless if you're running out of VRAM. But if you have enough VRAM, then the CUDA cores will get used. I'd think that much generally applies to most software packages. VRAM is, and should be, the priority when it comes to 3D rendering, imo.
 
  • Like
Reactions: Vollezar

n00bi

Active Member
Nov 24, 2022
518
606
I think the biggest difference is that CUDA cores are meaningless if you're running out of VRAM
Yeah, indeed. All the cores in the world won't speed things up if you don't have the RAM to hold all the data for the calculations.
Sadly, on PCs we can't assign system memory as VRAM the way unified memory works on a Mac.
 

Turning Tricks

Rendering Fantasies
Game Developer
Apr 9, 2022
1,519
2,807
Yeah, indeed. All the cores in the world won't speed things up if you don't have the RAM to hold all the data for the calculations.
Sadly, on PCs we can't assign system memory as VRAM the way unified memory works on a Mac.
Another thing about CUDA cores is that Daz really doesn't use a lot of the GPU's processing power. I was surprised when I saw how little my 4060 Ti was using.

Take right this moment, for example. I'm cooking some special 4K renders for my subscribers. I do them in UHD (3840 x 2160), and the one going right now is only using 6% of my Nvidia card's processing power and 6.5GB of VRAM.

6%! That's unreal.

But then I load up World of Warships for some fun and I'm getting 175 fps... haha!


GPU_snip.JPG
 

n00bi

Active Member
Nov 24, 2022
518
606
Another thing about CUDA cores is that Daz really doesn't use a lot of the GPU's processing power.
That is not entirely true. In your image, click the arrow under "3D", "Copy", or "Video Encode..." to set one of the graphs to display the CUDA cores.

Fire up Daz with the standard "Tutorial Starter Scene" and do a render and watch.
You will see it uses all of your CUDA cores, but as I said, Iray is not as efficient as Octane or Redshift.
IIRC, Iray uses a mix of GPU and CPU pipelines.
In Redshift you can do something like this too; it's called hybrid rendering, but that is something you don't touch, as it slows things down. :p
I'm not sure about Octane, as I've never used it.
Sad that they dropped support for Daz3D, though.

Daz3DTest.jpg


Playing Indiana Jones at max settings ("Supreme"), locked @ 60fps.
I don't care about hundreds of fps; I want a fixed rate, capped at some frequency (24, 30, 60, 120, etc.). I'm not a big fan of the fps being all over the place. :)
But damn, it's VRAM hungry. The game is silky smooth, though.

IndianaJones.jpg

But as you can see, the CUDA cores mean nothing for gaming.

Also, if you are doing other stuff as well, like CV or running a local AI, you will benefit from having a lot of CUDA cores.

explorer_mz94IaM2i4.gif
 

Turning Tricks

Rendering Fantasies
Game Developer
Apr 9, 2022
1,519
2,807
That is not entirely true. In your image, click the arrow under "3D", "Copy", or "Video Encode..." to set one of the graphs to display the CUDA cores.

Fire up Daz with the standard "Tutorial Starter Scene" and do a render and watch.
You will see it uses all of your CUDA cores, but as I said, Iray is not as efficient as Octane or Redshift.
IIRC, Iray uses a mix of GPU and CPU pipelines.
In Redshift you can do something like this too; it's called hybrid rendering, but that is something you don't touch, as it slows things down. :p
I'm not sure about Octane, as I've never used it.
Sad that they dropped support for Daz3D, though.

View attachment 4375492


Playing Indiana Jones at max settings ("Supreme"), locked @ 60fps.
I don't care about hundreds of fps; I want a fixed rate, capped at some frequency (24, 30, 60, 120, etc.). I'm not a big fan of the fps being all over the place. :)
But damn, it's VRAM hungry. The game is silky smooth, though.

View attachment 4375511

But as you can see, the CUDA cores mean nothing for gaming.

Also, if you are doing other stuff as well, like CV or running a local AI, you will benefit from having a lot of CUDA cores.

View attachment 4375534
That's interesting. I don't game as much as I used to ages ago, so I stopped worrying about stats and just noticed that the new GPU runs my older games amazingly well.

But my point still stands... CUDA cores are, of course, the main speed determinant for Iray renders. But the difference between a top-of-the-line 4090 with 24GB of VRAM and my 4060 Ti is not nearly as much as most people think. I know this because I used F95's Iray server for many months, so I know how long my renders took on it versus on my own rig now, and the difference is fairly modest. I don't have benchmarks to quote, but generally speaking, I used to be able to do a 4K full-frame render to 6,000 iterations on the Iray server in under 30 minutes, while my 4060 Ti can do that in about 50 minutes.

That's purely anecdotal, but still valid. And for smaller images that are less GPU-intensive, the time difference gets even smaller. I can rip through a 1080p render to 10K iterations in under 10 minutes, although most of my renders only need 5-6K for best results.
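
Putting rough numbers on that "fairly modest" difference, using just the iteration counts and times quoted above:

```python
# Iterations per minute, from the render times quoted above.
iray_server = 6000 / 30   # ~200 iterations/min (F95's Iray server)
rtx_4060_ti = 6000 / 50   # ~120 iterations/min (local 4060 Ti 16GB)

print(f"Server: {iray_server:.0f} it/min, 4060 Ti: {rtx_4060_ti:.0f} it/min")
print(f"Speedup: {iray_server / rtx_4060_ti:.2f}x")  # roughly 1.7x
```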

In any case, it sure beats the hell out of 2-3 hour CPU renders! Haha!
 
  • Like
Reactions: n00bi

n00bi

Active Member
Nov 24, 2022
518
606
But the difference between a top-of-the-line 4090 with 24GB of VRAM and my 4060 Ti is not nearly as much as most people think.
Fair enough. I can't do any comparisons between 40xx Nvidia cards myself, other than looking at stats.
My previous card was from AMD. I mostly use C4D, which has some support for AMD cards, but it really sucks.
I mean really, really sucks. A bit faster than CPU rendering, but still bleh.
Going from my previous AMD card to this 4070 Ti Super was like switching from an old Lada to a spaceship with an FTL drive.

When it comes to rendering and gaming, VRAM is indeed king; it has basically always been that way.
And the number of shader cores, etc.
CUDA is mostly useful where you can do a lot of parallel calculations.
That is lights/shadows and so forth, but it can also be procedural things, like materials/textures.
Imagine a finite Mandelbrot at 1K: it only takes a few parameters ("little data"), but it can be parallelized like crazy because every pixel can be computed independently, so a ton of parallel calculations (see the sketch below).
But in games, they often trick the player by using baked lightmaps/shadow maps instead of relying on real-time calculations.
Often they just have 1-3 lights that can cast shadows, and that is often done in the shader by simple means like raymarching.
So few games use CUDA and/or RT cores to their full potential.
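
A minimal NumPy sketch of that Mandelbrot point: the per-pixel update below applies the same few parameters to every pixel independently, which is exactly the shape of work thousands of GPU cores eat up (here it's vectorized on the CPU purely for illustration):

```python
# Each pixel's escape count depends only on its own coordinate,
# so the whole image could be computed in parallel, one core per pixel.
import numpy as np

def mandelbrot(width=1024, height=1024, max_iter=100):
    xs = np.linspace(-2.0, 1.0, width)
    ys = np.linspace(-1.5, 1.5, height)
    c = xs[np.newaxis, :] + 1j * ys[:, np.newaxis]   # one complex c per pixel
    z = np.zeros_like(c)
    counts = np.zeros(c.shape, dtype=int)
    for i in range(max_iter):
        alive = np.abs(z) <= 2.0             # pixels that have not escaped yet
        z[alive] = z[alive] ** 2 + c[alive]  # identical update for every pixel
        counts[alive] = i
    return counts

print(mandelbrot(256, 256, 50).max())
```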

So he should think about whether he will be doing other things as well, like the AI, CV, and other modeling tools/rendering engines mentioned, in the future,
or whether he is only going to be rendering with Daz.
If he is only going to focus on Daz rendering and not mess around so much with other things,
then a 4060 Ti might be the best "bang for the buck".

I am happy with my card. It was expensive, though.
Don't just rush out and buy a card.

Anyway... I am never going back to CPU rendering OR an AMD card :D
 
Last edited:

Turning Tricks

Rendering Fantasies
Game Developer
Apr 9, 2022
1,519
2,807
Fair enough. I can't do any comparisons between 40xx Nvidia cards myself, other than looking at stats.
My previous card was from AMD. I mostly use C4D, which has some support for AMD cards, but it really sucks.
I mean really, really sucks. A bit faster than CPU rendering, but still bleh.
Going from my previous AMD card to this 4070 Ti Super was like switching from an old Lada to a spaceship with an FTL drive.

When it comes to rendering and gaming, VRAM is indeed king; it has basically always been that way.
And the number of shader cores, etc.
CUDA is mostly useful where you can do a lot of parallel calculations.
That is lights/shadows and so forth, but it can also be procedural things, like materials/textures.
Imagine a finite Mandelbrot at 1K: it only takes a few parameters ("little data"), but it can be parallelized like crazy because every pixel can be computed independently, so a ton of parallel calculations (see the sketch below).
But in games, they often trick the player by using baked lightmaps/shadow maps instead of relying on real-time calculations.
Often they just have 1-3 lights that can cast shadows, and that is often done in the shader by simple means like raymarching.
So few games use CUDA and/or RT cores to their full potential.

So he should think about whether he will be doing other things as well, like the AI, CV, and other modeling tools/rendering engines mentioned, in the future,
or whether he is only going to be rendering with Daz.
If he is only going to focus on Daz rendering and not mess around so much with other things,
then a 4060 Ti might be the best "bang for the buck".

I am happy with my card. It was expensive, though.
Don't just rush out and buy a card.

Anyway... I am never going back to CPU rendering OR an AMD card :D
I used to be an AMD guy. I still love them, but I learned Daz, and I'm pretty much locked into using Nvidia if I want any sort of performance. Honestly, I kind of ignored Nvidia for the last 20 years or so, and when I started shopping for a new card and took another look at them, I was shocked to see how big they have become. Their GPU-based AI cards have become a global monopoly, and they're worth something like $3.3 trillion now.

I remember I once got a VIP pass to an Nvidia launch party back in May of 2003. I used to run a website for EverQuest, and because of that I got free press passes to the E3 expo in LA for a few years around then. One of the Nvidia reps saw my press pass, so they handed me a VIP pass for their private party at the Kodak Theatre in Hollywood (where they do the Oscars).

Nvidia was launching the FX series cards at the time, and they were exceptional because, IIRC, they were the first GPUs that could cast real-time shadows and do real fire effects. Boy, have times changed! Haha!

Cool party, though. When I walked in they handed me a custom backpack filled with goodies like PC games, clothing, etc. It was an open bar, and a lot of the 'booth babes' from the E3 expo were there in skimpy cosplay outfits. They even had Smash Mouth playing. One of the most fun E3s I ever attended!

Back then they were the scrappy 'new guys'... now they're the world leader in AI and gaming GPUs... go figure.

vipparty2.jpg
 
  • Wow
Reactions: n00bi