
FFXIV Dawntrail benchmark drop and graphical updates


mackerel

Member
Joined
Mar 7, 2008

It has just been announced that the FFXIV Dawntrail benchmark will drop soon: 14 April at 00:00 PDT (07:00 UTC). This is the upcoming expansion with a graphical update to the game. I intend to test it against the current Endwalker benchmark.

There are updates to the game's graphics affecting lighting/shadows, textures and likely more. It also introduces upscaling support in the form of FSR 1.0 and DLSS 2.0, with FSR enabled by default. The user gets a render scale slider from 50% to 100%; presumably 100% means native rendering. Does FSR1 do anything else when not scaling? There seems to be a separate AA setting shown elsewhere.
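For anyone working out what the slider actually does to pixel counts, here's a rough sketch of the mapping (the function name and the exact rounding are my assumptions; the game doesn't document them):

```python
# Hypothetical helper: how a 50-100% render scale might map to the internal
# resolution that FSR1 then upscales from. Illustrative only.
def internal_resolution(output_w: int, output_h: int, scale_pct: int) -> tuple[int, int]:
    """Internal render resolution for a given output size and render scale."""
    if not 50 <= scale_pct <= 100:
        raise ValueError("slider range is 50-100%")
    # assume simple truncating scale on each axis
    return (output_w * scale_pct // 100, output_h * scale_pct // 100)

# e.g. 1080p output at 70% scale renders internally at 1344x756
print(internal_resolution(1920, 1080, 70))
```

Note that at 70% scale the GPU is only shading about half the pixels (0.7 x 0.7 = 0.49 of the output area), which is why even modest slider moves can have a big performance effect.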

There is also a dynamic resolution setting. The option shown in the screenshot says it kicks in if the frame rate drops below 60 fps. It wasn't shown what other settings there may be besides off. The benchmark has the same settings as the game, so we should be able to find out.
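Square Enix hasn't documented how its dynamic resolution logic works, but a generic controller of this kind looks something like the toy sketch below (the step size, hysteresis band and clamp range are all guesses on my part):

```python
# Toy sketch of a dynamic-resolution controller chasing a 60 fps target by
# nudging the render scale each frame. Not the game's actual algorithm.
def adjust_scale(scale_pct: float, frame_ms: float,
                 target_ms: float = 1000 / 60, step: float = 2.0) -> float:
    """Lower render scale when frames are too slow, raise it when there's headroom."""
    if frame_ms > target_ms:          # missed the 60 fps target: shed GPU load
        scale_pct -= step
    elif frame_ms < target_ms * 0.9:  # comfortable headroom: claw quality back
        scale_pct += step
    return min(100.0, max(50.0, scale_pct))  # clamp to the slider's 50-100% range
```

The interesting failure mode for FFXIV is the one raised below: if the frame time is high because of the CPU, dropping the scale does nothing, and a naive controller like this one would bottom out at 50% for no quality-adjusted gain.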

DLSS support is implemented in an interesting way. It's on or off only. No user control beyond that. Also it is tied to the dynamic resolution setting. I think it would be nice to have user control over the settings, but let's see how it works in practice.

One concern I have is that in the existing game there are many situations where the fps tanks because of CPU limits, not GPU. Typically this will be in areas with a high player count, like cities or hunt trains. Is it smart enough to recognise that this can't be helped by dropping graphical settings? This can't be tested in the benchmark directly, but it might be possible to simulate the effect by reducing the available CPU resources.

I was hoping this would be a simple run-the-benchmark, get-a-number affair, but with these settings there's going to be a lot more to explore. Runs with FSR1 at 100% scale and dynamic resolution disabled will be the starting point, but the performance-quality tradeoffs from upscaling will certainly be of interest.
 
Yes. The main info broadcasts are usually in Japanese since that's where the devs are. It is a worldwide game and worldwide release so I'd expect the benchmark to be multi-lingual - whatever the game supports already.
 
[Image: maxsummary.png]

https://eu.finalfantasyxiv.com/benchmark/download/
Benchmark is up. Above are my initial results. At first glance there's a ballpark 30% average fps drop between the Endwalker and Dawntrail benchmarks where the system is more GPU limited. At higher fps it is more CPU limited, so the apparent drop is smaller.
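For anyone reproducing the comparison, the drop figure is just a relative difference; a quick sanity check with made-up numbers (placeholders, not my actual results):

```python
# Relative average-fps drop between the two benchmark versions.
# The 120/84 figures below are placeholders for illustration only.
def fps_drop_pct(endwalker_avg: float, dawntrail_avg: float) -> float:
    return (endwalker_avg - dawntrail_avg) / endwalker_avg * 100

print(round(fps_drop_pct(120.0, 84.0), 1))  # 120 -> 84 fps is a 30.0% drop
```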

I'm withholding my 7980XE system results because it is hitting a hard CPU limit for some reason, which I'm not seeing on other systems. On the 7980XE I'm seeing thread 3 still getting maxed out. I tried turning on XMP (it was off for other reasons) and messing around with Windows power plans, but they had no significant impact. This effect is repeatable between both benchmark versions and also the full game.
 
Thanks for the info and taking the time. :)

1080p (and down) is CPU limited in many cases... not surprised at what you're seeing. I expect an 11700K with a 4070 at 1080p to be CPU limited. I expect a 5800H to hold back a 3070 at 1080p. I expect a 7940X to hold back most anything at 1080p... and the same with the Broadwell chip at 720p.

Edit: I'd be even more interested in seeing the GPUs on all those chips at varying resolutions.
 
As this is an MMORPG it doesn't really benefit from astronomical fps numbers; IMO three digits is more than enough. I currently have a 97 fps limit on my system to help keep the GPU cooler while playing. There's no magic to that number, it's just where the slider happened to stop, around 100!

I did run the 4070 at 4k too, averaged barely above 60 fps.

What will be interesting is trying out the dynamic resolution feature on lower end GPUs. I might get around to it later today but I'm a bit benched out already.

BTW the 7940X is getting >100 fps with a 1080 Ti at 1080p. My main system is a 7980XE with a 3070, but it is below 80 fps average in both this and the older Endwalker benchmark. I've tried various things, but I think the Windows scheduler is doing weird things. It isn't effectively using more than two cores' worth of CPU, and that's choking it. It's only around 100 MHz or so lower than the 7940X at equal core turbo, so that doesn't explain it. Tried messing about with XMP, Windows power settings, HT off and affinity. I need to observe another system to make sure it isn't similarly saturating a core and is instead spreading work out more evenly.

Oh, a small danger: if you're using Nvidia custom settings for the game, they seem to also apply to the benchmark. However, removing that fps cap didn't change the problem observed, so there is something else going on.
 
Right... I don't care about what's playable or not... just the CPU and the limits/curiosity of that.
 
Got it, you're looking at it from a pure testing perspective. I have both hats on, since I do play the game as well as test hardware. I've never been that big on GPU testing and I wouldn't put this one in the higher tiers of benchmarks. Still, it is another data point if you want a variety of games.

I'm now more interested in the new features of dynamic resolution and DLSS than raw performance, especially their impact on visual quality. Still, I need to know where things stand before applying those settings. The A380 system is ideal for trying out FSR and dynamic resolution: it will struggle at 1080p, so can dynamic resolution be enough to make it attain a 60 fps target? Not sure how to test DLSS yet. The weakest RTX GPU I have is the 3070 Laptop. I might have to run that at 4k output to give it enough to chew on.
 
On one hand, using FSR/DLSS puts more stress on the GPU, but at the same time the CPU has to spit out more frames too. It would be interesting to see how that works up and down the CPU and GPU tiers.
 
I think the main goal of this implementation of dynamic resolution is to maintain a good performance floor, not so much to push out insane fps for reasons given earlier. The increase in minimum spec could put potato PC players in a tough spot.

My further testing might be tomorrow now. I think I'm tested out for one day.
 
I think the main goal of this implementation of dynamic resolution is to maintain a good performance floor, not so much to push out insane fps
Was that ever the point of those technologies (insane fps)? I always thought it was there to raise fps, period.

Ultimately I'm curious about the gains on the CPUs when using the same GPU on each of them. I have a hunch you'll get more returns out of the higher-end CPUs than you will out of the potatoes.
 
FSR/DLSS = improve performance with limited impact on quality. Yes, it (generally) increases fps versus without, but it was more to help lower-end systems get to a decent level. I don't feel it was intended to make an already fast system go even faster.
Dynamic resolution = help reach performance target while maximising quality

So they are kinda the same thing, but at the same time different approaches. Because of the limited settings in this benchmark, if you want to use DLSS you also have to enable dynamic resolution.

I get what you're after through testing, which is not the same as what I'm after. At some point today I will start looking at the image quality impacts which is going to be a lot more difficult.
 
I see what you're saying, but I don't think the type of system has much to do with its purpose. It does the same thing for high-end systems that are brought to their knees by certain titles (CP 2077, Avatar, etc.). It allows you to use other IQ increasing features like Ray Tracing and still get a decent frame rate. It increases frame rates for everyone and every system (to what extent is the question). But, I'm being a bit pedantic here... :)

As far as IQ comparisons, I suggest the NV ICAT software.

I also saw this on the Wiki... gives you an idea of what NV expects with the different DLSS settings. I know you can't set it, but it may give you an idea of what it's 'set at' when you see the results.

[Image: 1713188225443.png]
 
I've used ICAT before, handy for comparing multiple sources.

I'm not looking forward to the number of recordings I'll need to make:
1: baseline reference (max, FSR1 100%)
2: as 1 with old AA (FXAA instead of TSCMAA)
3a: as 1 + dynamic resolution
3b: as 1 + reduced render scale
4: default + DLSS (which also enables dynamic resolution)

2 vs 1 will focus on temporal stability, so that has to be video, not stills.
3a will need a trace of frametimes over the duration to compare with 1, but I'll leave the image quality comparison to 3b.
3b at least is easy, a straight image comparison for the upscaler. Wait: although FSR1 doesn't have temporal processing, TSCMAA does. I need to decide if I use FXAA for both, or TSCMAA for both. If I keep it to "what would a user likely choose" it should be TSCMAA.
Similarly for 4 vs 1, as both will have temporal elements, requiring video.
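For the 3a frametime trace I'll be boiling each capture down to average fps and 1% lows; a minimal sketch of that maths, assuming the capture tool emits per-frame times in milliseconds:

```python
# Summarise a frametime trace (milliseconds per frame) into average fps and
# "1% low" fps, i.e. the average rate over the slowest 1% of frames.
def summarise(frametimes_ms: list[float]) -> tuple[float, float]:
    ordered = sorted(frametimes_ms, reverse=True)   # slowest frames first
    worst = ordered[: max(1, len(ordered) // 100)]  # slowest 1% (at least one frame)
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return (avg_fps, low_1pct_fps)
```

The point of the 1% low here is that dynamic resolution should mostly show up as a lifted floor (better lows) rather than a higher average.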

Getting the result is fun. Making the captures in the first place less so. I'm happy I'm not a professional reviewer needing to do all this often under time pressure.
 
FSR/DLSS = improve performance with limited impact to quality. Yes it (generally) increases fps vs without, but it was more to help lower systems to get to a decent level.
In every game with a DLSS quality mode I've played so far (1440p output rendered at 960p; haven't seen this ultra quality mode), the image quality is actually superior to native even though it's rendering at a lower resolution. You notice it mainly in far-off/shimmering objects. Any lower setting than that is debatable, but it will depend on the game and what version of the DLL you're using; I tend to stick to the SDK releases. I remember linking an RTX ray reconstruction video (DLSS on) that showed amazing quality even at performance mode somewhere on the Cyberpunk thread.

From what I've seen, the issue so far is ghosting and the occasional minor image glitch, not the quality itself.

Honestly no idea how it works with FSR :shrug:
 
As this is an MMORPG it doesn't really benefit from astronomical fps numbers. IMO 3 digits is more than enough. I currently have a 97fps limit on my system to help keep the GPU cooler while playing. There's no magic to that number, it just happens to be where the slider stops around 100!

I did run the 4070 at 4k too, averaged barely above 60 fps.


Just an FYI: the game's jiggle physics is tied to FPS, and the range where it works is somewhere between 60-75 fps. Anything outside of that and you most likely won't see it, as the models are stiff at higher FPS.
 
I like how Intel has taken the same path as Nvidia and used a DLL that can be swapped at will for better versions

"Image Quality Enhanced: DLSS 3.7 vs XeSS 1.3 vs FSR 2 - ML Upscaling Just Got Better"

 
Honestly no idea how it works with FSR :shrug:
FSR1 as used by this benchmark is a spatial upscaler only. Better than nothing but it is a last resort. FSR2 and DLSS have a temporal element. If you look at a static screenshot FSR2 isn't bad, but it is worse than DLSS2 over time.
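To illustrate the spatial-vs-temporal distinction with a deliberately crude toy (1-D grayscale "frames"; real upscalers are vastly more sophisticated than either of these):

```python
# Toy contrast: a spatial upscaler uses only the current frame (as FSR1 does
# here), while a temporal one blends in history (as FSR2/DLSS do).
def spatial_upscale(frame: list[float], factor: int = 2) -> list[float]:
    # nearest-neighbour: each low-res sample is simply repeated;
    # no extra information, hence the "last resort" quality ceiling
    return [v for v in frame for _ in range(factor)]

def temporal_blend(current: list[float], history: list[float],
                   alpha: float = 0.1) -> list[float]:
    # exponential accumulation: history suppresses shimmer between frames,
    # but stale history is also where ghosting artifacts come from
    return [alpha * c + (1 - alpha) * h for c, h in zip(current, history)]
```

The temporal accumulation is exactly why FSR2/DLSS look better in motion than FSR1, and also why their failure mode is ghosting rather than plain blur.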

Just an FYI: the game's jiggle physics is tied to FPS, and the range where it works is somewhere between 60-75 fps. Anything outside of that and you most likely won't see it, as the models are stiff at higher FPS.
What jiggle physics? I've been playing the game for... I dunno, around 8 years? Can't say I ever noticed it. Used both 60 Hz and higher refresh rate systems in that time.

I like how Intel has taken the same path as Nvidia and used a DLL that can be swapped at will for better versions

"Image Quality Enhanced: DLSS 3.7 vs XeSS 1.3 vs FSR 2 - ML Upscaling Just Got Better"
I need to rewatch that. Had it playing in the background while doing a dungeon run. Suffice to say I didn't pay enough attention to it!

Is FSR 3.1 still not released yet? I need to look back at the announcement again. I really hope it addresses the temporal instability of FSR2.
 
What jiggle physics? I've been playing the game for... I dunno, around 8 years? Can't say I ever noticed it. Used both 60 Hz and higher refresh rate systems in that time.


FSR 3.1 was announced about a month ago, but don't think it's out just yet :shrug:
 