DF Weekly: can the new RTX 5070 really match RTX 4090 performance?
This week’s Digital Foundry Direct comes straight from Las Vegas, where Oliver Mackenzie and Alex Battaglia share their impressions of the technologies revealed on the show floor – kicking off with an overview of the Nvidia RTX 50-series reveal keynote. It’s a claim in that presentation that I’m going to tackle in this week’s blog. Is the upcoming RTX 5070 really offering RTX 4090-level ‘performance’? Based on established parlance, the answer is – of course – no, but the reasons behind the claim are straightforward enough and indicative of the era to follow: big hardware-driven leaps in frame-rate are diminishing and, like it or not, the future is biased more towards software, with machine learning taking a leading role.
To tackle Nvidia’s RTX 5070 claim head on: the notion of performance parity is based entirely on DLSS 4 multi frame generation, where the new $549 GPU is presumably tested against an RTX 4090 running the same resolution and settings but limited to single frame generation. By all quantifiable metrics, the claim holds no water. Even running on a more modern architecture and process node, the RTX 5070’s 6,144 CUDA cores are no match for the RTX 4090’s 16,384 – with a similar disparity in RT and tensor cores. The RTX 4090’s 384-bit memory interface gives way to a 192-bit bus on the RTX 5070. And while the RTX 5070 has faster memory, it only has 12GB of it, up against the 24GB of the outgoing flagship.
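For a sense of scale, here’s a rough back-of-the-envelope sketch of the raw hardware gap. The core counts and bus widths are as above; the per-pin memory data rates (28Gbps GDDR7 on the 5070, 21Gbps GDDR6X on the 4090) are assumed from Nvidia’s published spec sheets and may not match every shipping board, so treat the numbers as indicative rather than definitive:

```python
# Back-of-the-envelope comparison of raw RTX 5070 vs RTX 4090 resources.
# Core counts and bus widths are from the article; the per-pin memory data
# rates below are assumptions based on Nvidia's public specs.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

rtx_4090 = {"cuda_cores": 16_384, "bus_bits": 384, "gbps": 21.0, "vram_gb": 24}
rtx_5070 = {"cuda_cores": 6_144, "bus_bits": 192, "gbps": 28.0, "vram_gb": 12}

bw_4090 = bandwidth_gbs(rtx_4090["bus_bits"], rtx_4090["gbps"])  # ~1008 GB/s
bw_5070 = bandwidth_gbs(rtx_5070["bus_bits"], rtx_5070["gbps"])  # ~672 GB/s

print(f"CUDA core ratio (5070/4090): "
      f"{rtx_5070['cuda_cores'] / rtx_4090['cuda_cores']:.2f}")        # 0.38
print(f"Bandwidth: 4090 {bw_4090:.0f} GB/s vs 5070 {bw_5070:.0f} GB/s "
      f"({bw_5070 / bw_4090:.2f}x)")                                   # 0.67x
```

Even before clocks and architectural efficiency enter the picture, the 5070 is working with roughly a third of the shader array and two-thirds of the memory bandwidth – which is why the parity claim can only be about software.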
Put simply, without DLSS 4, the RTX 5070 will not match the RTX 4090 – unless we’re talking about relatively simple games running at the limit of your display’s refresh rate (v-synced or capped). On the face of it, Nvidia’s claims are ludicrous – and yet hands-on reports from CES like this one from PC Games N are complimentary, detailing a Marvel Rivals demo in which a 5070 with multi frame generation delivered much higher frame-rates than a 4090, though apparently parity is more common. My guess is that Marvel Rivals is CPU-limited in this demo, which is where frame generation shows the biggest multipliers to frame-rate, but even so, DLSS 4 is leaving a positive impression and the frame-rate claim is borne out, albeit with a big AI asterisk attached.
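To illustrate why a CPU limit flatters the lower-end card, here’s a toy model. The base frame-rate is hypothetical and the model ignores the real-world overhead frame generation adds per rendered frame, but it captures the basic arithmetic: if both GPUs are pinned at the same CPU-bound frame-rate, the card with the bigger frame generation multiplier simply outputs more frames.

```python
# Toy model of why a CPU-limited demo flatters multi frame generation.
# The base frame-rate is hypothetical; overheads from running frame
# generation itself are ignored, so these are idealised upper bounds.

def fg_output_fps(base_fps: float, multiplier: int) -> float:
    """Idealised output: every rendered frame yields `multiplier` displayed frames."""
    return base_fps * multiplier

cpu_limited_base = 90.0  # hypothetical: both GPUs stall at the same CPU-bound rate

print("RTX 4090, 2x frame gen:     ", fg_output_fps(cpu_limited_base, 2), "fps")  # 180.0
print("RTX 5070, 4x multi frame gen:", fg_output_fps(cpu_limited_base, 4), "fps")  # 360.0
```

In a GPU-limited scenario the 4090’s much higher base frame-rate would reassert itself, which is presumably why parity – rather than a 5070 win – is the more common result in the hands-on reports.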
0:00:00 Introduction
0:01:47 Nvidia CES demos – RTX Mega Geometry
0:14:25 RTX Neural Materials
0:21:24 RTX Neural Faces and RTX Hair
0:31:37 ReStir Path Tracing + Mega Geometry
0:36:49 Black State with DLSS 4
0:42:47 Alan Wake 2 with DLSS 4
0:46:01 Reflex 2 in The Finals
0:53:22 AMD at CES: AI denoiser demo, Lenovo Legion Go handhelds
1:03:51 Razer at CES: Laptop Cooling Pad, new Razer Blade
1:11:30 Asus and Intel at CES
1:17:29 CES displays: Mini-LED, Micro-LED, OLED + monitor sins!
1:30:07 Supporter Q1: Will Switch 2 support DLSS 4?
1:32:22 Supporter Q2: Did you see the Switch 2 mockups at CES?
1:33:56 Supporter Q3: Could you test DLSS against an ultra high-res “ground truth” image?
1:37:52 Supporter Q4: Why would a developer use Nvidia-developed rendering techniques over their UE5 equivalents?
1:40:38 Supporter Q5: Will multi frame gen solve game stutters?
1:42:05 Supporter Q6: Will multi frame gen make VRR obsolete?
1:44:27 Supporter Q7: Is Sony regretting sticking with AMD for their console business?
1:49:49 Supporter Q8: What do you think of the FF7 Rebirth PC specs?
1:52:37 Supporter Q9: What’s the craziest thing you’ve seen on the show floor at CES?
I would bet good money that Nvidia made the 5070/4090 comparison knowing that the idea would come under heavy scrutiny. The firm clearly believes that all the testing to come will be a net positive for its claims, even when negative points of the 5070 experience are brought to the fore. And at a brutal SEO level, the more 5070/4090 comparisons that come along, the heavier the algorithmic boost the original claim will receive.