
Remnant II might be designed around upscalers, but DLSS isn't a band-aid for bad performance

To rely so much on DLSS, FSR and XeSS misses their point

An armoured challenger faces off against a bulky axe-wielder in a fiery hellscape from Remnant 2.
Image credit: Gearbox Publishing

Ideally, the story of Remnant II’s launch would start and end with it being an ambitious success of a shooty Soulslike. Sadly not: in a Reddit post addressing complaints of wonky performance, even on higher-end graphics cards, developers Gunfire Games admitted to having "designed the game with upscaling in mind."

Working the likes of Nvidia DLSS, AMD FSR, and Intel XeSS into a game wouldn’t normally be cause for incredulity. Indeed, as optional performance boosters, they’re pretty much always a welcome sight amid the duller texture filtering and ambient occlusion toggles of the average settings menu. The problem here is that with its dismal performance at native, non-upscaled resolutions, Remnant II essentially forgets about the 'optional' part – and in doing so, undermines what makes DLSS and its rivals such valuable tools in the first place.

To its credit, Remnant II does support all three of the big upscaling players – DLSS, FSR, and XeSS – so nobody will go without one based on their GPU of choice. But, my word, does it run badly if you switch them all off. Let’s take the GTX 1060, an older card but one that’s still the third most-used among Steam players, as well as one that easily outperforms the GTX 1650 listed in Remnant II’s minimum specs. When I used it to run Ultra quality at 1080p, it limped to just 18fps on average: surely unplayable to even the most hardware-indifferent player.

Sure enough, upscaling was essential to get this usually reliable GPU up to an acceptable pace. Even so, XeSS – chosen because to my eyes it looks less fuzzy than FSR at 1080p – only got it up to 29fps on its Quality setting, forcing a drop down to the blurrier Performance mode just to eke out 37fps. It’s not great, and it’s not as if the visuals themselves look or feel deserving of the performance cost. Remnant II is reasonably pretty, but hardly groundbreaking in its aesthetics, and there’s no ray tracing to bring the FPS down.

Three gunners run toward a jungle ruin in Remnant 2
Image credit: Gearbox Publishing

The situation remains unimpressive even if you’re lucky enough to own more cutting-edge PC kit. I also tried the RTX 4070 Ti, an £800-ish howitzer of a graphics card that can blow past 60fps in most games at 4K. No such luck in Remnant II, which only averaged 59fps on Ultra quality at 1440p. Eight hundred British pounds and Quad HD is still too much for a reliable 60fps at native resolution! To actually get the premium performance the hardware deserves, I had to deploy DLSS on Performance mode just to reach 91fps – and that’s still lower than any RTX 4070 Ti owner would like.

I wouldn’t necessarily lump Remnant II with the absolute worst technical disasterpieces that 2023 has inflicted upon us; The Last of Us Part 1 and especially Redfall were broken in more varied, sometimes grimly spectacular ways. But at least those weren’t dodgy by design. Remnant II’s native performance has been intentionally, now openly left malnourished, apparently in the hope that upscaling will cover up its protruding bones.

This is not what DLSS et al were designed for, and it shouldn’t be an acceptable use for them now. Upscalers were never meant to act as emergency treatment for infirm games, but to take adequate performance and enhance it to the point where it felt like you were pulling premium framerates. DLSS’ first appearances came in games like Battlefield V and Final Fantasy XV: games which already ran competently on the PC hardware of the day. It was a bonus, a gift, a key to unlock the full potential of high-refresh-rate gaming monitors. If upscaling instead becomes a necessity, then it’s not enhancing anything, and it’s certainly not extracting extra value out of your other components and peripherals.

A player faces off against a dog-like alien in Remnant 2.
Image credit: Gearbox Publishing

And, as positive an inclusion as upscalers can be, the benefits do come with costs – and enough of them that making an effective requirement out of them feels even more iffy. Because they work by rendering frames at a lower-than-native resolution, the final picture quality will be fuzzier and more flawed the lower your display res. This becomes a problem at 1080p, still the most popular gaming monitor resolution by far, where even upscalers’ highest quality modes can lose fine detail reproduction and add texture blurring compared to native rendering. Some framerate speed freaks might make the choice to sacrifice this picture quality in other games, but at least it’s a choice in spirit as well as letter. For anyone playing Remnant II on older or entry-level hardware, its FSR and XeSS toggles might as well come etched in stone.
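To put rough numbers on that trade, here's a quick sketch of the arithmetic. The per-axis render scales below are the commonly published DLSS 2 / FSR 2 preset values (roughly 2/3 for Quality, 0.58 for Balanced, 0.5 for Performance); exact figures vary by upscaler and game, so treat these as approximate illustrations rather than gospel.

```python
# Approximate per-axis render scales for common upscaler quality presets.
# These are the widely published DLSS 2 / FSR 2 values; illustrative only.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Return the internal render resolution an upscaler works from,
    given the output (display) resolution and a quality preset."""
    scale = PRESETS[preset]
    return round(width * scale), round(height * scale)

for preset in PRESETS:
    w, h = internal_resolution(1920, 1080, preset)
    print(f"1080p {preset}: rendered internally at ~{w}x{h}")
```

The takeaway: at a 1080p output, even the Quality preset is only rendering around 1280x720 internally, and Performance mode drops that to 960x540 – which is why upscaled 1080p tends to look fuzzier than upscaled 4K, where the internal resolution has far more pixels to start from.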

By most accounts, including our reviewer Jason’s, the actual game-y bits of Remnant II are still enjoyable in spite of any technical missteps. Which is good! But in a corner of the PC gaming world that’s often characterised by joyless annual CPU releases and pisstaking GPU prices, upscalers like DLSS remain one of the highlights; science indistinguishable from magic, capable of inciting a dopamine hit every time you see that FPS counter shoot up. To see them misappropriated like this is, if not ruinous, then at least a little sad.


About the Author

James Archer

Hardware Editor

James had previously hung around beneath the RPS treehouse as a freelancer, before being told to drop the pine cones and climb up to become hardware editor. He has over a decade’s experience in testing/writing about tech and games, something you can probably tell from his hairline.
