I Put Nvidia’s RTX 50-Series Laptop GPU to the Test. Here’s What I Found



Two months after the hotly anticipated desktop RTX 5090 and RTX 5080 launched, the next phase of Nvidia’s next-gen graphics card lineup is finally here. I tried out the mobile RTX 5090 onboard the new Razer Blade 16, and while I’ll have broader thoughts on the machine in a full review soon, I first want to focus on how the RTX 5090 itself feels in a gaming laptop.

Nvidia’s RTX 50 series brings some interesting advances over the preceding RTX 40 series. The open approach to multi-frame generation, which uses artificial intelligence to generate additional frames for higher frame rates in games, has been fun to tweak. The extra VRAM in these cards matters for delivering smoother graphics too. But the increase in raw graphics performance is disappointingly small, and for a flagship GPU launching an entirely new architecture at the peak of Nvidia’s power, it left me wishing there was more on offer.

Photograph: Luke Larsen

The Laptop Matters

Nvidia did provide some caveats when handing over a Razer Blade 16 to test. It’s a thinner model than in previous years, likely one of the thinnest gaming laptops you’ll find with an RTX 5090. Razer had to shave off an extra 15 watts of total graphics power (TGP) to get it this thin, meaning this laptop isn’t the best showcase of the 5090’s maximum performance. That said, Nvidia is no doubt proud to show off an RTX 5090 in a chassis that’s just 0.59 inches at its thinnest point. The Blade also uses AMD’s Ryzen AI 9 HX 370 CPU instead of the beefier Intel chip in last year’s model.

But we’re about to be inundated with gaming laptops packing graphics cards up to the RTX 5090. Unlike the desktop RTX 5090, these won’t be sold out and impossible to find at a decent price. It’s also important to know that although it shares a name, the laptop version of the RTX 5090 doesn’t sit in the same performance class as the corresponding desktop GPU. Instead, it’s built on the same chip (GB203) as the desktop RTX 5080. Nvidia has used this mismatched naming for the past few generations, despite the confusion it creates.

Photograph: Luke Larsen

One important change this year is that the laptop RTX 5090 comes with 24 GB of video random access memory (VRAM), sliding right between the desktop RTX 5080’s 16 GB and the desktop RTX 5090’s 32 GB. VRAM stores graphics data, and the more of it you have available, the smoother the graphics. This is the most VRAM we’ve ever seen on a laptop, and more than you get in the $999 desktop RTX 5080. Given how sensitive newer games are to VRAM usage these days, this should expand what’s possible and address one of the major complaints about the desktop RTX 5080.

In 3DMark tests, the mobile RTX 5090 lands just ahead of the desktop RTX 4070 Super, released at the beginning of 2024. Disappointingly, it doesn’t even offer a clear step up over 2023’s mobile RTX 4090. The extra memory will help quite a bit in specific games, but in all my 3DMark benchmarks, the RTX 5090 doesn’t have a clear victory over its predecessor.

But we live in strange times. With the RTX 5090, the primary question is not how good the raw performance is, but whether or not multi-frame generation lives up to the hype. As part of Deep Learning Super Sampling (DLSS) 4, it’s the primary feature Nvidia is selling the 50-series cards on, with the company boasting that you can get RTX 4090 performance from a $1,300 laptop. Is that true? We might not know until the budget-tier mobile GPUs come out, but all the technology is in place to try it on the RTX 5090 that’s available now.

It’s All About Multi-Frame Generation

Frame generation was a big deal when it launched in DLSS 3. Using predictive AI running on the GPU’s Tensor cores, DLSS can generate an artificial frame between every two rendered frames, greatly improving frame rates. Multi-frame generation takes things a step further, letting the GPU insert two or even three artificial frames between every two rendered frames. Nvidia says that equates to up to eight times the performance.
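The arithmetic behind the 2X/3X/4X labels is simple: each rendered frame is followed by one, two, or three generated frames, multiplying the displayed frame rate by the mode number. A minimal sketch (the function name and validation are my own framing, not any Nvidia API):

```python
def displayed_fps(rendered_fps: float, mfg_mode: int) -> float:
    """Displayed frame rate with multi-frame generation enabled.

    mfg_mode is the advertised multiplier (2 for 2X, 3 for 3X, 4 for 4X):
    each rendered frame is followed by (mfg_mode - 1) AI-generated frames,
    so the displayed rate is simply the rendered rate times the mode.
    """
    if mfg_mode not in (2, 3, 4):
        raise ValueError("multi-frame generation offers 2X, 3X, or 4X modes")
    return rendered_fps * mfg_mode
```

For example, a game rendering 41 fps natively would display roughly 82 fps in 2X mode and 164 fps in 4X mode. The model ignores the small overhead of running the frame-generation network itself, which eats slightly into the rendered rate in practice.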

Photograph: Luke Larsen

As has often been true with DLSS, performance comes at a price in visual quality. Put simply, the more fake frames you add, especially at lower base frame rates, the more the image quality degrades. Frame generation works as promised, and if you stare at just the numbers, it’s impressive. Adding generated frames does improve smoothness, though maybe not to the degree you might expect, because generated frames don’t feel the same as native ones. The numbers get bigger as promised, but the more extra frames you add, the more input lag creeps into the equation.

Cyberpunk 2077 is the most noteworthy example right now, hyped up for its all-in ray tracing features and native implementation of multi-frame generation. I tested the game at the Blade 16’s native 2,560 x 1,600-pixel resolution in the Ray Tracing Overdrive mode, tweaking multi-frame generation as I went. In 2X multi-frame generation mode with DLSS on the Quality setting, the RTX 5090 averaged 82 frames per second (fps). That jumped to 115 fps in 3X mode, making better use of the Razer Blade 16’s 240-Hz screen refresh rate.

Photograph: Luke Larsen

You can get even more juice from frame generation by changing DLSS to the Performance setting. In the past, I would never have recommended such a thing. But with DLSS 4, there have been huge improvements to image quality thanks to the new Transformer model. It’s exposed as a toggle in Cyberpunk 2077 (unlike in other games), so you can compare screenshots side by side to see the difference. Shadows are more realistic, text looks sharper, and there are fewer AI artifacts and less oversharpening.

Don’t get me wrong—the changes are still subtle overall. If you’re familiar with DLSS, you know that even in its newest iteration, image quality can suffer, especially if you venture outside of Quality mode. Let your eyes wander toward the edges of the frame or to objects in the distance, and you’ll still catch garbled text, warped edges, and low-resolution textures. It’s not perfect, but the Transformer model polishes away most of these errors.

The Wider Picture

That’s just one game. I also tested Indiana Jones and the Great Circle, Star Wars Outlaws, Black Myth: Wukong, and Marvel Rivals. Except for Wukong, these games all have native DLSS 4 implementations, but they remain among the select few that do.

Photograph: Luke Larsen

Nvidia has a way around the limited number of games that natively support it. In the Nvidia app, you can override a game’s settings to force 2X, 3X, or 4X frame generation. That’s how Nvidia reaches its claim of over 100 supported games and apps. While that works for testing, it’s incredibly cumbersome in practice. The amount of frame generation you’ll want depends on the game, requiring multiple restarts until you’re happy with the result. Hopefully this buys time for developers to add the setting to their games, but for now, I don’t think it’s something many gamers will use.

Indiana Jones and the Great Circle was the most convincing example of DLSS 4. While you can play Cyberpunk 2077 maxed out at decent frame rates with standard 2X frame generation, things are even better in The Great Circle. In Supreme mode with full path-traced lighting effects, you’ll average only around 60 fps, and without frame generation at all, you’re in the 40s. Switching to 3X or 4X gets you that buttery frame-rate experience without turning down ray tracing.

Artificially adding frames is less useful if you primarily play fast-paced competitive games like Marvel Rivals. You might assume that higher frame rates—even if they’re artificially generated—mean smoother action. The game might look smoother, but as any competitive player will tell you, the trade-off in input delay just isn’t worth it. That delay is noticeable in slower-paced games too, but there it doesn’t impact gameplay nearly as much.
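A simplified model of why competitive players feel the difference: generated frames only interpolate between rendered frames and don’t reflect fresh input, so the interval at which the game reacts to your mouse and keyboard is set by the rendered rate, not the displayed one. A rough sketch under that assumption (ignoring engine-side latency and tools like Reflex):

```python
def input_sample_interval_ms(displayed_fps: float, mfg_mode: int) -> float:
    """Approximate time between input-responsive frames, in milliseconds.

    Simplified model: only rendered frames carry new input, so divide the
    displayed rate by the multi-frame-generation multiplier to get the
    rendered rate, then take its frame time.
    """
    rendered_fps = displayed_fps / mfg_mode
    return 1000.0 / rendered_fps
```

At a displayed 240 fps in 4X mode, the game is only rendering 60 fps, so input is reflected roughly every 16.7 ms, the same responsiveness as native 60 fps, whereas a native 240 fps player sees new input roughly every 4.2 ms.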

That said, I don’t mind that Nvidia has opened the sandbox and is allowing gamers to tweak settings using the override functionality in the Nvidia app. More options and customization to suit your PC, budget, and specific game are what PC gaming is all about, after all. I just wish this card had more raw compute to back up multi-frame generation.

Does It Make for a Good Laptop?

Most of what I’ve said applies as much to desktop GPUs as it does to laptops. But Nvidia also touts the heightened efficiency of the RTX 5090 as a boon for battery life. Max-Q is Nvidia’s umbrella of technologies for improving efficiency and battery life, and this generation brings a few key optimizations, such as voltage optimizations for the GDDR7 memory and a new Low Latency Sleep feature that turns the GPU off faster.

Photograph: Luke Larsen

Nvidia is promising up to an extra hour of battery life while gaming. I can’t do a one-to-one comparison with an exact previous model, but I tried Balanced mode on battery, which quieted the system down a bit while still giving the laptop enough power to hold a smooth 60 fps in Marvel Rivals. I was surprised by how well the system balanced noise, heat, performance, and battery, but I still only got around an hour and a half of playtime before it died. That’s a step in the right direction, though I wouldn’t call it game-changing.

There’s a lot more to say about these individual laptops as they launch. The top-of-the-line models are best reserved for people who want to experience the best ray tracing in the latest titles. Broader support for multi-frame generation may prove more useful on the lower-tier laptops coming later this year, offering a better gaming experience without the premium price. And larger systems may be able to take full advantage of what the RTX 5090 can offer.

But tested in a thin laptop like the Razer Blade 16, the mobile RTX 5090 doesn’t quite feel like the next-gen GPU refresh it should be.
