Boost RX 6800M Performance With These Changes! ASUS Strix
G15 Game Testing
The ASUS Strix G15 and its Radeon RX 6800M graphics are being
held back by memory. I have tested this new ASUS gaming laptop in 13 different
games both with its stock memory and with my own memory to show you just how
much performance is being left on the table, and I've also compared it against
other gaming laptops so you can see how well the 6800M stacks up against
Nvidia. These are the specs of my Strix G15. I have got the Ryzen 9 5900HX CPU,
RX 6800M graphics, 16 GB of memory, and a 300Hz 1080p screen, but there is also
a 1440p option. You can check out other specced models as well as updated
prices with the links down in the description. So why have I spent extra time
testing out this laptop with two different sets of memory? I found that the
Strix G15 uses the same memory as the Lenovo Legion 5 Pro, and as I covered in
this dedicated video the memory that it ships with isn’t exactly amazing. Both
memory kits tested in the Strix G15 have two 8 gig sticks in dual channel, both
run at DDR4-3200 CL22, and both are single rank. The difference is that my kit
is 1Rx8 with lower secondary timings compared to the 1Rx16 kit that the laptop comes
with. It’s kind of annoying because these sorts of memory details aren’t listed
anywhere on the spec sheet when you actually go to buy a laptop, so the only
way you could know is by watching videos like this one. I asked ASUS why
they're using this memory and was basically told supply issues, which sucks because this
simple memory change can make a bigger difference than buying a more expensive
laptop with a more expensive GPU.
Now gaming laptops like this that use both AMD processors
and graphics make use of AMD SmartShift, which is similar to Nvidia's Dynamic
Boost. The basic idea is that power is shifted between the CPU and GPU as
needed based on the workload to offer optimal performance. In a GPU-only stress
test, I found that my 6800M would run up to about 150W, but I found 145 was more
common, and then with the CPU also active in the stress test the GPU would run
at about 115 watts. The ASUS Armoury Crate software lets us pick different
performance modes. I've done all testing with manual mode because it lets me
max out power and fan speed for best results. Unfortunately, there's no MUX
switch, so it's not possible to disable the integrated graphics for a speed
boost in games, but we can connect an external display, which hooks up
directly to the RX 6800M graphics. I've tested this too, and as you'll see, the
results are very impressive. But before we continue, I’ve got to tell you about
this video’s sponsor, Skillshare!
Skillshare is an online learning community with thousands of
inspiring classes for creative and curious people. I’m always looking to
improve the videos that I'm making on the channel, which is why I'm currently
watching Video for Instagram - Tell an Engaging Story in Less Than a Minute by
Hallease. Sure, my focus is on YouTube, but being able to get information across
quickly in the video is what I’m all about. Skillshare has lots of other
classes for you to explore and improve yourself by learning something new. The
first 1000 people to use the link in my description will get a free trial of Skillshare
Premium membership, and after that, it's only around $10 a month. Alright,
let’s find out how well this laptop performs in 13 different games with its
stock memory and with my memory. After that, we’ll see how the 6800M compares
against other laptops and then I’ll use an external screen to push it to the
limit. Cyberpunk 2077 was tested in little China with the street kid life path.
I’ve got the stock RAM shown by the red bars and the new RAM shown by the
purple bars. With ray tracing presets there’s basically no difference, probably
because we’re so GPU heavy here, but at the same time, the framerates with ray
tracing on the 6800M in this game aren’t great. It’s behind the Nvidia options
because I usually test with DLSS, but AMD’s FSR isn’t here yet. Otherwise, the
new RAM offers a 21% boost to average FPS at low settings.
Red Dead Redemption 2 was tested with the game's benchmark,
again a much smaller difference at max setting levels here when compared to the
lower presets. Basically, no change at ultra settings, while the new RAM was
offering a 14% boost to average FPS at low settings. Call of Duty Warzone was
tested with either all settings at minimum or maximum, as it doesn’t have
predefined setting presets. This game saw the biggest difference with the new
RAM at max settings out of all 13 games tested with a 24% higher average
framerate, quite a big improvement for such a simple change. Shadow of the Tomb
Raider was tested with the game's benchmark, and this game had the largest
difference out of all 13 titles tested at the lowest settings, with the new
memory offering 28% higher average FPS or 30 FPS in this case. Max settings
also saw a massive 19% improvement. For Control I’ve tested with ray tracing
enabled and disabled, let’s start with it off. Again basically no difference at
the highest setting, but this is a GPU-heavy game so makes sense, while low
settings were reaching 10% higher average FPS with the new memory.
Like we saw in Cyberpunk, there’s almost no difference with
ray tracing enabled, and again the frame rates aren’t great here because I
usually test with DLSS on Nvidia laptops. Assassin’s Creed Valhalla was tested
with the game's benchmark, and although the new memory was offering a 5% or so
boost to average FPS, that's below the average of the 13 games tested, so on the
lower side. Microsoft Flight Simulator was more of a middle of the pack result,
close to the 13 game average in its differences. Now max settings were technically
reaching 9% higher average FPS, but we can see this is just 3 frames, so we've
got to be careful when dealing with percentage values, 9% sounds nice but 3
FPS, not so much.
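As a quick sanity check on percentage figures like this, here's a small sketch; the FPS values are rounded placeholders loosely based on the Flight Simulator result above, not exact measurements:

```python
# Small absolute FPS gains can produce impressive-sounding percentages.
# The values passed in below are illustrative placeholders only.
def uplift(stock_fps: float, new_fps: float) -> tuple[float, float]:
    """Return (absolute FPS gain, percentage gain) for a config change."""
    delta = new_fps - stock_fps
    return delta, delta / stock_fps * 100

delta, percent = uplift(33, 36)  # roughly 9% higher, but only 3 frames
print(f"+{delta} FPS ({percent:.0f}%)")
```

Both numbers together tell the real story, which is why the charts show raw frame rates and not just percentages.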
The gains in Watch Dogs legion were bigger, a massive 25%
boost to average FPS at low settings which in this case is almost 20 FPS, much
more noticeable I'd argue, while max settings were around 14% higher with the new RAM, or about 8 FPS in this case, though there's a larger 21% boost to 1%
low at ultra. Reasonable gains in Battlefield 5 too, about 10 FPS at max
settings or about 9%, right in line with the 13 game average, while low
settings were 15% higher with the new RAM, presumably as processor performance
and memory matter more there. Fortnite was tested with the same replay file with
each memory configuration and the differences in average FPS were below the 13
game average, but hey in a more competitive game like this you could argue that
every frame counts more compared to some of the others. The same goes for CS:GO;
although all setting levels were scoring basically the same as each other in terms
of average FPS, the new memory was still offering at least an 8% boost at each of them.
This game generally gets bottlenecked by the integrated GPU though, so using
an external screen should increase frame rate dramatically, more on that soon.
Rainbow Six Siege had much larger improvements to the 1% low performance with
the new memory compared to the average frame rates. Max settings were hitting
7% higher average fps with the new memory while the 1% low gains were higher at
21%. The Witcher 3 doesn’t really need high framerates, and honestly even the
stock RAM at ultra settings is still giving excellent results with this modern
hardware on an older game, but we can boost average FPS by 6% at max settings
or 14% at minimum settings by changing memory. These are the differences
between the stock RAM and my new RAM in all 13 games tested at the highest
setting levels available. On average the new RAM was able to offer us a 9%
boost to average FPS in games. Some titles like Red Dead Redemption 2 and
Cyberpunk 2077 saw basically no difference, probably because they're GPU bound
there, while others like warzone had big gains.
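For anyone curious how an overall figure like that comes together, here's a minimal sketch of averaging per-game uplifts. The percentages are rough placeholders taken from the results discussed above and cover only some of the 13 games, so the mean won't exactly match the 9% figure:

```python
# Rough sketch: averaging approximate per-game uplifts at max settings.
# These are placeholder values from the discussion above, not exact data,
# and only a subset of the 13 games tested.
max_settings_gain_pct = {
    "Call of Duty Warzone": 24,
    "Shadow of the Tomb Raider": 19,
    "Watch Dogs Legion": 14,
    "Battlefield 5": 9,
    "Rainbow Six Siege": 7,
    "The Witcher 3": 6,
    "Cyberpunk 2077": 0,
    "Red Dead Redemption 2": 0,
}
average = sum(max_settings_gain_pct.values()) / len(max_settings_gain_pct)
print(f"Average uplift at max settings: {average:.1f}%")
```

A simple arithmetic mean like this weights every game equally, which is also why a couple of GPU-bound titles with near-zero gains pull the overall average down.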
I’ve also compared with all settings at the lowest preset,
and now the average difference with the new memory is closer to 15%. Even in
the worst case, we’re looking at at least a 6% improvement to average FPS, with
well above 20% gains possible depending on the specific game. Now let’s find
out how this all-AMD configuration of the ASUS Strix G15 compares against others.
I’ve tested Battlefield 5 in campaign mode at ultra settings, and the Strix G15
is highlighted in red. I’ve got both the stock memory and the new memory
results, which in this case offers a 9% boost to average FPS. The 6800M can
beat the lower power limit 3070 in the Omen 15 with the memory upgrade and is
close to 2080 Max-Q options, though the 1% lows with the Radeon GPU seem lower
compared to most others whether or not we change the memory. Shadow of the Tomb
Raider was tested with the game's benchmark with the highest setting preset. As
we saw earlier, this game had one of the biggest differences out of all titles
tested with the memory upgrade, almost a 19% boost to average FPS. The Strix
G15 can compete with RTX 3070 and 3080 machines with the new memory like AMD was
claiming, while the stock memory puts it closer to the 3060 and under the power
limited 3070 in the Omen 15. Far Cry 5 was also tested with the game's benchmark
at max settings, and this test typically depends more on processor performance.
The new memory was offering an 11% boost to average FPS here, again moving the
Strix G15 closer to some of the Ryzen plus Nvidia 3070 and 3080 options, such as
the ASUS Scar 15 which is right above it with the same processor, but RTX 3080
GPU. I’ve confirmed that many other reviewers that have this laptop for testing
have the same memory as me, so the results other channels have shown could be
improved too, and I did also confirm with ASUS that the stock memory in my unit
is what the retail model will ship with if you actually buy this laptop
- it's not just the review units that are like this.
As mentioned there’s no way of disabling the integrated
graphics with the laptop screen, but if you connect an external monitor to the
Type-C port on the back, it connects directly to the 6800M, bypassing the integrated graphics.
I’ve got the results with an external screen connected shown by the green bars,
and I’ve tested it both with the stock memory and with my memory. With the
stock memory and external screen, we're just a couple of FPS behind the RAM
upgrade plus the laptop screen. If we combine both the external screen and the
RAM upgrade, the performance uplift is nothing short of astonishing. It's just
a few FPS behind MSI’s larger and much more expensive GE76, though this is just
a single game. I don't think I've ever had an all-AMD laptop with both AMD
processor and graphics that lets you disable the integrated GPU. Now at the same time,
there also haven't been a whole lot of all-AMD laptops, so now that higher-end
options are coming it's possible that this feature is still on the way, I'm
not sure yet. It's unfortunate that the feature isn't present in the Strix G15
though, because as we've just seen, disabling the integrated graphics can give
an exceptional performance increase. I'll cover thermals and everything else about
this laptop in the upcoming full review, so make sure you’re subscribed for
that one. Now let’s check out the screen response time. The ASUS Armoury Crate
software gives us the option of enabling or disabling panel overdrive, which
affects screen response time. With overdrive disabled, we’re looking at a 7.9ms
average grey-to-grey response time, but we can lower this down to around
5.6ms with overdrive enabled, which is the default, though this does add a bit
of overshoot. I’ve got a link in the description if you need an explanation of
what all these numbers mean. Although it's not the slowest 300Hz screen I've
tested, looking at you MSI GE66, it's not below the 3.33ms (1000ms ÷ 300Hz) needed for
transitions to complete within the refresh window. In any case it's not too bad
compared to others, but could be better, and expect different results with the
1440p screen option. There’s a lot more that we need to discuss in the ASUS
Strix G15 Advantage Edition, so make sure you're subscribed for the upcoming full
review. Come and join me in Discord and get behind-the-scenes videos by
supporting the channel on Patreon, and if you need more information on why the
memory upgrade improves performance so much check out this video over here, as
I've gone into it in depth on the Legion 5 Pro. It uses the same memory as the
Strix G15 by default, so all the details are covered in that one, I'll see you
there next.