The Last of Us Part 1 on PC arrived towards the end of March in a pretty desperate state - a good experience only available to those with more powerful hardware, with profound issues facing gamers on more mainstream components.
Progress has been made, however, across a series of patches and we thought it might be a good time to check in on the port to deliver an update on the game's current quality level. There's a lot of good news to impart here, but also some lingering issues - and we're concerned that they may never be addressed.
But let's address the good news first. At launch, there were profound issues with texture quality on 8GB graphics cards - which is highly problematic bearing in mind how many of them are on the market.
Setting texture quality to high burst through the GPU's VRAM limit, resulting in tremendous stuttering. The alternative was to drop back to medium or even low quality textures, depending on your resolution - the problem being that those quality levels looked more like 'ultra low'. In fact, textures from the PS3 original were more detailed.
I'd venture to suggest that this issue is now resolved. Medium textures now look absolutely fine, with only a minimal hit to quality compared to the high preset.
Optimisations to memory management now mean that 8GB GPU owners can actually use the high texture preset instead if they wish, which gives a look very close indeed to the PlayStation 5 version. Naughty Dog deserves praise for delivering this. It demonstrates how PC scaling should work, and why we should not blindly assert that 8GB GPUs are now obsolete.
Our fresh look at The Last of Us Part 1 on PC reveals that while issues remain - and some new ones have been introduced - at least owners of 8GB GPUs can now enjoy a highly attractive experience.
So, how was this achieved? Judging by the difference in medium textures, completely different mipmaps now appear when the game is set to medium. It's as if an art pass was carried out on the medium textures to keep resolution higher and retain more detail - textures that may not have even existed before.
On top of this, there's also a change in how texture streaming works, to accommodate GPUs with different memory sizes. This is controlled by a new option called 'texture streaming rate', which is a brilliant addition.
This option defaults to 'normal' on the medium and high presets. When set to normal, the texture cache is smaller, freeing up more VRAM while keeping texture quality the same within the view frustum. However, if the camera moves rapidly, some environmental textures can stream in late.
At the fast or fastest setting, texture popping is virtually gone, but this comes at the expense of more VRAM utilisation. If you are playing the game on an 8GB GPU, I recommend high textures with the normal or fast setting, which looks absolutely fine.
In the video embedded on this page, you'll see that there have been iterative improvements to GPU and CPU utilisation, but make no mistake - The Last of Us Part 1 is still very heavy for PC users. Looking at unlocked performance via the VRR mode of the PS5 version of the game, the console is 55 percent faster in a GPU-limited scenario than an RTX 2070 Super, a card that typically matches PS5 in most other rasterised games.