Indiana Jones and the Great Circle Is A Beautiful Game But Also Signals The End Of Viability For Graphics Cards With Less Than 16GB Of VRAM
Wolfgang Andermahr, writing at the German tech site ComputerBase.de, translated through Safari’s translation system:
Unfortunately, it is virtually impossible to say exactly how much VRAM the game really needs. In the benchmark, the game allocates around 14 GB with the Hyper preset, 13 GB with Giga, and slightly more than 12 GB with Ultra – so the differences are small. With a 16 GB graphics card there were no problems with the Hyper preset in Ultra HD, even after longer play sessions. Whether the same holds for a 12 GB graphics card is unclear.
8 GB is clearly not enough for the Hyper textures even in Full HD; the result is nothing but a stutterfest. The Giga textures are also still too much for an 8 GB graphics card – they run much better than Hyper, but still stutter heavily. Even the Ultra textures are too much: the average frame rate may be fine, but the game still stutters noticeably. Only from the Medium setting onward do 8 GB cards stop having problems – yes, even High textures are still too much.
With 12 GB of VRAM, WQHD plays well with the Hyper textures, and even at higher resolutions such cards remain stable, at least over short play sessions – and experience has shown that memory consumption in games built on the id Tech engine does not keep growing.
However, maximum performance is not yet available with 12 GB in Ultra HD. Only from a 16 GB graphics card onward does the frame rate reach its maximum, even if frame pacing is no longer erratic with 12 GB.
The warning lights have been flashing for quite some time that this was imminent. It started with games like Hogwarts Legacy and The Last Of Us Part 1 struggling on lower-VRAM cards, suffering from stuttering and low frame rates. The workaround, of course, was to turn the resolution down to accommodate the card and get the games to perform. Reviewers like Steve at Hardware Unboxed and leakers like Tom at Moore’s Law Is Dead have been warning about this for the better part of two years now, I believe. Everyone just kept insisting that these games were unoptimized, and to some extent they were: patches did improve performance on existing cards. I’m sure there are other games I can’t recall where this was the case as well.
The problem is, all these people warned that buying anything less than a 16GB graphics card in 2023/24 was, well, stupid if you wanted to keep playing the big AAA titles at any kind of resolution. With Indiana Jones, it seems that time has arrived.
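If you want to see where your own card lands before queuing up the next big release, here is a minimal sketch for checking total VRAM, assuming PyTorch with CUDA support is installed; the 16 GB threshold is the one from the coverage above, not anything the game itself reports:

```python
# Minimal sketch: report the local GPU's total VRAM and compare it against
# the 16 GB bar reviewers keep pointing to. Assumes PyTorch with CUDA support.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / (1024 ** 3)
    print(f"{props.name}: {vram_gb:.1f} GB VRAM")
    if vram_gb < 16:
        print("Below the 16 GB mark - expect to drop texture presets in new AAA titles.")
else:
    print("No CUDA-capable GPU detected.")
```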
Apparently the game isn’t even playable on these cards at anything above Full HD:
The benchmarks for Indiana Jones and the Great Circle look strange at first in WQHD, as some graphics cards deliver very poor results and then there is suddenly a very big jump – simply because on graphics cards with less than 12 GB of memory the VRAM is too small for the maximum texture details.
No graphics card with 8 GB or 10 GB achieves playable results in WQHD; this applies to the Radeon RX 7600 as well as the GeForce RTX 4060 (Ti) and the GeForce RTX 3080. Only from the GeForce RTX 4070, Radeon RX 7700 XT, or comparable graphics cards does this suddenly change, with more than 60 FPS easily achieved. Only the Radeon RX 6700 XT does not quite reach that mark, but it does land in a playable range.
For UWQHD, i.e. 3,440 × 1,440 in 21:9 format, only somewhat faster hardware is needed; from a GeForce RTX 3080 Ti, GeForce RTX 4070, Radeon RX 6800 XT, or Radeon RX 7800 XT you are on the safe side. Only Ultra HD demands really fast hardware: 60 FPS is only available from a GeForce RTX 4070 Ti or a Radeon RX 7900 XT. Nvidia users should enable DLSS anyway, HDR aside, but Radeon owners have no real alternative – only dynamic resolution can at least partially remedy a lack of performance.
Even though Indiana Jones and the Great Circle uses hardware ray tracing throughout, AMD graphics cards are doing well. The two fastest Radeons in particular are in good shape. In WQHD and UWQHD the GeForce RTX 4080 Super and Radeon RX 7900 XTX are exactly as fast as each other; only in Ultra HD can the Nvidia graphics card pull ahead by 9 percent. Since all Radeons fall behind in Ultra HD, it is conceivable that the additional pixels are a bit too much for the ray tracing capabilities of the AMD GPUs, which then struggle accordingly. Don’t be fooled by the Radeon RX 7800 XT, which suddenly closes in on the GeForce RTX 4070 in Ultra HD: the latter simply runs out of its 12 GB of memory, and the FPS drops accordingly.
Even in Full HD, VRAM quickly runs short, even with the textures reduced by two levels to the Ultra setting. Graphics cards with only 8 GB run slightly faster than with Hyper, but are still slow: the GeForce RTX 4060 and Radeon RX 7600 only reach a barely playable 30 FPS. The GeForce RTX 3080 with 10 GB is hardly better; the former high-end graphics card does not exceed 36 FPS even with the textures reduced by two levels.
Graphics cards with at least 12 GB achieve their normal performance: the Radeon RX 6700 XT reaches 60 FPS in Full HD without any problems, while its actual competitor, the GeForce RTX 3060 Ti, limps along at 30 FPS. The Arc A770 also struggles despite its 16 GB of VRAM – so memory is not the limitation there. The Radeon RX 6700 XT is a whopping 61 percent faster, and the 60 FPS mark remains far away for the Arc despite sufficient memory.
Given that this was developed by MachineGames, runs on id Tech 7, and seemed to run incredibly well on Xbox Series X and S, I’m going to go ahead and say this is not an “optimization problem” and more a “you were warned not to buy a graphics card with less than 16GB of VRAM” problem.