Overall Game Performance state - and why is GGG silent about it?
" I never said it's not enough. I just said it's not enough for maxed settings and juiced 6-player content. The 4070 is not a fresh video card anymore (it's almost 2 years old), and you're supposed to update your hardware on a regular basis (every 1-2 years): CPU, GPU, RAM, and probably motherboard. If you're out of trend, prepare to face performance issues. Computer hardware is cheap nowadays; there is literally no problem with buying a fresh PC with a 4090 and waiting for the 5090 release. Moreover, everything below an xx80, or even a stock (non-OC'd) xx90, is considered budget. I don't understand why you blame the game while playing it on a budget GPU. With an office operating system installed. Okay! |
|
The game has become exponentially more complex, so your CPU needs to process more shit, and your GPU needs to render the results. It's only at higher resolutions that the GPU matters as much. Single-core performance is extremely important, though since this game is one of the few with real multithreading, multi-core performance matters too. I'd still argue single-core is more important. It's debatable whether X3D chips are as valuable here, since their single-core performance isn't as good. I have tested a 7950X and a 7800X3D and the performance difference was not noticeable. I'm using the X3D now.
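The "single-core still matters" point can be made concrete with Amdahl's law: as long as some fraction of every frame stays serial (main-thread logic, draw submission), adding cores hits a ceiling fast. A minimal sketch; the 40% serial fraction is an illustrative assumption, not a measured number for PoE:

```python
# Amdahl's law: upper bound on speedup when a fraction of the
# per-frame work cannot be parallelized. The serial fraction used
# below is an illustrative assumption, not profiled game data.

def amdahl_speedup(serial: float, cores: int) -> float:
    """Best-case speedup from `cores` cores when `serial`
    fraction of the work runs on a single thread."""
    return 1.0 / (serial + (1.0 - serial) / cores)

for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.4, cores), 2))
# prints:
# 2 1.43
# 4 1.82
# 8 2.11
# 16 2.29
```

With 40% of the frame serial, going from 2 to 16 cores only buys you ~1.6x, which is why a fast single core keeps winning.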
Those making claims as unwieldy as "a 4070 is enough to run this game on max settings... max juice" have no clue about the demands this game is capable of making on hardware. None. That input is so devoid of intelligence, it's actually lowering the collective IQ of anyone exposed to it.

There are many reasons for performance instability while playing PoE. Since it's multithreaded, it will heat your CPU up like you're running benchmarking software. This can cause throttling pretty quickly on chips that run hot or don't have sufficient cooling. Additionally, the game's engine has been updated so many times that there's probably an order of magnitude more computation your CPU is doing these days versus when, say, a 1800X or 6900K would have run it like butter.

To put things in perspective: the most expensive consumer-grade products you can buy cannot run this game flawlessly in all scenarios. Look at popular streamers popping Legions in Affliction and their games freezing. How about them pressing Alt and their game crashing every single time, without fail, on medium-juiced content. It is not possible.

Challenge to the "all it takes is a ≤4070 for max settings and max juice" crowd:
Spoiler
PLEASE upload a video of a 4K, 350% quant T16 with 100% Delirium, Beyond, Legion, and Harbinger, while using a Headhunter, on a 4070. Just use any old CPU, since all it takes is a 4070.
I'm struggling to come up with new goals to keep me playing this game.
|
|
Why settle for anything less than a quantum computer?
And let's not forget to update it every 6 months like clockwork. |
|
It's amazing how some people manage to call graphics issues and the lack of graphics optimization "complexity".
|
|
" This is just stupid. No one is talking about "max settings". I have one of those 4070s: a modern, mid-to-high-range card that runs 'all' titles out there on high/ultra at 1440p (and even some at 4K). In PoE, I have most settings on low, except my resolution, which is 3440x1440. The game drops me down to 20 FPS every time I meet a Harbinger. Hell, it activates dynamic resolution in freakin' town. If you think that is OK in any way, shape, or form on a modern, mid-to-high-range GPU and a solid CPU, you are just... wrong. I've been where you are: I spent years without problems, and even defended the game a few times regarding performance. But enough is enough. It's worse than ever out there. You are indirectly saying that players SHOULD have a better card than a 4070 to get over 40 FPS, you know that, right? Sometimes, just sometimes, you should really consider adapting to the world, instead of demanding that the world adapt to you.
|
|
" What do you expect from the game forum? It's hard to understand a topic if you haven't faced it yourself. Without that experience, you cannot say whether a given level of performance is adequate for the scene. On Probation Any%
|
|
" The bulk of processing power in a video game goes to visuals. Damage calculations, encounter possibilities, general die rolls and shit: even a Pentium 2 could handle those.

And the thing is, effects and particles in PoE are not great; they aren't super detailed, so there shouldn't be an issue with having even hundreds of them on screen, even on older hardware. To compare with other games: Sins of a Solar Empire can have literal thousands of ships firing on screen and still doesn't eat anywhere near as much hardware as a double Harbinger boss. StarCraft 2 can also have thousands of units on screen firing at one another for a fraction of the hardware. FF7 can have double digits of far more detailed monsters on screen casting spells and STILL doesn't eat as much hardware as juiced encounters.

I can perfectly understand that GGG is an indie developer and they are almost certainly stuck with legacy code from when PoE first came out, but the way it keeps moving BACKWARDS is just ridiculous. Okay, maybe it's too much to ask for a developer with a single title on their belt to improve optimization, but at the very least the game shouldn't move BACKWARDS. |
|
" Rendering stacked translucent materials is heavy as hell, and that's exactly what happens constantly in PoE. On Probation Any%
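For anyone wondering why stacked translucency is so expensive: every blended layer has to be shaded and composited on top of what's already there, so fill cost grows linearly with the number of overlapping layers, whereas depth-tested opaque geometry shades each pixel roughly once. A back-of-the-envelope sketch; the layer count and screen coverage below are illustrative assumptions, not profiled PoE numbers:

```python
# Rough overdraw estimate for alpha-blended effects: each stacked
# translucent layer adds a full shade-and-blend pass over the pixels
# it covers. Layer counts and coverage here are made-up examples.

def blended_fill_cost(width: int, height: int, layers: int,
                      coverage: float = 1.0) -> int:
    """Pixels shaded per frame for `layers` translucent effect
    layers, each covering `coverage` of the screen."""
    return int(width * height * layers * coverage)

opaque = blended_fill_cost(3440, 1440, 1)        # depth-tested opaque pass
juiced = blended_fill_cost(3440, 1440, 30, 0.5)  # 30 half-screen effect layers

print(opaque)           # 4953600 pixels shaded
print(juiced)           # 74304000 pixels shaded
print(juiced / opaque)  # 15.0x the fill work
```

Thirty overlapping half-screen effects is 15x the fill-rate work of the base scene, and a juiced Harbinger or Legion fight can plausibly stack far more than that.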
|
|
For Nvidia RTX users: do you play with some sort of video running on your second monitor? And have you perhaps tried that new, fancy Video Super Resolution?
Super Resolution will use 35-50% of your GPU power if it's turned on and detects a video. :-D
|
|
" That was true some time ago, but there are multiple tricks for creating translucent surfaces on the market now, and interactive lighting is the norm in video games by this point. All three games I mentioned have convincing lighting and shadow effects that interact with environmental light sources in more convincing ways than PoE does with its units. Even if PoE can't employ modern tricks because the old engine can't handle them, not having the option to disable this is by itself idiotic programming, since disabling interactive lighting to save performance is standard in PC gaming. |
|