The priority of an AAA game developer is to deliver as much graphical fidelity as possible within a given compute budget, not to consume the least compute for a given level of fidelity. If they "optimized that stuff better", the outcome wouldn't (and shouldn't!) be lower usage of system resources, but rather fitting in even more graphical detail while still maxing out every resource.
And they have every reasonable right to demand all the system resources available: a game is an immersive experience that is usually the only thing of importance running on the system at the time, and the whole point of those greatly expanded resources is to be spent on gains in visual quality. There's no reason not to use all the compute power of what would have been considered a supercomputer just a few years ago.