Hi, here are my impressions of DLDSR. I'm using an RTX 3060 with a 3440x1440 monitor, and TSW2 typically runs at 50-70 FPS. Two DLDSR modes appeared in the GPU control panel: 1.78x (4587 x 1920) and 2.25x (5160 x 2160). The first resolution doesn't seem to work with TSW2: my screen goes black immediately after setting it in the game, and the only thing I can do is a hard reset. The second one does work and improves the graphics quite a bit; there's far less flickering going on in the distance. Sadly my FPS dropped to 35-40, which is less than I'm comfortable with. So, unless I figure out a way to make the 1.78x resolution work, I'll give DLDSR a pass for now.
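In case it helps anyone work out what these modes actually render at: the DLDSR factor is a pixel-count multiplier, so each axis scales by its square root. "1.78x" is really 16/9 (4/3 per axis) and "2.25x" is 9/4 (3/2 per axis). A quick Python sketch of the arithmetic (the function name is just for illustration, not anything from the NVIDIA tooling):

```python
def dldsr_resolution(width, height, linear_scale):
    """Each axis scales by sqrt(DLDSR factor); results are
    rounded to whole pixels, matching the control panel."""
    return round(width * linear_scale), round(height * linear_scale)

# 3440x1440 ultrawide with the two advertised DLDSR factors:
print(dldsr_resolution(3440, 1440, 4 / 3))  # (4587, 1920), the "1.78x" mode
print(dldsr_resolution(3440, 1440, 3 / 2))  # (5160, 2160), the "2.25x" mode
```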
My old case was very constrained in terms of air movement: the GPU was running at 85°C and the memory at 95°C, and worse yet, all that hot air was heading straight into the CPU. I've replaced it with a lovely airflow-oriented case that supports a riser to get the GPU away from the CPU, with plenty of fans, and my GPU came down to 75°C and the memory to 85°C. I then altered the fan profile and it now runs at 65°C GPU and 80°C memory.
I gave it a try too and I think I'll stick with 150% screen scaling in the game; it looks better to me. That, and the 2x resolution nearly makes my desktop unusable.
I am using the 1.78x mode, dropped the in-game upscaling to 100% (previously 150%), and the game looks identical to me while gaining about 10-15 FPS.
Apologies all, as this is a bit off topic, but paul.pavlinovich, can you tell me which case you have so I can see what you mean, please? I have a Bequiet 500DX, which has a mesh front, and a fair few fans in it. I suspect what is happening is that the CPU cooler is interfering with airflow around the GPU, but I'm not sure.
Yes, I've been trying all the different combinations: 1.78x, 2.25x, upscaling and smoothness settings. My 4K display maxes out at 60 FPS, so the only gain I see is maybe slightly better graphics (though that could be wishful thinking).
A quick Google suggests quite a lot of the 30xx cards have bad pads as you say, Gigabyte included. The suggestion is to replace/upgrade them. Might give it a go. I mean, what's the worst that can happen? These cards are cheap and easy to get hold of, right?!
Hi Pete, I have a Thermaltake View 71 Snow https://www.thermaltake.com.au/view-71-tempered-glass-snow-edition.html and my 3080 Ti is mounted on a riser, so the hot air from the GPU rises past the CPU and gets exhausted out the top. When I changed to this case my temps dropped from 85°C GPU / 95°C memory to 75°C GPU / 85°C memory. Since then I've put in an extra intake fan and tweaked the GPU fan curve to be slightly more active at lower temps: the fans come on at 55°C and the curve is about 10% taller. It's a bit louder now, but not bad, and I now have 65°C GPU and 70°C memory, which will be much better for longevity.

5cip, I would be concerned if I had a 110°C chip. That is very hot. Most commercial chips are rated to around 70°C, with industrial parts perhaps as high as 85°C and military up to 125°C. The GPU and its memory may be able to handle those temps if they were designed for them, but what about the cheap silicon around them, not to mention that this space heater sits in the same box as your CPU and mobo? They will not be designed for that. It could get even more interesting if you've still got spinning disks. Paul
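For anyone wanting to replicate that fan-curve tweak: the idea is just to shift the curve so the fans start at a lower temperature and run harder at each point. A minimal sketch of how tuning tools typically evaluate a curve, via linear interpolation between (temperature, fan %) points; the specific numbers below are illustrative, not Paul's actual curve:

```python
def fan_percent(temp_c, curve):
    """Linearly interpolate fan duty (%) from a list of
    (temperature °C, fan %) points, sorted by temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# Illustrative curve: fans come on at 55°C and the whole
# curve sits roughly 10% above a typical stock profile.
curve = [(55, 30), (65, 50), (75, 75), (85, 100)]
print(fan_percent(70, curve))  # 62.5
```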
paul.pavlinovich, my GPU doesn't climb up to 110°C; I only meant that this is the possible maximum temp for the memory. Happily, the highest temps are between 82°C and 90°C when I play high-quality games with all settings maxed out. Most games sit between 60°C and 75°C, and the GPU remains in most cases between 50°C and 70°C.
Hi paul.pavlinovich, thanks for the pic and the detailed explanation. I'll certainly have a think about whether a vertical mount is an option for me. After watching a few YT videos, I bought some new thermal pads. It doesn't look too bad if you are careful with the fan leads, so I will probably give this a go too when I have time. I need to do a bit of benchmarking beforehand to ensure that it does improve the RAM temps and (even more worrying) to make sure none of the other temperatures rise. Pete
I would put it in the "not sure, don't do it" bucket. I know that it's near impossible to fry an Intel CPU these days: you can run it with no heat sink at all and it'll just limp along pretending to be a 386. But I don't know about GPU hardware, or what it might do if it ran away thermally. I use FurMark for testing GPUs; if FurMark can't cook them, nothing can. Paul