Your rig + performance reports, let's have it, people

here
i7 4790K
32 gb
GTX 1660 Super
4K monitor

ultra settings, including shadows. I was mostly getting around 25-30 fps, which is fine by me. Had a few MAJOR stutters, which I don't know if they were caused by the server or a graphics bandwidth bottleneck.

seems like that upgrade isn't really worth it.

my only gripe is that I'm still not sure my ultra settings look the same as they would on a 4080.

3090
i9-11900k
64gb ddr4
nvme

game runs fine but has tons of traversal stutters

at 4k it absolutely cooks my GPU. the memory junction was reaching 104C. in 1440p the temps are fine though.

it seems like it needs some optimization and the stuttering is annoying as hell but overall it’s decent.

the cinematics being locked to 30 fps is crap, though

4080 MSI
I9-13900k
64GB DDR5 5000MHz
Lian li dynamic evo case
Main gear 1000W PSU
Corsair elite h150i
Samsung 980 pro 2TB nvme
Msi pro z790A mobo

Ran it on ultra and 2k 165hz

please include performance, I'm guessing it was a smooth ride tho


Yeah, other than a few lag spikes here and there it was flawless. Temps stayed cool on both my CPU and GPU; I don't think my GPU broke over 58C, with a hot spot of 69-72 depending. CPU was at a cool 48-57C. Frames never seemed to move other than a couple of times when phasing people into town while teleporting out of a dungeon.

I9 13900KF
RTX 4090
32GB DDR5 5200 MHz RAM
2TB SSD NVMe

Game ran really well, not a single stutter during the 20h or so that I played. The only issue I had was that the CPU temp spiked when in menus/UI/vendor windows. :beers:

Asrock Taichi Z390
Intel i5 9600K
32GB DDR4 3400 (4x8GB)
RTX 4070 Standard Founder’s Edition
LG C2 Gaming TV 43"



Performance & Settings:

Everything maxed out.
Resolution set to 4K.
FPS Pre-DLSS2/Frame Gen: 85-100
FPS Post-DLSS2/Frame Gen: 105-115
FPS Post-Launch /w DLSS3/Frame Gen expectations: 115-130 FPS
TEMPERATURES: GPU never exceeded 65c/150f. CPU temps barely change from idle temps.

Wait, you playing 4k on ultra with a 1660?
Not throwing shade, just asking: How?

I played on “medium”, afraid of high temps.

Edit: To stay on topic: I'm on a Ryzen 5 2600, GTX 1660, 16 GB RAM. As said, I played on “medium” with no problems or bugs. The game ran pretty well, but in the beta I had some small stutters when leaving cutscenes back to the game.

i5-3570k at 4.4GHz
GTX 1080
32GB DDR3 1600MHz RAM
PCIe 4.0 NVMe SSD running at PCIe 3.0 x4 (motherboard limitation)

1440p High settings with Textures increased to Ultra, no FSR
It usually runs at 70-120 fps, with dips into the 50s sometimes.

RTX 3080
i9-11900k
64gb ddr4
nvme

All drivers up to date, and I never have problems with any other games. But I could never get past the Diablo logo; the screen went grey and just locked up.


Stutters will always be local hardware issues. Connection “stutters” will manifest as rubberbanding or character models suddenly skipping around all over the place to catch up to where you last performed an action.
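To illustrate the distinction with a toy example (purely hypothetical numbers and names, nothing to do with the game's actual netcode): the client keeps predicting movement locally, and when a server update finally arrives after a long gap, the correction it has to apply is large, so the character visibly snaps, which is what players read as rubberbanding.

```python
# Toy sketch of why a laggy connection shows up as rubberbanding rather than stutter:
# the client keeps predicting forward, and when a late server update arrives,
# the correction is large and the character "snaps" to the authoritative position.
# All names and numbers here are made up for illustration.

def simulate(update_gaps):
    client_x = 0.0          # position the player sees locally
    server_x = 0.0          # authoritative position on the server
    speed = 5.0             # units per second the character moves
    for gap in update_gaps: # seconds between server updates reaching the client
        client_x += speed * gap          # client keeps predicting at full speed
        server_x += speed * gap * 0.8    # server says we actually moved less
        correction = server_x - client_x
        client_x = server_x              # snap to the authoritative position
        print(f"gap={gap:.2f}s  correction={correction:+.2f} units")

# Steady 50 ms updates: corrections are tiny, movement looks smooth.
simulate([0.05] * 4)
# A 1.5 s stall: one large correction, which the player sees as rubberbanding.
simulate([0.05, 1.5, 0.05])
```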

Doesn’t matter what you have, be it
GTX 1060/1070/1080
RTX 2060/2070/2080
RTX 4060/4070/4080
The settings will look the same, but your performance will drop significantly based on the GPU’s capabilities.

An upgrade is definitely warranted because the 1660 is dated.
If you can find one, I do believe the RTX 4070 is still going for $599.
I dunno what your power supply situation is, but you'll need a minimum of 650 watts.

Screenshot of my Afterburner OSD:

https://i.imgur.com/aTix2VX.jpg

7800x3d
ddr5
3070Ti
34" ultrawide

Ultra settings/high-res pack - 140ish fps, but I usually set my fps cap lower to match my refresh rate. That made about a 10-degree/60-watt difference on my card.

The 8 GB of memory on the card seemed to do fine.

i5 11600K
RTX 3070TI
32GB DDR4 RAM
M.2 NVME SSD
1440P dual display

Diablo IV on Ultra settings runs smooth as butter. During cutscenes, the cut down to 30 fps made it look weird.

i9-10900K @ 3.70/4.90 GHz
RTX 2080 Super
32 GB
New Alienware curved 34" OLED monitor AW3423DWF

Had all the graphics set to ultimate and it ran fine and looked great.


System Specs
CPU: i7 7700
RAM: 16 GB
GPU: ASUS Dual Geforce RTX 4070
Monitor: 1440p 165 Hz Gsync
PSU: 650W

Performance
1440p, Ultra Settings, DLSS3 Balanced:
CPU%: 70-85
RAM: 11.6 GB
GPU%: 50
GPU Power Draw: 112W
VRAM: 11 GB
Average FPS: 165

General Notes
The CPU bottlenecks. When loading new areas there's a noticeable sluggishness for a few seconds, but overall it's not annoying enough to justify upgrading the CPU, mobo, RAM, etc. Very happy that I upgraded from a GTX 1060 3 GB. I was only getting about 45 frames on the same settings before I swapped the card out.

Yep, everything set to max. I was shocked myself too. I'm not saying it was a smooth ride, but it was pretty good.
I think there should be vids coming on YouTube later on that might prove it.

i7-4770k @4.8ghz watercooled
EVGA 1660 Super SC +140 on Core,+900 on Memory watercooled
32GB DDR3 G.Skill CL8 (forget the exact timings)
Sata SSD’s
MSI 32" 2k (1440) monitor gsync enabled

Overrode the in-game settings with an Nvidia profile mixed between high and ultimate; never dropped below 70 fps.

i7 4790
16gb ram
Radeon RX 580

I noticed nothing bad except Sunday, when it was too laggy to play.
No graphics problems at all.

Four-year-old Alienware laptop with 16GB RAM, an SSD, and a 2070, output to an external 3440x1440 widescreen monitor, with an Xbox Series controller.

Running the default configuration settings the game picked.

Did not collect performance details because I never felt the need to look. Game ran buttery smooth, looked beautiful, and had no crashes, hangs, lag, rubber-banding, etc.

I would say system thermal impact was “medium” based on the fan level, and overall, the performance and requirements felt very similar to modern WoW.

They seem to have done an excellent job on the technical side of things in the end.

I'm finding it interesting how, until now, I'm the only Ryzen boy in the entire thread.