Setsul: You are aware that most GPUs have "target temperatures" like 83°C? Are you going to stop buying those too?
So you want to buy a CPU with a boost mechanism designed to pour on voltage as long as the temperatures are safe, which AMD considers anything below 90°C; you want to overclock it, cool it with a single-tower air cooler, and not have it reach 70°C under load.
Do you see how that might be a problem?
Accept the temperatures or disable PBO, maybe even boost, and don't even think about overclocking.
My question was solely meant to clarify whether or not I should be worried about such temperatures being reached. Obviously, if those are the intended temperatures, like the "target temperatures" I am indeed aware of on GPUs, and AMD deems them perfectly normal and safe, I am willing to accept them. I had already heard that Ryzen chips run quite hot in general, so I wasn't scared or anything; asking to make sure doesn't hurt for a 450-euro CPU upgrade.
I think I've said a couple of times that I'm not really intending to overclock the 5800X anyway; PBO should probably be my go-to, with a Noctua NH-U12S or be quiet! Dark Rock 4. Unless you think I should get a dual-tower cooler for PBO? Again, it's not so much about the temperatures per se: I'm not going to freak out if I see 80°C in Hardware Monitor, but I am if I get a shutdown. I also need to clarify whether Ryzen Master or the BIOS is best for tuning boost, voltage and the resulting temperatures. Do you know more about this?
Setsul: What do you mean, the architecture is unoptimized in terms of surface area? If AMD wanted a larger area they could just go back to 14nm, where Intel is stuck, and produce 8-core CPUs that need 200W instead of 100W. Are those easier to cool thanks to their larger surface area? Well, not really, because 200W.
What do you mean, "some sort of nerfed 5950-5900X"? How does having the same TDP imply them being similar CPUs? Why did you think CPUs with the same architecture would be completely different? What is the 5600X in your opinion, since it got a different TDP? What are the 5800 and 5900 (non-X) with their 65W TDP? A beefed up 5600X? Because those are all the same chips. 5600X and 5800(X) is one 8 core chip and the 5900(X) and 5950X are two chips. Yes, two chips are exactly twice as large as one.
Yes, to absolutely no one's surprise having twice the cores at half the power consumption each spread across twice the area (because, you know, the cores are the same size?) is easier to cool. You can get that by simply underclocking the 5800X massively and running it at 3.775 GHz when all cores are active, just like the 5950X would, instead of the 4.450 GHz the 5600X and 5800X usually run at with all cores active.
If you overclock the 5950X to get the same clock rate as the 5800X on all cores then suddenly, magically, the power consumption doubles and it is much harder to cool.
Why do you expect a BIOS update to lower temperatures?
I must admit I completely fucked up at clarifying what I meant with that; I'm not even sure now what it was. I guess I mostly wanted to show that I'm not asking here stupidly, without having done any personal research and taking your free time for granted. Thanks for the explanation.
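Just to put rough numbers on the clock-vs-power trade-off you described, here's a quick back-of-the-envelope sketch assuming dynamic power scales roughly with cores × frequency × voltage²; the clock figures are the all-core numbers you quoted, but the voltages are purely illustrative guesses on my part, not measured values:

```python
# Back-of-the-envelope dynamic power scaling: P ~ cores * f * V^2.
# Clock figures are the all-core numbers quoted above; the voltages
# are illustrative assumptions, not measurements.

def relative_power(cores: int, freq_ghz: float, volts: float) -> float:
    """Relative dynamic power in arbitrary units (P ~ cores * f * V^2)."""
    return cores * freq_ghz * volts ** 2

r5800x_stock   = relative_power(8,  4.450, 1.30)   # assumed ~1.30 V all-core
r5950x_stock   = relative_power(16, 3.775, 1.00)   # assumed ~1.00 V all-core
r5950x_pushed  = relative_power(16, 4.450, 1.30)   # 5950X forced to 5800X clocks

base = r5800x_stock
print(f"5800X stock:            {r5800x_stock / base:.2f}x")
print(f"5950X stock all-core:   {r5950x_stock / base:.2f}x")
print(f"5950X @ 4.45 GHz:       {r5950x_pushed / base:.2f}x")
```

With those assumed voltages the 5950X at its stock all-core clock lands at roughly the same total power as the 5800X (spread over two chips), while pushing it to 5800X clocks comes out at about twice the power, which is exactly the "much harder to cool" point as I understand it.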
Setsul: Well for PCIe 4.0 to matter for SSDs you'd need an SSD that saturates the bandwidth of PCIe 3.0 x4. So if you don't plan on getting an SSD capable of more than 4 GB/s the difference will probably be marginal. The USB 3.2 is more interesting. SLI support is kind of pointless since you'd first need to find an nVidia GPU that even supports SLI these days. x8 vs x4 would make a difference for Crossfire though, if you were planning on doing that.
I'm pretty sure I'm getting the Mushkin Pilot-E 500, which maxes out at 3.5 GB/s, so I probably won't benefit from it in that department. No GPU upgrade is on my mind for the next one or two years, let alone two GPU upgrades, so I don't think I'm interested in securing potential for Crossfire or SLI. As for USB 3.2, I guess I would care about it when I do backups, but since I only do them once every couple of weeks (maybe a bit more often if I've been working a lot), unless the gain in time is massive, I probably don't mind sticking with PCIe 3.0. Am I missing any potential impact from USB 3.2 outside of that? I can't think of any in gaming at least.
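For my own sanity I did a quick back-of-the-envelope on the x4 link ceiling versus that drive. The per-lane rates below are the standard PCIe 3.0/4.0 figures (8 and 16 GT/s with 128b/130b encoding); real-world throughput will be a bit lower still due to protocol overhead, so treat these as upper bounds:

```python
# Rough PCIe x4 bandwidth ceiling vs. the SSD's rated sequential speed.
# Per-lane raw rates: PCIe 3.0 = 8 GT/s, PCIe 4.0 = 16 GT/s,
# both with 128b/130b encoding.

def pcie_x4_gbps(gt_per_s: float) -> float:
    """Approximate usable GB/s for an x4 link after 128b/130b encoding."""
    lanes = 4
    encoding = 128 / 130
    return gt_per_s * lanes * encoding / 8  # bits -> bytes

ssd_seq = 3.5  # GB/s, Mushkin Pilot-E 500GB rated sequential speed

for gen, rate in (("PCIe 3.0", 8.0), ("PCIe 4.0", 16.0)):
    cap = pcie_x4_gbps(rate)
    print(f"{gen} x4: ~{cap:.2f} GB/s ceiling, "
          f"drive at {ssd_seq} GB/s uses {ssd_seq / cap:.0%}")
```

So the 3.5 GB/s drive sits just under the ~3.9 GB/s PCIe 3.0 x4 ceiling, which matches your point that 4.0 would be marginal for it.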
Setsul: Mindfactory is reliable.
Thanks!