You are aware that most GPUs have "target temperatures" like 83°C? Are you going to stop buying those too?
So you want to buy a CPU whose boost mechanism is designed to pour on voltage for as long as the temperatures are safe (and AMD considers anything below 90°C safe), you want to overclock it, cool it with a single tower air cooler, and still not have it reach 70°C under load.
Do you see how that might be a problem?
Accept the temperatures or disable PBO (maybe even boost altogether), and don't even think about overclocking.
What do you mean, the architecture is unoptimized in terms of surface area? If AMD wanted a larger area they could just go back to 14nm, where Intel is stuck, and produce 8 core CPUs that need 200W instead of 100W. Are those easier to cool thanks to their larger surface area? Well, not really, because 200W.
What do you mean, "some sort of nerfed 5950X/5900X"? How does having the same TDP imply they are similar CPUs? Why did you expect CPUs with the same architecture to be completely different? What is the 5600X in your opinion, since it got a different TDP? What are the 5800 and 5900 (non-X) with their 65W TDP? A beefed-up 5600X? Those are all the same silicon. The 5600X and 5800(X) use a single eight-core chiplet (with two cores disabled on the 5600X), while the 5900(X) and 5950X use two chiplets. Yes, two chiplets are exactly twice as large as one.
Yes, to absolutely no one's surprise, twice the cores at half the power each, spread across twice the die area (the cores are the same size, after all), are easier to cool. You can get the same effect by massively underclocking the 5800X: run it at the 3.775 GHz an all-core-loaded 5950X would hold instead of the roughly 4.450 GHz the 5600X and 5800X usually sustain with all cores active.
If you overclock the 5950X to the same all-core clock as the 5800X, then suddenly, magically, the power consumption doubles and it becomes much harder to cool.
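A toy model makes that arithmetic explicit. Dynamic power scales roughly with C·V²·f, and since voltage has to rise roughly in step with frequency, per-core power grows roughly with the cube of the clock. Under that (heavily simplified) assumption, doubling the core count at a fixed power budget means dropping the all-core clock by a factor of 2^(1/3) ≈ 1.26, which lands in the same ballpark as the 4.450 vs. 3.775 GHz numbers above; real chips do a bit better because voltage doesn't scale all the way down with frequency:

```python
# Toy dynamic-power model: P_core ∝ f^3 (assuming voltage scales roughly
# linearly with frequency, so P = C * V^2 * f ∝ f^3). A sketch for the
# argument above, not a simulation of an actual Ryzen power controller.

def relative_power(cores: int, clock_ghz: float) -> float:
    """Total package power in arbitrary units under the f^3 model."""
    return cores * clock_ghz ** 3

def equal_power_clock(clock_ghz: float, core_ratio: float) -> float:
    """All-core clock at which `core_ratio` times as many cores draw the
    same total power as the original core count at `clock_ghz`."""
    return clock_ghz / core_ratio ** (1 / 3)

# 8 cores at 4.45 GHz vs. 16 cores at the equal-power clock:
print(equal_power_clock(4.45, 2))  # ~3.53 GHz, near the real ~3.775 GHz
```

Under this model the 5950X's lower all-core clock isn't a "nerf", it's just the same power budget divided over twice the cores.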
Why do you expect a BIOS update to lower temperatures?
Well, for PCIe 4.0 to matter for SSDs you'd need a drive that saturates the bandwidth of PCIe 3.0 x4. So if you don't plan on getting an SSD capable of more than about 4 GB/s, the difference will probably be marginal. The USB 3.2 support is more interesting. SLI support is kind of pointless, since you'd first need to find an nVidia GPU that even supports SLI these days. x8 vs. x4 would make a difference for Crossfire, though, if you were planning on doing that.
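That "4 GB/s" figure falls out of the link parameters: transfer rate times the 128b/130b line-coding efficiency that both PCIe 3.0 and 4.0 use, times the lane count. A back-of-the-envelope sketch (real drives lose a bit more to protocol overhead):

```python
# Usable PCIe bandwidth per direction in GB/s: transfer rate in GT/s times
# the 128b/130b encoding efficiency, divided by 8 bits per byte, times the
# lane count. Packet/protocol overhead is ignored in this estimate.

def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    return gt_per_s * (128 / 130) / 8 * lanes

print(pcie_bandwidth_gbs(8.0, 4))   # PCIe 3.0 x4: ~3.94 GB/s
print(pcie_bandwidth_gbs(16.0, 4))  # PCIe 4.0 x4: ~7.88 GB/s
```

So a drive has to push past roughly 3.9 GB/s before the 4.0 link buys you anything.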
Mindfactory is reliable.