Found the engineer's reddit post he's talking about:
"The 8-pin standard asks for 150W at 12V, so 12.5A. Rounding up a bit you might say that it needs 4.5A per pin. With 9-amp connectors, each one is only at half capacity. In a 600W 12VHPWR connector, each pin is being asked for 8.33A already. If you have 8.5A pins, there is functionally no headroom here, and if you have 9A pins, yeah that's not great either. Those 8.5A pins will fail under real-world conditions such as higher ambient temperatures, imperfect surface cleaning, and transient spikes from GPUs. The 9A pins are not much better. (13A pins are probably fine on their own. Margins still aren't as good as the 8-pin, but they also aren't as bad as 9A pins would be.)
"I firmly believe that this is where the problem lies. These pins (not the 13A ones) are at their limit, and a margin of as little as one-sixth of an amp before a pin is maxed out is far too small for consumer hardware. The safety factor here is abysmal: 9A × 12V × 6 pins = 648W, and with 8.5A pins, 612W. The connector itself is supposedly good for up to 660W, so even assuming they allow a slight overage on each pin, or use slightly better pins than I could find in five minutes on the Molex website (they do), you still only have a safety factor of 1.1x."
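To sanity-check the arithmetic, here's a minimal Python sketch of the per-pin currents and safety factors described above. The pin counts (3 power pins on the PCIe 8-pin, 6 on 12VHPWR) and the per-pin ratings are taken from the post itself, not from a connector datasheet:

```python
# Sanity check of the numbers in the quoted post.
# Assumes the load splits evenly across the power pins, which is the
# best case; real connectors with uneven contact resistance do worse.

def per_pin_current(watts: float, volts: float, power_pins: int) -> float:
    """Current each power pin carries if the load splits evenly."""
    return watts / volts / power_pins

def safety_factor(pin_rating_amps: float, watts: float, volts: float, power_pins: int) -> float:
    """Ratio of what each pin is rated for to what it is asked to carry."""
    return pin_rating_amps / per_pin_current(watts, volts, power_pins)

# PCIe 8-pin: 150 W over 3 power pins at 12 V
print(per_pin_current(150, 12, 3))       # ~4.17 A per pin
print(safety_factor(9.0, 150, 12, 3))    # ~2.16x with 9 A pins

# 12VHPWR: 600 W over 6 power pins at 12 V
print(per_pin_current(600, 12, 6))       # ~8.33 A per pin
print(safety_factor(8.5, 600, 12, 6))    # ~1.02x with 8.5 A pins
print(safety_factor(9.0, 600, 12, 6))    # ~1.08x with 9 A pins
print(safety_factor(13.0, 600, 12, 6))   # ~1.56x with 13 A pins

# Connector-level rating: 660 W capability vs. 600 W draw
print(660 / 600)                         # 1.1x, matching the post
```

The gap is stark: the 8-pin gives each pin roughly a 2x margin, while a 600W 12VHPWR with 8.5A or 9A pins sits at about 1.0-1.1x, which is the whole complaint.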