As someone who works on the design, implementation, and testing of a device that draws 6 to 16 amps (16 A only in IEC regions, not North America) from AC mains for long periods of time, I can assure you that power cords can have problems. There are tons of cheaply (and some shoddily) made power cords, and for the poorly made units it doesn't take a lot of heat to cause problems. I'm sure many of you have seen this kind of thing happen with an extension cord at some point in your life. A connection with higher-than-optimal resistance (male plug to outlet, or female plug to your power supply's inlet) can generate a significant amount of heat. It doesn't take much resistance if the current is high.
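To put a rough number on that last point: contact heating follows P = I²R, so the dissipated power grows with the square of the current. Here's a quick back-of-the-envelope sketch (the resistance values are illustrative guesses, not measurements from any real cord):

```python
def contact_heat_watts(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated as heat at a resistive contact: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

# Illustrative contact resistances: a good contact is on the order of a
# milliohm; a corroded or loose one can be tens to hundreds of milliohms.
for r in (0.001, 0.05, 0.25):      # ohms (assumed values for illustration)
    for i in (6, 16):              # amps, matching the 6-16 A range above
        print(f"{i:2d} A across {r*1000:5.0f} mOhm -> {contact_heat_watts(i, r):6.2f} W")
```

At 16 A, even a quarter-ohm of contact resistance dissipates tens of watts in a tiny plastic-encased volume, which is plenty to soften or char the housing.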
The typical cheap power cord's contacts are held in place almost entirely by the thermoplastic. There is a cage that holds them in alignment while they're being molded, but it's not really part of the mechanical design beyond that; it's a flimsy piece. When the cord gets warm/hot, the thermoplastic softens and the contacts gain a lot of freedom of movement. Aggravated if the cord is under off-axis tension, of course. And it tends to be an escalating kind of problem over long periods of use: more heat means more contact movement, more contact movement tends to lead to higher resistance, and more resistance means more heat. If it gets severe enough, you eventually wind up with failure.
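That escalating loop can be sketched as a toy model. Every constant here is invented purely to show the shape of the feedback (heat raises resistance, which raises heat); it is not a thermal model of any actual connector:

```python
def runaway_sketch(current_a: float = 16.0,
                   r0_ohm: float = 0.01,
                   growth_per_watt: float = 0.02,
                   steps: int = 6) -> float:
    """Toy positive-feedback loop: each step, the heat dissipated at the
    contact (P = I^2 * R) nudges the contact resistance upward.
    growth_per_watt is a made-up coupling constant for illustration."""
    r = r0_ohm
    for step in range(steps):
        p = current_a ** 2 * r           # heat at the contact this step
        r *= 1.0 + growth_per_watt * p   # softer plastic, more movement, more R
        print(f"step {step}: {p:.2f} W dissipated, R now {r*1000:.2f} mOhm")
    return r

final_r = runaway_sketch()
```

The point is only that resistance and heat feed each other, so the growth compounds instead of leveling off.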
Much of the time, the problem originates at the plug-to-mains-outlet connection. For example, 40-year-old outlets that have high contact resistance for whatever reason (corrosion, mechanically damaged contacts, etc.). I've rarely seen problems at the other end for devices that are stationary nearly all of the time, as long as the original cord, or one of equal or higher current rating, is used. These cords are typically (but not always) UL or ETL listed and hence third-party validated for their intended application. But... I'd put my hand on the cord and both ends to feel for heat as a start. It's entirely possible something is wrong on the primary or secondary side of your power supply that is causing too much current to be drawn. And/or you've been using cords that were intended for lower current, or were built poorly, or previously damaged, or...
I suppose it's also possible that your power supply is heating the plug for other reasons... fan(s) need replacement, or heatsinks are all covered in dust/smoke tar/whatever, etc.
I guess my whole point is that heat damages cord ends, and the amount required to damage them depends on the cord. If you want to investigate, start by feeling the cord ends at cold boot and again over the next two to four hours of normal use. If it were me, I'd investigate... I would not want a situation where fire is a possible result.
If you have an ammeter, use it. If you have an IR temperature gun, use it.
Hope this helps. I've left out a ton of stuff in the interest of time and space, but hopefully I've given you enough to at least check the basics and help you sleep easier.