I thought I was finally able to get 6k @ 60Hz using this StarTech HDMI cable:
6ft (2m) HDMI 2.1 Cable 8K - Certified Ultra High Speed HDMI Cable 48Gbps - 8K 60Hz/4K 120Hz HDR10+ eARC - Ultra HD 8K HDMI Cable - Monitor/TV/Display
www.startech.com
But I was only getting 5K (5120x2880) @ 60Hz, and macOS was just upscaling the resolution.
May I ask how you determined that the RX580 can only output 5K 60Hz, but not 6K 60Hz?
In general, macOS will only tell you the rendering resolution and the UI resolution (with refresh rate). It won't tell you the resolution of the signal actually transmitted to the monitor.
e.g. if you are using 5K HiDPI 60Hz, the screen is rendered at 10240x5760 60Hz, with everything drawn at 5120x2880 in 2x scale. macOS then downscales that to 6144x3456 60Hz and sends this signal to the monitor. Therefore, you are actually using the full 6K 60Hz, but the UI looks like 5K 60Hz.
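To make that pipeline concrete, here is a minimal sketch in Python. The function and its name are purely illustrative (this is not a macOS API), using the numbers from the example above:

```python
# Minimal sketch of the HiDPI scaling pipeline described above.
# hidpi_pipeline is a hypothetical helper, not a real macOS API.

def hidpi_pipeline(ui_res, panel_res, scale=2):
    """Return (rendering resolution, transmitted resolution) for a HiDPI mode."""
    rendering = (ui_res[0] * scale, ui_res[1] * scale)  # drawn at 2x scale
    transmitted = panel_res                             # downscaled to the panel
    return rendering, transmitted

# "5K HiDPI" UI on a 6K (6144x3456) panel:
rendering, signal = hidpi_pipeline((5120, 2880), (6144, 3456))
print(rendering)  # (10240, 5760) -> what macOS reports as the rendering resolution
print(signal)     # (6144, 3456)  -> the actual 6K 60Hz signal on the cable
```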
macOS should not upscale a 5K 60Hz signal to 6K 60Hz for you. That part is usually done by the monitor. If your graphics card is really outputting 5K 60Hz, then it's 5K 60Hz; that is the final signal generated by the OS, with no further upscaling. Once the monitor gets the signal and realises it won't fill the entire screen, it may allow you to run in a boxed mode, or it will stretch the image to fill the 6K screen.
If macOS could upscale a 5K 60Hz signal to 6K 60Hz, that would mean the card can output 6K 60Hz, in which case everything should be able to render at 6K 60Hz from the very beginning.
For example, the following is from my own cMP. As you can see, an RX580 can output 7680x2160 @ 144Hz (87.5% more demanding than 6K 60Hz in raw pixel rate). No such monitor existed anywhere in the world when I made this screen capture; it's only possible because I enabled the HiDPI function, which pushed the rendering resolution beyond my monitor's native resolution.
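The 87.5% figure is simple pixel-rate arithmetic (taking 6K as 6144x3456, as in the example above):

```python
# Pixel-rate comparison: 7680x2160 @ 144Hz vs 6144x3456 @ 60Hz.
rate_a = 7680 * 2160 * 144  # 2,388,787,200 pixels per second
rate_b = 6144 * 3456 * 60   # 1,274,019,840 pixels per second
print(rate_a / rate_b)      # 1.875 -> 87.5% more demanding
```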
macOS can take this kind of request, and it is entirely normal. e.g. many people use 2560x1440 HiDPI on a 3840x2160 monitor. macOS will actually render the screen at 5120x2880, then downscale that to 3840x2160 and send it to the monitor. But in the macOS system report, you will only see the rendering resolution (5120x2880) and the UI resolution (2560x1440); it won't show you that the signal is actually transmitted at 3840x2160 (which means the monitor is fully utilised).
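You can see the same limitation programmatically. Here is a minimal sketch using CoreGraphics through pyobjc (assuming `pyobjc-framework-Quartz` is installed): even at this level, only the UI (points) and rendering (pixels) resolutions of the current mode are exposed, not the signal on the cable.

```python
# Query the current display mode via CoreGraphics (pyobjc).
import Quartz

display = Quartz.CGMainDisplayID()
mode = Quartz.CGDisplayCopyDisplayMode(display)

ui_w = Quartz.CGDisplayModeGetWidth(mode)       # UI resolution, in points
ui_h = Quartz.CGDisplayModeGetHeight(mode)
px_w = Quartz.CGDisplayModeGetPixelWidth(mode)  # rendering resolution, in pixels
px_h = Quartz.CGDisplayModeGetPixelHeight(mode)
hz = Quartz.CGDisplayModeGetRefreshRate(mode)

print(f"UI resolution:        {ui_w}x{ui_h} @ {hz:.0f}Hz")
print(f"Rendering resolution: {px_w}x{px_h}")
# The resolution transmitted to the monitor (e.g. 3840x2160) is not reported here.
```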
Also, different macOS versions may behave very differently with these "non-standard" resolutions. In some versions, everything shows up for the user to select. In others, you may need something like BetterDisplay to select what you want. In some extreme cases, you may need to build the entire profile in SwitchResX in order to use the exact resolution you want.