All the way to the right on "More Space", it says "looks like 3200 x 1800".
The five options are:
1600x900 - scaled 3.2x
2048x1152 - scaled 2.5x
2560x1440 - scaled 2x (default)
2880x1620 - scaled 1.78x
3200x1800 - scaled 1.6x
None of them scale 1:1, if that is what you are wondering.
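The arithmetic behind those numbers is simple; here's a small Swift sketch (assuming the 5120x2880 panel of a 5K display):

```swift
import Foundation

// Scale factor is panel width divided by "looks like" width; the HiDPI
// frame buffer is 2x the "looks like" size in each dimension.
// Assumes a 5120x2880 (5K) panel.
let panelWidth = 5120.0
for looksLike in [1600.0, 2048.0, 2560.0, 2880.0, 3200.0] {
    let scale = panelWidth / looksLike    // e.g. 5120 / 1600 = 3.2
    let bufferWidth = Int(looksLike) * 2  // HiDPI draws at 2x
    print("Looks like \(Int(looksLike)) wide: buffer \(bufferWidth) wide, scaled \(String(format: "%.2f", scale))x")
}
```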
Retina, or HiDPI, or "Looks like" modes draw all text and objects at twice the height and width of "low resolution" modes (four times the pixels).
You can see low resolution modes (such as 5120x2880) by holding the option key and clicking "Scaled". Low resolution modes draw text and objects like Macs did before retina existed (regular height and width).
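You can see the same distinction programmatically. A minimal Swift sketch using the CoreGraphics mode list (the kCGDisplayShowDuplicateLowResolutionModes option includes the low resolution duplicates that option-clicking reveals):

```swift
import CoreGraphics

// List every mode of the main display, including low resolution duplicates.
// A HiDPI ("looks like") mode reports a pixel size 2x its point size.
let display = CGMainDisplayID()
let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue] as CFDictionary
let modes = (CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode]) ?? []
for mode in modes {
    let hidpi = mode.pixelWidth == 2 * mode.width
    print("\(mode.width)x\(mode.height) points -> \(mode.pixelWidth)x\(mode.pixelHeight) pixels\(hidpi ? " (HiDPI)" : "")")
}
```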
Objects are drawn into a frame buffer (in the GPU's VRAM) with a certain resolution (for example, 3200x1800 for "Looks like 1600x900").
The frame buffer resolution might be lower (3200x1800) or greater (6400x3600) than the resolution of the display (5120x2880 for a 5K display).
This frame buffer is then scaled by the GPU to produce the output resolution/timing, which must fit in the bandwidth of the connection. Usually the output timing matches the resolution of the display (5120x2880).
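To get a feel for why bandwidth matters, here's a rough back-of-the-envelope sketch in Swift (the ~20% blanking overhead is an assumption; real timings use CVT-RB and the exact numbers differ):

```swift
import Foundation

// Rough bandwidth estimate for a timing. The +20% blanking overhead is a
// ballpark assumption; real CVT-RB timings differ.
func roughGbps(width: Double, height: Double, hz: Double, bpc: Double) -> Double {
    let pixelClock = width * height * hz * 1.2  // active pixels + assumed blanking
    return pixelClock * bpc * 3 / 1e9           // 3 color components
}

// 5K60 at 8 bpc comes out around 25 Gbit/s with this estimate -- well over
// the ~17.3 Gbit/s payload of a DisplayPort 1.2 HBR2 x4 link, which is why
// a single DP 1.2 stream can't carry native 5K60.
print(String(format: "%.1f Gbit/s", roughGbps(width: 5120, height: 2880, hz: 60, bpc: 8)))
```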
However, you can create timings (non-scaled resolutions) that differ from the resolution of the display. The display has its own scaler to scale the output timing to the resolution of the display. Some displays can accept resolutions that are higher than their native resolution. For example, my 4K display can accept a 5K60, 6K48, or 8K30 timing as well as the usual lower resolution timings (480p, 720p, 1080p, 1440p, etc.).
For each timing (low resolution mode), a HiDPI mode also exists using the same timing ("Looks like" half the width and height). For example, the 8K30 timing I created gets an 8K30 low resolution mode and a "Looks like 3840x2160" HiDPI mode.
For every scaled resolution, two modes are created for each timing of the base resolution: a low resolution mode and a HiDPI mode. The base resolution is usually the resolution of the display. For my 4K display, the base resolution is 4K, which has timings for 30Hz, 50Hz, 60Hz, 95Hz, and 120Hz. So a 6400x3600 scaled resolution would have 5 low resolution timings, plus 5 HiDPI timings.
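You can check how many timings each mode size actually gets with the same CoreGraphics list (a sketch; refreshRate can report 0 for some built-in panels):

```swift
import CoreGraphics
import Foundation

// Group the mode list by size to count the timings each resolution gets
// (e.g. 5 low resolution + 5 HiDPI in the 4K example above).
let allModes = (CGDisplayCopyAllDisplayModes(CGMainDisplayID(), nil) as? [CGDisplayMode]) ?? []
let bySize = Dictionary(grouping: allModes) { "\($0.width)x\($0.height)" }
for (size, group) in bySize.sorted(by: { $0.key < $1.key }) {
    let rates = group.map { String(format: "%.0f Hz", $0.refreshRate) }.joined(separator: ", ")
    print("\(size): \(rates)")
}
```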
macOS does not have a method in the UI to show the current output timing. You cannot know that a 5K framebuffer resolution is being scaled to a 4K output timing by the GPU (as happens when an LG UltraFine 5K display is connected with USB-C instead of Thunderbolt) unless you view the timing info in an app like SwitchResX or look at the output of a command like AGDCDiagnose. In the case of the LG UltraFine 5K or iMac 5K display, the 5K timing info is actually faked to appear as a single 5K timing instead of two 2560x2880 timings - but at least you'll know it's 5K total instead of 4K. AGDCDiagnose will show the connections and the original non-faked and non-overridden EDIDs (one for each half of the display).
macOS does not have a method in the UI to show the current output pixel format. It shows the frame buffer pixel format (e.g. ARGB2101010), but the output pixel format could be RGB 8 bpc or YCbCr with or without chroma subsampling. Chroma subsampling happens with my Mac mini connected to my 4K display with HDMI. Chroma subsampling throws away some color information. RGB 8 bpc produces a better image - especially for colored text.
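The arithmetic shows what subsampling throws away; a quick Swift sketch of the average bits per pixel at 8 bpc:

```swift
// 4:2:2 halves the chroma samples horizontally (2 samples/pixel average);
// 4:2:0 halves them both ways (1.5 samples/pixel average); luma stays full.
let formats = [("RGB 4:4:4", 3.0), ("YCbCr 4:2:2", 2.0), ("YCbCr 4:2:0", 1.5)]
for (name, samplesPerPixel) in formats {
    print("\(name): \(samplesPerPixel * 8) bits per pixel")  // 24, 16, 12
}
```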
macOS does not have a method to change the output pixel format. You can use SwitchResX to change the framebuffer pixel format (between millions of colors = 8 bpc and billions of colors = 10 bpc). To change from YCbCr 4:2:0 (chroma subsampling) to RGB 8 bpc, you have to override the EDID to remove the chroma subsampling option. Recently, Apple added an HDR option to the Displays preferences panel. This does change the pixel format (enables HDR color and forces 10 bpc) - so maybe there's a hidden API that someone could use to change the output pixel format.
macOS does not support 6 bpc. Windows Nvidia drivers have that option. It's fewer colors - but macOS can do dithering to handle that so you don't get banding when looking at a gradient. I've seen banding appear and disappear even when using 8 bpc or 10 bpc. It's kind of weird because an app might show banding at one moment but not another. If there's no banding, it's difficult to know whether that's because 10 bpc is being used or because dithering is being used. Some apps might draw using 8 bpc even if the framebuffer pixel format, output signal, and display are all set to 10 bpc.
macOS does not support MST hubs for multiple displays. Windows does. DisplayPort 1.4 MST hubs are more capable than older DisplayPort 1.2 hubs because they support DSC (Display Stream Compression), allowing more and higher resolution displays than would normally fit in the bandwidth of a single non-DSC DisplayPort connection (the hub can convert DisplayPort signals - link rate and number of lanes - and can decompress DSC for older displays). Many USB-C hubs and some recent Thunderbolt 3 docks use an MST hub that is not utilized by macOS.
One thing macOS can do that Windows can't (I think?) is support 6K over Thunderbolt for non-DSC GPUs (using two DisplayPort streams, like the dual-tile 5K displays). Windows can do 6K only using a GPU that supports DSC.