> Can someone tell me why colorists need something better than an XDR? Are films ever projected or watched on anything that even remotely matches the specs of the XDR, let alone a reference monitor? Not saying they don't, I seriously don't know. It just seems like overkill.

First, you certainly would want even, consistent contrast across the entire screen. Without it, you might darken areas that only look brighter because of blooming artifacts from local dimming. That rules out the XDR for many classes of work unless its local dimming can be disabled.
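To make the blooming point concrete, here's a toy 1-D sketch (the zone size, panel contrast ratio, and luminance targets are made-up illustrative numbers, not XDR specs): a bright highlight forces its backlight zone up, and the LCD can only attenuate so far, so neighboring pixels that should be near-black get lifted.

```python
# Toy 1-D model of local-dimming "blooming": a bright highlight shares a
# backlight zone with pixels that should be near-black, so their black level
# gets lifted. All numbers below are assumptions for illustration only.

ZONE_SIZE = 4          # pixels driven by one backlight zone (assumed)
PANEL_CONTRAST = 1000  # native contrast the LCD achieves over its backlight (assumed)

# Target luminance per pixel, in nits: one 1000-nit highlight in a black field.
target = [0.01] * 8 + [1000.0] + [0.01] * 7

displayed = []
for zone_start in range(0, len(target), ZONE_SIZE):
    zone = target[zone_start:zone_start + ZONE_SIZE]
    backlight = max(zone)               # zone must be bright enough for its brightest pixel
    floor = backlight / PANEL_CONTRAST  # darkest value the panel can reach over that backlight
    displayed.extend(max(t, floor) for t in zone)

for i, (t, d) in enumerate(zip(target, displayed)):
    flag = "  <-- lifted black (bloom)" if d > t * 2 else ""
    print(f"pixel {i:2d}: target {t:8.2f} nit -> shown {d:8.2f} nit{flag}")
```

If you're judging shadow detail next to that highlight, you're judging the dimming algorithm, not the image.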
Second, proper calibration throughout the entire chain is necessary to give the consumer at the end a fighting chance of seeing the correct image. If any part of the chain is off, the error is baked in and basically not recoverable.
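As a rough sketch of why that matters (the stage names and error percentages here are invented for illustration): when each stage applies a slightly wrong power function, the gamma errors multiply rather than cancel, so a few small deviations add up to a visible shift.

```python
# Toy model of calibration error compounding through a chain. Each stage is
# nominally a pass-through but has a small gamma error; since (x**a)**b ==
# x**(a*b), the errors stack. Stage names and values are assumptions.

stages = {
    "camera/log encode": 1.03,   # 3% gamma error (assumed)
    "grading monitor":   0.96,   # -4% (assumed)
    "delivery encode":   1.02,   # 2% (assumed)
    "consumer TV":       1.05,   # 5% (assumed)
}

signal = 0.18  # mid-gray, normalized 0..1
value = signal
for name, gamma_err in stages.items():
    value = value ** gamma_err
    print(f"after {name:18s}: {value:.4f}")

print(f"\nmid-gray drifted from {signal:.3f} to {value:.4f} "
      f"({(value / signal - 1) * 100:+.1f}%)")
```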
Third, displays are constantly advancing. A product produced today may be sold and resold many times over on ever-improving display technology, so it behooves producers to get it right, right now.
Fourth, 8K is being pushed, and while I think it's wildly overkill, a 6K monitor clearly can't display 8K content at 1:1.
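The pixel math, for the record (7680x4320 is the standard 8K UHD raster; 6016x3384 is the XDR's native resolution):

```python
# Quick pixel math for the "6K can't show 8K at 1:1" point.

uhd_8k = (7680, 4320)   # 8K UHD raster
xdr_6k = (6016, 3384)   # Pro Display XDR native resolution

scale = min(xdr_6k[0] / uhd_8k[0], xdr_6k[1] / uhd_8k[1])
print(f"8K frame:  {uhd_8k[0] * uhd_8k[1] / 1e6:.1f} megapixels")
print(f"XDR panel: {xdr_6k[0] * xdr_6k[1] / 1e6:.1f} megapixels")
print(f"8K must be downscaled to about {scale:.2f}x per axis to fit, "
      f"so there is no 1:1 pixel mapping.")
```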
Fifth, think of the variance in brightness you see in real life. Monitors today still don't come close to representing the brightness and contrast of sunlight streaming through your window next to a dark corner of your desk. At 1600 nits, the XDR actually doesn't reach the peak localized brightness of top TVs today.
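For a ballpark sense of that real-world range (using the Lambertian approximation L = E * reflectance / pi, with typical rather than measured lux and reflectance figures):

```python
# Back-of-envelope luminance of a sunlit surface vs. a dark desk corner,
# compared against a 1600-nit peak. Illuminance and reflectance values are
# typical ballpark assumptions, not measurements.
import math

def luminance_nits(illuminance_lux: float, reflectance: float) -> float:
    """Approximate luminance of a matte (Lambertian) surface."""
    return illuminance_lux * reflectance / math.pi

sunlit_paper = luminance_nits(100_000, 0.8)  # direct sun ~100k lux, white paper (assumed)
dark_corner  = luminance_nits(100, 0.1)      # shadowed desk corner, dark surface (assumed)

print(f"sunlit paper:   ~{sunlit_paper:,.0f} nits")
print(f"dark desk area: ~{dark_corner:.1f} nits")
print(f"scene ratio:    ~{sunlit_paper / dark_corner:,.0f}:1")
print(f"XDR peak (1600 nits) is ~{1600 / sunlit_paper:.0%} of the sunlit level")
```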
There are more reasons, but that is enough for now.