Practically speaking, I don't see (so to speak) a 16K display as a likelihood, and 8K is going to be a long way off. The market is not likely to support it, since the vast majority of users won't be able to distinguish between 4K and 8K, let alone 12K or 16K. A 16K display would have roughly 133 million pixels and land somewhere north of 700 ppi on a typical desktop-sized panel. That's a lot of pixels for a manufacturer to get "right" (how many dead/bright pixels would be acceptable?), a lot of graphics chip capability... and streaming media content in native 8K-16K is going to demand a whole lot more internet bandwidth.
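Just to put rough numbers on that (a quick back-of-the-envelope sketch; the 24-inch panel size, 24-bit color, and 60 fps are my own assumptions, not anything official):

```python
# Rough numbers only -- assumes "16K" means 15360 x 8640 (16x the pixels
# of 4K UHD), a 24-inch 16:9 panel, 24-bit color, and a 60 fps feed.
import math

h_px, v_px = 15360, 8640
diag_in, aspect = 24.0, 16 / 9

total_px = h_px * v_px                                  # 132,710,400 pixels
width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)  # panel width, inches
ppi = h_px / width_in                                   # ~735 ppi at 24 inches

raw_gbps = total_px * 24 * 60 / 1e9                     # uncompressed video

print(f"{total_px:,} pixels, ~{ppi:.0f} ppi, ~{raw_gbps:.0f} Gbit/s raw at 60 fps")
```

That works out to roughly 190 Gbit/s before any compression at all - several times what HDMI 2.1's 48 Gbit/s can carry, never mind a home internet connection.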
It's all an illusion anyway. Nearly all media is consumed via lossy compression schemes like JPEG, MP3, and MP4, and few consumers are any the wiser. Do we really need ultra-high-resolution systems at the end of a "good enough" pipeline?
It's not about what someone can discern when cranking the volume to 11 or peering through a magnifying glass. It never has been. It's always been about the minimum thresholds of detection for an average person under typical conditions (roughly 24-30 frames per second for smooth motion, about 1 decibel as the just-noticeable difference in loudness, and so on).
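For what it's worth, here's the arithmetic behind that 1 dB figure (just a sketch of the standard decibel formulas, nothing more):

```python
# A 1 dB step, expressed as plain ratios -- the just-noticeable difference
# in loudness corresponds to roughly 26% more acoustic power.
power_ratio = 10 ** (1 / 10)      # ~1.26 (dB is 10*log10 of a power ratio)
pressure_ratio = 10 ** (1 / 20)   # ~1.12 (20*log10 for amplitude/pressure)

print(f"1 dB ~= {power_ratio:.2f}x power, {pressure_ratio:.2f}x sound pressure")
```

In other words, it takes about a quarter more power before the average listener even notices a change - which is exactly the point about thresholds.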
Sure, Moore's Law will keep chugging along. At some point we'll transition to 128- or 256-bit microprocessors, too. 3 TB of primary storage will seem quaint...
No, display resolution is going to plateau, like the 300 dpi standard for offset printing, the 24-36 MP DSLR sensor, and 20 Hz-20 kHz audio response - there are simply better ways to spend money and resources.