Nope. It’s built into the display. Apple quietly lengthened the cable by 2mm in the 2018 models, so Apple is aware of it, whether they admit it or not. I’d like to hear them try to BLAME the 2018 on the screen manufacturers! Like they couldn’t get 2mm-shorter ones! So yeah: document, document, document. Creating a work history at the Apple Store is the best way to get things covered. People keep opting out of AppleCare, but I’ve put it on every device it was offered for over the years, and when I walk in, especially armed with knowledge of what the problem already is, the sheer number of people affected, and printouts of GPU or kernel panics, or of the Bridge OS crashes from the T2 chip mysteriously left out of the new 2019 iMacs, you know, just like the 2mm-longer cables in 2018, THEY know.
Sadly, take your pick: the T2 chip, with multiple daily-to-weekly Bridge OS crashes unsolved since the January 2018 iMac Pro release, plus the total elimination of data-recovery ports for the SSD on the board, or the reliable 2017 T1 chip, which pretty much fixed everything about the 2016 models while adding the 10-bit H.265 (HEVC) hardware acceleration necessary for almost all native 4K streaming like Netflix 4K (which also needs a 7th-gen iGPU to confirm HDCP 2.2 DRM compliance), but comes with the 2mm-shorter display cable.
The one you can certainly rule out is the 2016 8-bit Skylake MBP: popping speakers, first-gen butterfly keyboard, the shorter display cable, and it can only do a hybrid decode of 10-bit H.265 that still puts more on the CPU than claimed despite graphics-driver updates; until Metal 2 in High Sierra it was entirely a software decode (decompress and play) done on the CPU at roughly 50x the load of H.264.
Translation: you can use brute horsepower to decode 10-bit H.265 even on an Ivy Bridge, but at 50x the CPU load to play that same 50%-smaller H.265 1080p video file, you’ll be hard-pressed not to hurt battery life. (4K is pointless on a 2K Retina laptop, by the way; it just makes the CPU also transcode and downscale to 2K while decompressing still-giant 60GB files.) So even if you have a 4K external monitor, definitely don’t mirror the display on a 2016!
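A rough back-of-envelope for why 4K playback is wasted work on a “2K” Retina panel: the machine has to decode every 4K pixel and then scale the frame down to the panel anyway. The panel resolution below is the 15" MBP’s 2880x1800; the rest is plain arithmetic, just a sketch of the point, not a benchmark.

```python
# Decode work vs. what the panel can actually display. Illustrative only.

UHD = (3840, 2160)        # 4K UHD source
RETINA_15 = (2880, 1800)  # 15" MacBook Pro panel ("2K"-class)

def pixels(res):
    """Total pixels per frame at a given (width, height) resolution."""
    w, h = res
    return w * h

decoded = pixels(UHD)          # pixels decoded per frame
displayed = pixels(RETINA_15)  # pixels the panel can actually show

print(f"decoded {decoded:,} px/frame, displayed {displayed:,} px/frame")
print(f"~{decoded / displayed:.1f}x more decode work than the panel can use")
```

Every frame costs ~1.6x the decode work the panel can even show, on top of the downscale itself, and that’s before the 50x software-decode penalty for 10-bit H.265 on older chips.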
I’d save in 1080p and let the monitor do the upscale (again, not mirroring; extend the display) while being able to stream at an even smaller 10-bit 1080p H.265 bitrate. Just make sure you match chroma and refresh rates or the upscale will look like garbage. Done right, it’s basically just doubling pixel density, and basically indistinguishable from native 4K, even on a Roku Ultra, if you give the Roku the 4K60Hz HDMI port it requires on the TV, go into the Roku’s hidden menus, and match up chroma to 4:2:0 or 4:2:2, etc.
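The “doubling pixel density” claim checks out arithmetically: 1080p to 4K UHD is an exact 2x scale in each dimension, so every source pixel maps cleanly onto a 2x2 block of panel pixels, which is one reason a well-matched upscale can look so close to native. A minimal sketch:

```python
# 1080p -> 4K UHD is a clean integer (2x) upscale in both dimensions.

SRC = (1920, 1080)   # 1080p source
DST = (3840, 2160)   # 4K UHD panel

scale_w = DST[0] / SRC[0]   # horizontal scale factor
scale_h = DST[1] / SRC[1]   # vertical scale factor

assert scale_w == scale_h == 2.0, "not a clean integer upscale"
print(f"each 1080p pixel becomes a {int(scale_w)}x{int(scale_h)} block of 4K pixels")
```

Compare that with, say, 720p to 4K (a 3x scale) or 1080p to 1440p (1.33x, non-integer), where the scaler has to interpolate fractional pixel positions and softness creeps in.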
I’ve yet to upgrade my 1080p TVs because my Bravia still gets so many compliments 10 years later, but 90% of the people complaining about upscale results are using 4K30Hz HDMI ports on their TVs with the same Roku requiring 4K60Hz, and not matching chroma, etc. I now have a 4K HDR Apple TV, and its lack of MKV support meant my server had to decompress, demux, and extract to MP4 and THEN re-encode and COMPRESS the video before sending. (Encoding 10-bit x265 is even MORE taxing on a CPU than decoding, so my QNAP TS-451 with its Intel dual-core Bay Trail J1800 and Intel HD 4000, while having Intel Quick Sync hardware acceleration for H.264, couldn’t handle the 10-bit x265 demux and re-encode.) I had to use Infuse 5 as a workaround because it supports MKV and about every audio and video codec patent out there, because it pays for them! Recently, though, Intel must have made some graphics-driver changes or “something” allowing “direct play,” sending the MKV to be demuxed by the much more capable A10X chip, because I’m now able to stream x265 MKV using Plex. Plex was also rebuilt from the ground up, so it could be something on their end, but Roku supports MKV, and about every 10-bit x265 HEVC release in existence is packaged as AAC in MKV. That’s why I felt forced to go Roku Ultra (or Nvidia Shield), but the Apple TV interface and remote are so dang good that, now that it’s demuxing MKV differently somehow, I’m finding it hard to go back to the Roku Ultra. My CPU monitor definitely sits at 5% instead of 98%, so it’s direct playing, at least until it hits the ATV 4K somehow!
(I know I went above and beyond the question, but I’m so tired of people thinking native 4K is necessary. Even a 2160p 10-bit HEVC movie is 60GB, so my 18TB NAS RAID 5 volume would only hold 275–330 movies! Meanwhile, with 6TB NAS drives costing $250 x 4 plus the cost of the NAS itself, that’s about $2,000 to store 275 native 4K movies, even in files 50% smaller than H.264! I now save and store 1080p in 50%-smaller files, and the bitrate also shrinks, so everything streams like a dream, and I can even handle more remote streams out of the house to family and friends. If they want to upscale, their streaming boxes or 4K TVs can do it, and done right it’s so much more efficient and inexpensive.)
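The storage math above, spelled out. The 60GB-per-movie figure is from the comment itself; the 1080p size is a loose assumption (roughly a quarter of the 4K file) just to show the scale of the difference:

```python
# NAS capacity at 4K vs 1080p HEVC file sizes. Rough figures, not measurements.

VOLUME_TB = 18                     # usable RAID 5 volume from the comment
MOVIE_4K_GB = 60                   # typical 2160p 10-bit HEVC movie (per the comment)
MOVIE_1080P_GB = MOVIE_4K_GB / 4   # assumed 1080p HEVC size -- illustrative guess

movies_4k = (VOLUME_TB * 1000) // MOVIE_4K_GB        # how many 4K movies fit
movies_1080p = (VOLUME_TB * 1000) // MOVIE_1080P_GB  # how many 1080p movies fit

print(f"4K HEVC:    ~{movies_4k} movies on {VOLUME_TB} TB")
print(f"1080p HEVC: ~{int(movies_1080p)} movies on the same volume")
```

At 60GB each, that’s ~300 movies, right in the 275–330 range above, versus well over a thousand at 1080p, on the exact same $2,000 of hardware.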