The question is not whether those two chips arrive in a Mac Studio (it would be a major surprise if they didn't), but whether something even higher-end arrives - something that would justify the Mac Pro case because the Studio can't cool it.
Apple DOES seem to have some way of linking more than two chips (whether it is as fast as UltraFusion is an open question) - they were fooling with it in Apple Car prototypes until that project got cancelled. They were talking about a "quad-Ultra" (or 8x Max) in the car. Of course, it is possible that the (in)ability to go beyond Ultra is what cancelled the car AND made the Mac Pro disappointing. Maybe they had an idea and it didn't work? Or the performance compromises were too great? Or the yields were too low?
The other thing to consider is what actually needs quad-M3 Max or greater performance. As a photographer, I'm absolutely shocked by how good the M3 Max is, both in performance and efficiency. I'm using 100 MP raw image files (the largest files readily available - there is ONE camera used in some very high-end fashion and advertising applications that shoots 150 MP files, but it's $50,000). If the M3 Max handles the largest possible still photo files in large numbers with speed and aplomb (and will almost certainly handle the next generation of sensors, which should go up to ~160 MP in semi-reasonable cameras), still photography is no longer a reason for a "beyond-Max" machine. There were certainly advantages to an M1 or M2 Ultra over a Max, but those seem to have disappeared for still photography in the M3 generation. Of course we haven't seen an M3 Ultra, but the M3 Max is real-timing even complex edits on very large files.
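To put "100 MP" in hardware terms, here's a quick back-of-envelope sketch. The numbers are my assumptions, not from any camera maker: a 16-bit Bayer raw on disk, and a 3-channel 32-bit float working copy once the editor demosaics it.

```python
# Rough sizes for still-photo editing. Assumptions (mine, not measured):
# 16-bit single-channel raw on disk, 3-channel 32-bit float copy in RAM.

def raw_size_mb(megapixels, bits_per_pixel=16):
    """Approximate uncompressed raw size in megabytes."""
    return megapixels * 1e6 * bits_per_pixel / 8 / 1e6

def working_size_mb(megapixels, channels=3, bits_per_channel=32):
    """Approximate in-memory demosaiced working-copy size in megabytes."""
    return megapixels * 1e6 * channels * bits_per_channel / 8 / 1e6

print(raw_size_mb(100))      # ~200 MB on disk for a 100 MP raw
print(working_size_mb(100))  # ~1200 MB in RAM for the float working copy
```

Even a full-resolution float working copy is only a little over a gigabyte, which is why a Max with plenty of unified memory and bandwidth chews through these files.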
Video? I'm no professional, but I don't THINK it'll make much difference at 4K, with the possible exception of some super-complex effects. 8K - sure, but 8K distribution simply doesn't exist. Even IMAX projection is enhanced 4K, and, while 8K TVs for the home exist (8K home projectors don't), the only ways to get 8K content into them are to use a high-end computer as a streamer or to load still images from a USB memory key. It would take a HECK of a memory key to play 8K video, even if you had the files in the first place. No cable or satellite box can handle 8K, no Internet streamer short of a full-on computer can, either, and even if you DID use a computer, almost no 8K content exists for download or streaming at any price.
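For a sense of scale on the memory-key point, here is a rough sketch of what an hour of 8K video weighs. The bitrates are my assumptions, not from any delivery spec: ~100 Mbps is in the ballpark that modern codecs need for 8K, and the uncompressed figure falls out of the frame geometry.

```python
# Back-of-envelope storage for 8K video. Assumed numbers, not from a spec:
# 8K is 7680 x 4320; 10-bit 4:2:0 averages ~15 bits per pixel.

def compressed_gb_per_hour(mbps):
    """Gigabytes for one hour of video at a given compressed bitrate."""
    return mbps * 1e6 * 3600 / 8 / 1e9

def uncompressed_gbps(width=7680, height=4320, fps=60, bits_per_pixel=15):
    """Raw (uncompressed) data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

print(compressed_gb_per_hour(100))  # ~45 GB per hour at an assumed 100 Mbps
print(uncompressed_gbps())          # raw 8K60 rate in Gbps, ~30 Gbps
```

So even generously compressed, an hour of 8K is tens of gigabytes - playable from a big, fast drive, but not the sort of thing casual memory keys or set-top boxes are built for.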
AI? Sure, but training big LLMs seems to take a supercomputer. 3D, especially 3D animation? Possibly, but how often are people doing that on the desktop? Isn't high-end 3D rendering generally done on render farms - your desktop needs to be powerful enough to do the interactive parts, but the big renders get pushed off to the server room?
It seems like there is a narrowing window between "an M3 Max is plenty fast" and "that's going to be pushed to the server room."