I don't see the base Studio as a good value, simply because for $500 more you get the same chip in the 16-inch MacBook Pro, but with a really good display and mobility.
And we were told that Apple Silicon would save a lot of money, but it looks like Apple is pocketing all the savings. The price delta from the M1 Max to the M1 Ultra is an insane $1,400 or $2,200 depending on the GPU, so maybe the yields are really bad and that's why the price is so high.
 
And we were told that Apple Silicon would save a lot of money, but it looks like Apple is pocketing all the savings. The price delta from the M1 Max to the M1 Ultra is an insane $1,400 or $2,200 depending on the GPU, so maybe the yields are really bad and that's why the price is so high.

The Raspberry Pi is still quite affordable.
 
Okay, who will be the first person to say a desktop needs to support more than five monitors?
[Image attachment: tempImageFC13Ne.jpg]


I'll be the first! And I haven't even made use of the pair of HDMI ports on the back yet! 🤓
 
I don't see the base Studio as a good value, simply because for $500 more you get the same chip in the 16-inch MacBook Pro, but with a really good display and mobility.
Compared to the high-end Intel Mac mini, it's good value. And it's not $500 more; it's actually $1,000 more.
 
Oh boy, are we all video editors who need five screens to stare at, or music producers? No, we're not. We're just people who want to get our work done quickly, display a couple of browsers and maybe one or two applications, and watch a video or two. Does anybody really need five displays, other than some recluse creating music in a basement somewhere?
 
Oh boy, are we all video editors who need five screens to stare at, or music producers? No, we're not. We're just people who want to get our work done quickly, display a couple of browsers and maybe one or two applications, and watch a video or two. Does anybody really need five displays, other than some recluse creating music in a basement somewhere?
I don't think the purpose of stating how many monitors can be connected to the machine is to set an expectation that this is how people should configure their personal or even Studio™ setup. I think they are simply trying to convey the amount of bandwidth it is capable of. If this were a PC without any Thunderbolt ports, touting how many displays it could drive would seem like a strange marketing claim to make. But it is fairly well known that a 5K or 6K display requires a significant amount of data throughput, so if you can do that fivefold, that is definitely a potent machine. And since you can trade the bandwidth that four additional displays would consume and use it instead for moving files to an external RAID array or other system, you know that you're not going to run into a bottleneck. That's what they're attempting to get across to everyone.
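To put rough numbers on that throughput point, here is a quick Python sketch of the uncompressed pixel-data rate for 5K and 6K panels. It assumes 60 Hz refresh and 30-bit color, and ignores blanking intervals and Display Stream Compression, so treat these as ballpark figures only:

```python
# Back-of-the-envelope: uncompressed pixel-data rate for a display, in Gbit/s.
# Ignores blanking intervals and DSC, so real link requirements differ.
def display_gbps(width, height, refresh_hz=60, bits_per_pixel=30):
    return width * height * refresh_hz * bits_per_pixel / 1e9

five_k = display_gbps(5120, 2880)  # LG UltraFine 5K-class panel
six_k = display_gbps(6016, 3384)   # Pro Display XDR-class panel

print(f"5K @ 60 Hz: ~{five_k:.1f} Gbit/s")  # ~26.5
print(f"6K @ 60 Hz: ~{six_k:.1f} Gbit/s")   # ~36.6
```

Multiply that by five displays and you are well past what a single 40 Gbit/s Thunderbolt 4 link could carry uncompressed, which is why the five-display claim speaks to the machine's total bandwidth rather than to any one cable.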
 
Oh boy, are we all video editors who need five screens to stare at, or music producers? No, we're not. We're just people who want to get our work done quickly, display a couple of browsers and maybe one or two applications, and watch a video or two. Does anybody really need five displays, other than some recluse creating music in a basement somewhere?
I currently have seven on my desk. I could use more. I need a minimum of one for server-side source code, a few for client-side, and several for libraries and such. It's nice to be able to look up functions. I also need one or more for the actual client-side application to run and one for the server output.

I also use a dedicated computer for dealing with email, social media sites and this cool rumors site that talks about Apple.
 
Oh boy, are we all video editors who need five screens to stare at, or music producers? No, we're not. We're just people who want to get our work done quickly, display a couple of browsers and maybe one or two applications, and watch a video or two. Does anybody really need five displays, other than some recluse creating music in a basement somewhere?
It’s not just for monitors on a desk. It’s also for large LED walls and regular 16:9 playback off a single Mac.
 

Attachments

  • 8BE67AD9-C43D-47FF-9D52-E7DC9664CCE9.jpeg (364.9 KB)
The Ferrari floor mats are very on-brand for this setup!
Thanks! They did seem appropriate.

BTW, those are a pair of LG UltraFine 5K displays flanking the top, with the remaining three being Pro Display XDRs, making for 90,565,632 pixels across all five displays.
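That pixel total can be sanity-checked from the native resolutions (a quick Python check, assuming 5120×2880 for the LG UltraFine 5K and 6016×3384 for the Pro Display XDR):

```python
# Total native pixels across the five-display setup described above.
lg_5k = 5120 * 2880   # LG UltraFine 5K
xdr = 6016 * 3384     # Pro Display XDR

total = 2 * lg_5k + 3 * xdr
print(total)  # 90565632
```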

As I've received a few PMs regarding the workstation: yes, she's loaded.


Stay Blessed everyone!
 
Does the new display support daisy chaining? I’d hope so.
As the three "Hub" ports are USB-C only, with the only Thunderbolt 4 port being the input from the computer, you might be able to daisy-chain one additional display (4K or lower), as long as it doesn't exceed the total available capacity of the original TB4 signal between the computer and the Studio Display.

So if you're asking whether you would be able to daisy-chain multiple Studio Displays together, like you could with Apple's first display to use Thunderbolt, the Apple Thunderbolt Display, then the answer is no, because a Studio Display requires a connection to a Thunderbolt port, and the only thing an additional monitor can connect to on a Studio Display is a USB-C port.
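As a rough illustration of why one extra 4K display might (or might not) fit downstream, here is a back-of-the-envelope sketch in Python. The figures are uncompressed pixel-data rates at 60 Hz, ignoring DisplayPort tunneling overhead, blanking, and DSC, so they are only indicative:

```python
TB4_GBPS = 40.0  # nominal Thunderbolt 4 link rate

def display_gbps(width, height, refresh_hz=60, bits_per_pixel=24):
    # Uncompressed pixel-data rate; ignores blanking and protocol overhead.
    return width * height * refresh_hz * bits_per_pixel / 1e9

studio_5k = display_gbps(5120, 2880, bits_per_pixel=30)  # Studio Display, 10-bit color
extra_4k = display_gbps(3840, 2160)                      # hypothetical downstream 4K, 8-bit

print(f"5K + 4K: ~{studio_5k + extra_4k:.1f} of {TB4_GBPS:.0f} Gbit/s")  # ~38.5
```

Even on this optimistic accounting it is close to the limit; with real protocol overhead and the displays' actual signaling, the practical answer depends on the specific hardware.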
 
And we were told that Apple Silicon would save a lot of money, but it looks like Apple is pocketing all the savings. The price delta from the M1 Max to the M1 Ultra is an insane $1,400 or $2,200 depending on the GPU, so maybe the yields are really bad and that's why the price is so high.
Part of that is the memory upgrade. You have to pick 64GB of RAM when going to the Ultra.
It's pricey, but when you compare it to, say, an Intel chip, you don't get everything that comes with an SoC. And Apple isn't selling this chip to other companies or to consumers directly, so the economies of scale aren't there either.

But even with that said, where are you going to get an integrated CPU/GPU/memory package with 800GB/s of RAM bandwidth that can compete with not just a top-end Intel i9 or Ryzen, but also a 3060 or 3090 graphics card? It's not happening. It's at least $400 for the graphics card, and another $500 or so for the CPU, with no memory included, and not at that speed, nor that much memory for the graphics card. Even if you went with a 128GB Ultra, you're not getting a 64GB graphics card on the PC side, let alone 128GB (say 120GB, if you're giving at least 8GB to macOS and related apps).

I think it's very hard to spec a properly comparable system on the PC side to this SoC. Yes, it's expensive. However, what can you build to properly compete with it today? No Intel/AMD CPU uses so little power (watts) while performing as well as the Ultra does. No video card supports as much memory or has as fast access to the CPU. No dual-socket CPU setup works the way the Ultra does. It's in a different league.

I suspect the SoC going into the Mac Pro and iMac Pro will be more of an M1 Ultra/Xeon-type CPU. Maybe no, or very few, GPU cores. Double the CPU cores, with no "efficiency" cores. More ML/AI and video encode/decode cores. And a lot of PCIe lanes for all the ports and cards you could want. And upclocked from 3.2 to 4GHz.
 
As the three "Hub" ports are USB-C only, with the only Thunderbolt 4 port being the input from the computer, you might be able to daisy-chain one additional display (4K or lower), as long as it doesn't exceed the total available capacity of the original TB4 signal between the computer and the Studio Display.

So if you're asking whether you would be able to daisy-chain multiple Studio Displays together, like you could with Apple's first display to use Thunderbolt, the Apple Thunderbolt Display, then the answer is no, because a Studio Display requires a connection to a Thunderbolt port, and the only thing an additional monitor can connect to on a Studio Display is a USB-C port.

Damn! Seems like a VERY odd oversight/limitation by Apple. Then again, the TB4/USB4 bandwidth is the limitation.

When will TB5 debut?
 
Not able to do HDMI 2.1 on a $4K to $8K computer? Seriously?

So does this mean it won't support 8K? What do professionals who are making 8K video do?

So a still-hobbled HDMI 2.0 port instead of 2.1! 2.1 would have been a nice gesture.

Not hobbled.
It can do 8K and higher, likely as specs continue to outpace content or even content creation.
Only the HDMI port can’t; it’s 2.0. 2.1 expands bandwidth to about 80% of Thunderbolt 4’s 40Gb/s.
So it’s not hobbled; professional editors shooting in 8K can still output 8K, and even edit it in multiple streams and displays. Seriously.

But a single port, the HDMI, can’t send an 8K signal. You can, in fact, send four 6K outputs from the Max configuration of the 14/16” MBPs, as well as drive the near-4K internal display, and still use your HDMI 2.0 for a lowly 4K 60Hz scratch display or color grader if you’re invested!

I’m sorry to be so late to the discussion, but I had to laugh at the extraordinary number of folks cropping from their 10K and 12K REDs to 8K output (not proxied) for real-time editing purposes! :) And I’m not trying to be rude… but never in the history of humanity has a faster, meaner, quicker, feistier laptop existed that is more capable of video, still, and audio editing. I make a living doing the first and third; the second is my hustle and hobby. And I have never enjoyed a more pleasurable, efficient, and less time-consuming workflow than I have with my 16” 32/32 Max, likely an unnecessary upgrade itself. A friend and business partner has the 16” model with the Pro chip and is getting essentially similar results in most tasks, and he’s a former Windows-only user.
Whether we’re using FCP or the Adobe Creative Suite/Cloud, DaVinci or Affinity, audio, video, or effects, from 2K to 8K, you simply can’t find a peer anywhere in the known galaxy… unless you’re talking about the Studio/Ultra and the law of diminishing returns. ;)

I have easily saved 50% of my time and increased our output by 70% using two 16” Maxes and two Studios. They replaced a pair of 2019 16” 5600M machines and a pair of 27” iMacs.

Still, the fact remains: content is king. And in 8K today, there’s a total of about zero content available that’s engaging, entertaining, or educational. Screensaver motion shots aside, who cares whether the HDMI is 2.0 or 2.1? Tomorrow it’s going to be 3.0, but with Thunderbolt 4/USB4 you have more throughput than any HDMI version can handle, and bidirectionally.
HDMI is used for audio as well, without video, and its current spec is more than capable of handling the highest-resolution audio possible, plus 4K @ 60Hz for video if you’re in a pinch to hook up a projector; any ‘pro’ is better off opting for the TB4 I/O than HDMI, probably from now on. We’ll see with the USB 5 specs whether HDMI will still be around in another five years. Remember DVI, component, composite, S-Video: each the best of its era, and each better than hooking a pair of stripped copper wires to two screws on the back of the TV!
 
Not hobbled.
It can do 8K and higher, likely as specs continue to outpace content or even content creation.
Only the HDMI port can’t; it’s 2.0. 2.1 expands bandwidth to about 80% of Thunderbolt 4’s 40Gb/s.
So it’s not hobbled; professional editors shooting in 8K can still output 8K, and even edit it in multiple streams and displays. Seriously.

But a single port, the HDMI, can’t send an 8K signal. You can, in fact, send four 6K outputs from the Max configuration of the 14/16” MBPs, as well as drive the near-4K internal display, and still use your HDMI 2.0 for a lowly 4K 60Hz scratch display or color grader if you’re invested!

I’m sorry to be so late to the discussion, but I had to laugh at the extraordinary number of folks cropping from their 10K and 12K REDs to 8K output (not proxied) for real-time editing purposes! :) And I’m not trying to be rude… but never in the history of humanity has a faster, meaner, quicker, feistier laptop existed that is more capable of video, still, and audio editing. I make a living doing the first and third; the second is my hustle and hobby. And I have never enjoyed a more pleasurable, efficient, and less time-consuming workflow than I have with my 16” 32/32 Max, likely an unnecessary upgrade itself. A friend and business partner has the 16” model with the Pro chip and is getting essentially similar results in most tasks, and he’s a former Windows-only user.
Whether we’re using FCP or the Adobe Creative Suite/Cloud, DaVinci or Affinity, audio, video, or effects, from 2K to 8K, you simply can’t find a peer anywhere in the known galaxy… unless you’re talking about the Studio/Ultra and the law of diminishing returns. ;)

I have easily saved 50% of my time and increased our output by 70% using two 16” Maxes and two Studios. They replaced a pair of 2019 16” 5600M machines and a pair of 27” iMacs.

Still, the fact remains: content is king. And in 8K today, there’s a total of about zero content available that’s engaging, entertaining, or educational. Screensaver motion shots aside, who cares whether the HDMI is 2.0 or 2.1? Tomorrow it’s going to be 3.0, but with Thunderbolt 4/USB4 you have more throughput than any HDMI version can handle, and bidirectionally.
HDMI is used for audio as well, without video, and its current spec is more than capable of handling the highest-resolution audio possible, plus 4K @ 60Hz for video if you’re in a pinch to hook up a projector; any ‘pro’ is better off opting for the TB4 I/O than HDMI, probably from now on. We’ll see with the USB 5 specs whether HDMI will still be around in another five years. Remember DVI, component, composite, S-Video: each the best of its era, and each better than hooking a pair of stripped copper wires to two screws on the back of the TV!
What are you talking about? HDMI 2.1 supports 10K at 120Hz, Dolby Vision, HDR10+, etc. Max transmission bit rate is 48 Gbit/s; max data rate is 42 Gbit/s.


If you're claiming otherwise, then quote your source.

The real reason(s) that HDMI 2.1 isn't supported are:
1. The chips themselves don't support the bandwidth, with max output at 40 Gbit/s!!!
2. It gives Apple another feature to upgrade in the future to push people to upgrade to a newer machine.

Thus, none of the M-series chips support 8K displays so far. Sure, the chips can process 8K, but they can't output it to an 8K display, which is why everyone is baffled: you first have to export the output and offload it to another machine to watch it on an 8K display. Bewildering.
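The bandwidth arithmetic behind the 8K complaint can be sketched quickly (Python; uncompressed at 60 Hz with 30-bit color, ignoring blanking intervals, which would only raise the numbers):

```python
# Uncompressed pixel-data rate for a display, in Gbit/s.
def display_gbps(width, height, refresh_hz=60, bits_per_pixel=30):
    return width * height * refresh_hz * bits_per_pixel / 1e9

eight_k = display_gbps(7680, 4320)
print(f"8K @ 60 Hz: ~{eight_k:.1f} Gbit/s")  # ~59.7
```

Uncompressed 8K60 needs roughly 60 Gbit/s, which exceeds HDMI 2.0's 18 Gbit/s, HDMI 2.1's 42 Gbit/s data rate, and Thunderbolt 4's 40 Gbit/s alike, which is why 8K output in practice leans on Display Stream Compression.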
 