Why buy a $1599 dGPU when you can just rent it from the cloud.
Cloud is great when your loads are highly variable, as you are flexible in scaling up and down. It's a poor deal when they aren't.
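The variable-vs-steady trade-off can be made concrete with a back-of-the-envelope sketch. The $1599 card price is the one quoted in this thread; the hourly rate and lifespan are hypothetical placeholders, not real cloud pricing:

```python
# Rough cloud-vs-buy comparison. The $1599 card price comes from the
# thread; the hourly rate and lifespan are hypothetical placeholders,
# not real cloud pricing.

CLOUD_RATE_PER_HOUR = 2.00   # assumed on-demand GPU instance rate
HARDWARE_PRICE = 1599.00     # dGPU price mentioned in the thread
LIFESPAN_MONTHS = 24         # assumed useful life of the card

def monthly_cloud_cost(hours_used_per_month: float) -> float:
    """Cloud cost scales with actual usage."""
    return hours_used_per_month * CLOUD_RATE_PER_HOUR

def monthly_ownership_cost() -> float:
    """Owned hardware costs the same whether it sits idle or busy."""
    return HARDWARE_PRICE / LIFESPAN_MONTHS  # = $66.625/month

# Bursty use (20 h/month): cloud costs $40 vs ~$66.63 owned -> cloud wins.
# Steady use (160 h/month): cloud costs $320 vs ~$66.63 owned -> owning wins.
for hours in (20, 160):
    print(hours, monthly_cloud_cost(hours), monthly_ownership_cost())
```

Under these assumptions the crossover sits at roughly 33 hours a month; change the rate or lifespan and the break-even moves with it.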
Why buy a $1599 dGPU when you can just rent it from the cloud.
Why buy anything to own physically, when you can just pay more over its lifespan for something that can be taken away with little notice? Great deal.

Why buy a $1599 dGPU when you can just rent it from the cloud.
Yeah, I'm sure syncing terabytes upon terabytes of video data to the cloud will be super fun, just so you can use said "cloud" GPU to get your work done.

They could outsource the AMD/Nvidia dGPU requirements to the cloud.
Cashflow-wise it would be cheaper, and when newer dGPUs become available you can easily subscribe to those.
That is how workflows change over time, with tech driving prices down.
Why buy a $1599 dGPU when you can just rent it from the cloud.
I'm no expert. Can someone explain to me how a maxed-out Mac Studio differs from the 2023 Mac Pro? Because to me, they look pretty similar in specs. Not only that: the Studio is also quite a bit cheaper.
The only real difference is that the Mac Pro has PCIe slots. The people who need these slots were never going to put GPUs in them. That's a gamer thing. The people who really need these PCIe slots will fill them with Blackmagic video input cards, some high-end audio cards, or maybe some fiber-optic networking cards. Without these cards, those users can't access the media they work with.
People complain that the MP's PCIe is gen 4 and not gen 5, but the cards these people are using don't have gen 5 versions.
For the vast majority, the Mac Studio is the best deal. The only reason to buy a Mac Pro is if your office is wired with a legacy fiber network or has some other connectivity constraint like that.
What we don't see here is a real person saying "I can't use the Mac because the GPU is too slow." Very few people have use for a GPU faster than what is on the M2 Ultra.
David Lebensfeld, founder and VFX supervisor at Ingenuity Studios, was dubious. “That doesn’t seem like something a VFX studio would use,” he said after I described the product. Nobody on Lebensfeld’s team has expressed interest in the Mac Pro — there has been “zero chatter” about the product, he says.
Lebensfeld’s company is all in on Windows and Linux, and that’s common for studios of Ingenuity’s size. Switching over to the Mac Pro, given its price point, would just be impractical. Lebensfeld gets better value out of Windows PCs, which support the latest GPUs from Nvidia and can be equipped with the exact parts and specs that each team needs. When a part breaks, they can grab another one off the shelf.
In fact, some of the VFX and animation professionals I contacted for this story declined to be interviewed because they simply don’t know much about Macs — they just aren’t widely used in that industry at this point. The reality is that these types of studios need to keep their hardware functional and up to date. Replacing a full Mac Pro system — let alone a fleet of them — regularly would be an absurd cost.
There is a whole market of people who need more GPU power, but Apple doesn't seem interested in it. Too bad, really, because I think if the next Mac Pro only gets a small refresh with the M3 and maybe some more RAM, it's going to be the last one.
https://www.theverge.com/23770770/apple-mac-pro-m2-ultra-2023-review - Worth reading to get some more insight on the state of the current Mac Pro.
The people who need these slots were never going to put GPUs in them. That's a gamer thing.
There is a very narrow group of Mac Pro users who needed a dGPU for 3D rendering and animation, and an equally narrow group of users in scientific research who needed huge amounts of ECC RAM.
Yes, M3 might bring HW RT, we shall see…

Apple is heavily investing in the 3D rendering market; for example, Sonoma adds support for ray-traced curves. The hardware is not quite there yet, because it lacks RT acceleration, but if you look at raw compute, Apple is already competitive with the other vendors. I do hope that future updates will bring quad Max die configurations and possibly replaceable SoC boards.
Scientific research is a different market altogether that needs fairly niche stuff, I don't think it would make much sense for Apple to go for it...
Apple doesn't have to be king of the hill to win. Not sure why Apple must have an AMD/Nvidia GPU killer.

M3 Ultra & M3 Extreme should bring some needed GPU boosts...
N3X, A18 cores, hardware ray-tracing, higher GPU core counts, faster clocks...
This should all result in parity with the performance of one (M3 Ultra) or two (M3 Extreme) high-end AMD/Nvidia GPUs (8900/5090), and Apple silicon GPGPUs could provide compute/render power equal to up to (assuming two Duo cards) four of the aforementioned AMD/Nvidia cards...
Fingers crossed...! ;^p
Apple has the ability to buy years of production from suppliers and manufacturing partners, and they are trend-setters at times, which makes them a big-ticket customer.

Sorry, no, I totally disagree with you. Companies worry about their competitors the most, and big-ticket customers next (and Apple *isn't* a big-ticket customer).
They could outsource the AMD/Nvidia dGPU requirements to the cloud.
Cashflow-wise it would be cheaper, and when newer dGPUs become available you can easily subscribe to those.
That is how workflows change over time, with tech driving prices down.
Why buy a $1599 dGPU when you can just rent it from the cloud.
Yes it is a marketing trick because renting is always more expensive than owning outright.
For example, you can rent an electric scooter, but for three months of use you will pay as much as for a new one in the store. You can also rent an entire Mac, with similar results.
True, but more to the point, why buy a $1599 dGPU when you can rent a $30,000 dGPU from the cloud...?

Why buy a $1599 dGPU when you can just rent it from the cloud.
Why buy anything to own physically, when you can just pay more over its lifespan for something that can be taken away with little notice? Great deal.
Well, there's the slight technical issue that the M2 Ultra only has 16 PCIe lanes available, so a decent dGPU would soak up all of them and leave nothing for all the other things Mac Pro buyers use PCIe slots for.

I seriously think at least the Mac Pro should support external GPUs from AMD and Nvidia. There aren't any technical issues with using AMD/Nvidia GPUs unless Apple doesn't allow it. The M2 Ultra is barely close to an RX 6900 XT, so why not?
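A minimal sketch of that lane-budget argument, taking the 16-lane figure from the post above; the cards and their link widths are hypothetical examples, not a real slot loadout:

```python
# Lane-budget sketch. The 16-lane total is the figure claimed above;
# the cards and their link widths are hypothetical examples.

TOTAL_LANES = 16

CARD_LANES = {
    "discrete GPU": 16,   # a typical dGPU wants a full x16 link
    "video capture": 8,   # e.g. a capture card (assumed x8)
    "audio I/O": 4,       # assumed x4
}

def lanes_remaining(installed):
    """Lanes left over after wiring up the listed cards."""
    return TOTAL_LANES - sum(CARD_LANES[name] for name in installed)

# One dGPU consumes the entire budget, leaving nothing for the
# capture/audio/network cards that Mac Pro buyers actually install.
print(lanes_remaining(["discrete GPU"]))                 # 0
print(lanes_remaining(["video capture", "audio I/O"]))   # 4
```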
...false analogy. First, you're talking about using a short-term rental service for long-term use. Go and talk about leasing a scooter long term and you'd expect to get a better deal - probably one that gets you a shiny new latest-model scooter every year or two. There are all sorts of financial reasons why businesses often choose to lease kit (with bundled service, support, upgrades, and replacements) rather than buy it outright. It can even make sense for individuals to lease cars if they want to replace them every couple of years (and if you're talking about computing kit which will be outdated after 18 months then, yes, you probably want that).

Yes it is a marketing trick because renting is always more expensive than owning outright.
For example, you can rent an electric scooter, but for three months of use you will pay as much as for a new one in the store.
Then by all means buy the hardware equivalent of what you're renting in the cloud - it doesn't have to be a Mac; you've pretty much proved that by using the cloud to start with - and, pretty soon, others will follow suit and the cloud provider will have to re-think their prices. However, I strongly suspect that the true 'total cost of ownership' of the hardware won't be so attractive - and we're talking about business use, where things like support/replacement/service contracts are a necessity. Also, don't forget to allow for the difference between the $1-2k computer you need as a cloud client and the $5-7k+ computer you need to run a high-end GPU.

Cloud is expensive if you have a consistent workload. I use the cloud for training, but running inference can get very expensive in the cloud. 3-4 months of cloud cost pays for the GPU.
I have no idea what you're talking about; it doesn't make any sense. I use both the cloud (for training) and a workstation (non-Mac) and an MBP. Cloud is great for training, because my workload is inconsistent; I don't need to buy an expensive Nvidia H100 or whatever the latest high-end GPU is. I scale up and down, so the cloud is economical. For inference I use a 4090, which can cost like 700 bucks a month to rent. You can recover the cost in 2-3 months, and the contracts for service and warranty are nothing compared to cloud costs.

Then by all means buy the hardware equivalent of what you're renting in the cloud - it doesn't have to be a Mac; you've pretty much proved that by using the cloud to start with - and, pretty soon, others will follow suit and the cloud provider will have to re-think their prices. However, I strongly suspect that the true 'total cost of ownership' of the hardware won't be so attractive - and we're talking about business use, where things like support/replacement/service contracts are a necessity. Also, don't forget to allow for the difference between the $1-2k computer you need as a cloud client and the $5-7k+ computer you need to run a high-end GPU.
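Taking the poster's own figures at face value (a $1,599 card vs. roughly $700/month to rent a 4090 - neither verified here), the payback period is simple arithmetic:

```python
# Payback period using the figures quoted in this thread; both
# numbers are the poster's, not verified market prices.

CARD_PRICE = 1599.00        # the dGPU price cited in the thread
RENTAL_PER_MONTH = 700.00   # quoted monthly rental for a 4090

months_to_break_even = CARD_PRICE / RENTAL_PER_MONTH
print(f"{months_to_break_even:.1f} months")  # prints "2.3 months"
```

Which lines up with the "2-3 months" recovery the poster describes.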
...but also remember we're talking about a likely future trend towards laptops/SFF clients + cloud and whether it's worth Apple making a huge investment in developing a "workstation class" version of Apple Silicon for a market which is being eroded by powerful laptops/SFFs at one end and increasingly mainframe-like cloud/datacentre kit at the other.
Isn't the cloud a bit of a red herring in this discussion? You can work on a MacBook Air if your main computational needs are outsourced to the cloud, but the Mac Pro is designed for people who need (or just want) that power on the desktop. GPUs are clearly something people want in their PCs as they sell a heck of a lot of them to people who aren't cloud service providers.
Definitely true! For any prolonged work, a local machine is going to be cheaper than the cloud, and people need those machines to do what they want. I don't really need a dGPU for my work or extra sound processing, but the other things I do need, like RAM and good performance, require me to buy high-end machines, and there's no possible way I could afford my constant workflow in the cloud, even if it were as reliable as local (which it's not).

Isn't the cloud a bit of a red herring in this discussion? You can work on a MacBook Air if your main computational needs are outsourced to the cloud, but the Mac Pro is designed for people who need (or just want) that power on the desktop. GPUs are clearly something people want in their PCs, as they sell a heck of a lot of them to people who aren't cloud service providers.
I'm a little disappointed, just because of my past usage of a Mac Pro, but a disaster? No. My only real complaint is that they didn't sell it cheaper than the old version, since it is less expandable and it would compete better with Windows PCs in the market. I might even have bought one, because the performance and RAM are perfect for what I need, and I might not have to have so many machines sitting around me to do what I want. At the current price, I can get better performance for cheaper.

I don't think the Mac Pro is a disaster, btw; it's just reflective of a narrowing of the focus of Apple's technology towards consumers and the many groups of professionals who don't have somewhat esoteric needs.