
ric22

Suspended
Mar 8, 2022
2,713
2,963
Of course I had it done during annual service, so the only additional time spent was adding “please also replace the battery” when handing over the keys. I wasted more time than that opening the hood and finding the manual.
Fair enough. Batteries usually die mid-winter at the most inopportune time, or at least they used to.
 

BellSystem

Suspended
Mar 17, 2022
502
1,155
Boston, MA
Actually, the Mac was the easiest Windows machine to administer in my experience. In my job I use macOS, Windows and variations on UNIX, so the Intel Macs suited me down to the ground. Not sure what I feel about the new M1 Macs, but eventually the compatibility issues with running VMs with other OSes will be ironed out for AS Macs, or the Apple ecosystem will begin to wither for professional users.
I think we are about to see the Mac return to the professional workflows, at least creative pros.

I was a hardcore Apple fanboy until the Intel Macs showed up, and it got worse and worse as time passed. But I have to say, sitting in front of my Mac Studio and matching Display… for the first time in many, many years… I feel like I am using a Mac again. I haven't loved a Mac this much since the QuickSilver.

The best is yet to come.
 

Tagbert

macrumors 603
Jun 22, 2011
6,256
7,281
Seattle
I think we are about to see the Mac return to the professional workflows, at least creative pros.

I was a hardcore Apple fanboy until the Intel Macs showed up, and it got worse and worse as time passed. But I have to say, sitting in front of my Mac Studio and matching Display… for the first time in many, many years… I feel like I am using a Mac again. I haven't loved a Mac this much since the QuickSilver.

The best is yet to come.
The Intel Macs were very good computers for a long time. Much better than the PowerPC versions before. But they stagnated over the last 5-6 years from a performance standpoint, and it was clear that Apple had lost interest in them. Since then, Apple has turned things around and gotten serious again. Apple Silicon is breathing new life into Macs, and it is exciting to watch again.
 
  • Like
Reactions: BigMcGuire

BellSystem

Suspended
Mar 17, 2022
502
1,155
Boston, MA
The Intel Macs were very good computers for a long time. Much better than the PowerPC versions before. But they stagnated over the last 5-6 years from a performance standpoint, and it was clear that Apple had lost interest in them. Since then, Apple has turned things around and gotten serious again. Apple Silicon is breathing new life into Macs, and it is exciting to watch again.
Apple lost interest because the Intel stuff was trash and there was zero innovation in that space for more than a decade. The Intel Macs were/are so terrible. Every Intel Mac that I bought over the years I returned or sold because they were crap. The only ones I ever used long term were company-provided. The PowerPC Macs had their own stagnation issues thanks to IBM. Apple got screwed by both, so I’m glad they control their own future.
 

ahurst

macrumors 6502
Oct 12, 2021
410
815
The Intel Mac were very good computers for a long time. Much better than the PowerPC versions before. But they stagnated over the last 5-6 years from a performance standpoint and it was clear that Apple had lost interest in them.
I think it was less that Apple lost interest in making faster computers and more that Intel lost interest in making substantially faster chips. My Late 2013 iMac is almost 9 years old now, and the fastest Mid-2020 iMac with Intel’s flagship i7-10700K desktop chip only ever offered a ~37% single-core improvement over my aging Haswell i7. Heck, it took them 9 years to make a laptop chip that doubled the single-core of my old ThinkPad’s Sandy Bridge i7 from 2011! Apple made some questionable Macs during the 2016-2020 late-Intel era for sure, but I think that was more to do with Ive going overboard and Intel’s failed promises (e.g. the CPUs for the redesigned Air running hotter than promised, forcing Apple to cram a fan into the design last-minute).

Also by 2006 the Core architecture was clearly the way forward from PowerPC, but in the G3 and early G4 era those chips held their own pretty well against contemporary Pentiums. It still amazes me how well my 466 MHz PowerMac G4 runs Tiger when my contemporary 700 MHz Pentium III ThinkPad struggles running a clean install of XP.
 

throAU

macrumors G3
Feb 13, 2012
9,199
7,354
Perth, Western Australia
A) For what purpose do we even need 400GB/s?
B) Can a Mac even utilise all that speed? No, it can't, not even in lab tests designed to utilise as much as possible. I posted a link with a lot of detail on this the other day.

ProRes encoder will chew bandwidth
GPU will chew bandwidth

CPU not so much, but for GPUs, 400GB/sec isn't actually that high in 2022.

Not needing to copy data in and out of it, thanks to the unified architecture, helps a lot though. But media workloads are where the bandwidth is required: those ProRes engines doing X streams of 8K processing need memory throughput.
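A rough back-of-envelope sketch of why multi-stream 8K work chews memory bandwidth. All figures here are my own illustrative assumptions (uncompressed frames, 4 bytes per pixel), not Apple's numbers:

```python
# Hypothetical traffic for decoded frames of ONE uncompressed 8K stream.
# Assumed figures: 7680x4320, 4 bytes/pixel (e.g. 10-bit RGB packed in
# 32 bits), 30 fps. Not from any Apple spec sheet.
width, height = 7680, 4320
bytes_per_pixel = 4
fps = 30

per_stream = width * height * bytes_per_pixel * fps / 1e9
print(f"~{per_stream:.1f} GB/s per stream")  # ~4.0 GB/s
# Reading, processing, and writing each frame (plus intermediate buffers)
# multiplies this several times over, and that's per stream.
```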
 
  • Like
Reactions: Ethosik

ric22

Suspended
Mar 8, 2022
2,713
2,963
ProRes encoder will chew bandwidth
GPU will chew bandwidth

CPU not so much, but for GPUs, 400GB/sec isn't actually that high in 2022.

Not needing to copy data in and out of it, thanks to the unified architecture, helps a lot though. But media workloads are where the bandwidth is required: those ProRes engines doing X streams of 8K processing need memory throughput.
I assumed the GPU element of the M1 Max would chew through bandwidth too, but it can't utilise a quarter of what's available to it 🤷🏼‍♂️ The CPU utilises more than double what the GPU element does.

Edit: Do the monster discrete GPUs even saturate their bandwidth? Or come close to the 1TB/s speeds some of them list? It's like buying tyres for your car that are rated to do 400mph... it's nice to know, but it doesn't mean your Honda is going to be able to reach that speed. Or your Ferrari.
 
Last edited:

Tagbert

macrumors 603
Jun 22, 2011
6,256
7,281
Seattle
I think it was less that Apple lost interest in making faster computers and more that Intel lost interest in making substantially faster chips. My Late 2013 iMac is almost 9 years old now, and the fastest Mid-2020 iMac with Intel’s flagship i7-10700K desktop chip only ever offered a ~37% single-core improvement over my aging Haswell i7. Heck, it took them 9 years to make a laptop chip that doubled the single-core of my old ThinkPad’s Sandy Bridge i7 from 2011! Apple made some questionable Macs during the 2016-2020 late-Intel era for sure, but I think that was more to do with Ive going overboard and Intel’s failed promises (e.g. the CPUs for the redesigned Air running hotter than promised, forcing Apple to cram a fan into the design last-minute).

Also by 2006 the Core architecture was clearly the way forward from PowerPC, but in the G3 and early G4 era those chips held their own pretty well against contemporary Pentiums. It still amazes me how well my 466 MHz PowerMac G4 runs Tiger when my contemporary 700 MHz Pentium III ThinkPad struggles running a clean install of XP.
Certainly Intel was to blame for the performance issues (both compute and power consumption). I just meant that (from about 2016-2020) Apple seemed to want to focus on iPhones and iPads and didn’t really want to put the effort into enhancing the functionality of Macs. In the middle of that period, they seem to have woken up to the importance of the Mac and rededicated themselves. That period was where Apple seemed to just let Ive turn the Macs into pure sculptures at the cost of functionality.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
Certainly Intel was to blame for the performance issues (both compute and power consumption). I just meant that (from about 2016-2020) Apple seemed to want to focus on iPhones and iPads and didn’t really want to put the effort into enhancing the functionality of Macs. In the middle of that period, they seem to have woken up to the importance of the Mac and rededicated themselves. That period was where Apple seemed to just let Ive turn the Macs into pure sculptures at the cost of functionality.
Funny thing is, during that entire time, sales of Macs didn’t vary by much. If sales this year settle back into roughly 20 million a year, then it would indicate that, focus or not, they still do roughly the same business.
 

throAU

macrumors G3
Feb 13, 2012
9,199
7,354
Perth, Western Australia
Edit: Do the monster discrete GPUs even saturate their bandwidth?
Yup, they do.

Now bear in mind that with the discrete memory they have, at least some bandwidth is consumed copying out of system RAM into GPU memory, and that isn't a thing with unified memory. But PC memory bandwidth to the CPU is only, say, 70 GB/sec, so yeah, the GPU is consuming several hundred GB/sec of throughput.

Vega 64 being a prime example - memory overclocking gets you better gains than core overclocking... ditto for my 6900XT
 
Last edited:
  • Like
Reactions: Unregistered 4U

ric22

Suspended
Mar 8, 2022
2,713
2,963
Yup, they do.

Now bear in mind that with the discrete memory they have, at least some bandwidth is consumed copying out of system RAM into GPU memory, and that isn't a thing with unified memory. But PC memory bandwidth to the CPU is only, say, 70 GB/sec, so yeah, the GPU is consuming several hundred GB/sec of throughput.

Vega 64 being a prime example - memory overclocking gets you better gains than core overclocking... ditto for my 6900XT
Care to share a link that states that all that bandwidth is saturated? The documents I read last night suggested it was not.

Back on the M1 Max side of things, only 90GB/s of the 409GB/s available can be utilised by the GPU...
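For anyone who wants to sanity-check sustained CPU bandwidth claims on their own machine, a minimal STREAM-style sketch in Python/NumPy. The array size and iteration count are arbitrary assumptions, and NumPy overhead means this only gives a rough lower bound, not the hardware's peak:

```python
import time
import numpy as np

# STREAM-style "triad": a = 2*c + b, done in two in-place ops to avoid
# NumPy allocating temporaries. Sizes are arbitrary assumptions.
N = 20_000_000                 # ~160 MB per float64 array
b = np.ones(N)
c = np.ones(N)
a = np.empty(N)

iters = 5
start = time.perf_counter()
for _ in range(iters):
    np.multiply(c, 2.0, out=a)  # a = 2*c   (read c, write a)
    np.add(a, b, out=a)         # a = a + b (read a, read b, write a)
elapsed = time.perf_counter() - start

# 5 array passes per iteration (c read, a write, a read, b read, a write)
gbps = iters * 5 * N * 8 / elapsed / 1e9
print(f"~{gbps:.0f} GB/s sustained")
```

On a machine advertising hundreds of GB/s, comparing this number against the spec-sheet figure shows how far real single-threaded code sits from the headline bandwidth.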
 

LinkRS

macrumors 6502
Oct 16, 2014
402
331
Texas, USA
Certainly Intel was to blame for the performance issues (both compute and power consumption). I just meant that (from about 2016-2020) Apple seemed to want to focus on iPhones and iPads and didn’t really want to put the effort into enhancing the functionality of Macs. In the middle of that period, they seem to have woken up to the importance of the Mac and rededicated themselves. that period was where Apple seemed to just let Ive turn the Macs into pure sculptures at the cost of functionality.
Sorry, completely disagree. Intel CPUs run (ran?) fine in non-Apple laptops or computers. The problem is that Intel had no competition, which resulted in their stagnation over the past six or seven years. Apple, on the other hand, is (was?) on a push to make ever thinner and lighter devices, and Intel CPUs need adequate cooling/voltage. This is not an argument about "performance per watt" or "efficiency"; it is just about performance. If Apple had allowed for the increasing thermal requirements of the Intel chips, Macs would run just fine. However, Apple did not want to go that route, and what they shipped is the compromise between what Intel CPUs required and what Apple was willing to allow for. There is a reason that most Apple systems allow temps (including on M1s) to reach 100°C before kicking on cooling (if available). When you see temps that high on a Windows PC, something is not working or you are really pushing your machine.
 

singularity0993

macrumors regular
Oct 15, 2020
161
794
People say Macs are designed for creativity rather than gaming. But I wonder what creativity task is M1 Max suitable for?

Let’s look at a few examples:
1. Photo Editing - I think the base M1 works fine unless you’re editing images with a thousand layers, which most people apparently don’t.
2. Video Editing - This is the only genuine reason I can think of to buy the M1 Max.
3. 3D Modeling & Animation - You probably want to get a Quadro RTX card if you’re serious about this. The M1 Max has no ray tracing capabilities, and its performance and cost can’t match Nvidia cards. Software support on M1 is available in general but can’t compare to a Windows PC. Windows with Nvidia is just better at this than M1.
4. Coding - You don’t need GPUs for this. The M1 Pro or even the base M1 is usually adequate. And people usually use CI for big projects, so local performance is not that important in general.
5. Music - Same as coding.
6. Game Development - Without ray tracing or graphics APIs like Vulkan, I think Windows is far better at this. The M1 Max doesn’t have enough performance for AAA titles, and Macs aren’t friendly to games anyway.

So besides video editing, is there a reason to buy M1 Max?
 

bushman4

macrumors 601
Mar 22, 2011
4,142
3,902
Everybody has different needs in a computer: some for business, others for leisure; some for gaming, others for surfing. What your needs are determines the criteria for liking or disliking a computer.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
You're saying that no one ever replaced a laptop because they needed more RAM or memory? Thus creating e-waste sooner rather than later.
I have replaced the full system if I need more RAM. With USB 3 and up through Thunderbolt, internal drives aren’t that necessary. And if you are pro enough that you need every bit of the 7GB/s transfer speed, you make up the cost of the added internals.

So yes. To add an extra point to Unregistered 4U, all of my products I ever had would be e-waste. Luckily there are services, and even Apple, to properly recycle my old products.
 

Ifti

macrumors 601
Dec 14, 2010
4,033
2,601
UK
Maybe it's just for those who end up buying the best that's available at the time, thinking it will 'future proof' them for many years to come... but then in a few years they end up upgrading to a newer system anyway... [myself included, although trying to break that habit!] lol
 
  • Like
Reactions: philstubbington

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
This is not an argument about "performance per watt" or "efficiency"; it is just about performance. If Apple had allowed for the increasing thermal requirements of the Intel chips, Macs would run just fine.
Apple didn’t build systems in a vacuum; they would consult well ahead of time and align with Intel’s release schedule, and if Intel said, “We’ll definitely be able to ship on time, in quantity, and in a form factor that you’ve specified”, Apple could do nothing but build the enclosures/motherboards specified and trust Intel’s roadmap. However, Intel in those days RARELY hit their goals (well, they’re still not hitting their goals, but anyway :) and in some cases shipped CPUs so buggy that Apple was logging more bugs than Intel’s internal testers. All vendors WANTED to ship i9s that used LPDDR4, but no one was able to because Intel didn’t include that capability. Other vendors just molded a different plastic body, in most cases thicker and heavier (because of the larger battery required), and crammed in desktop DDR. Apple, due to Intel missing their roadmap, was stuck with speed bumps.

Apple COULD have allowed for the increasing thermal requirements IF Intel specified on their roadmap that there were going to be increasing thermal requirements. :)
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
I believe most have stopped upgrading. In the companies I’ve been in most recently, the masses get a laptop for a number of years (2-5), they have parts to replace if one goes bad, but not upgrade. In two years, they’re assigned a new laptop, and so on and so on. Financially, it’s better for a company to have a contract that works like a hardware subscription… they get access to updated hardware on a known schedule and don’t have to deal with the purchasing headache for their tens and hundreds of thousands of computers every few years.

Modifications to the standard are acceptable with an approval for a variance, but those exceptions number more in the 10s and 100s than thousands.
We eventually ended up with a “no upgrade” policy at work when we first started moving from HDDs to SSDs and got about 50 OCZ drives. Nearly all of them failed due to corrupted firmware about a month or so after use.

Plus, from a Capital Expense perspective, getting a random drive off Newegg a year later throws off the books. It’s just not worth it. These are business systems, not for your grandma. You do capital expenses and have the product last for 3-5 years and spread the cost out. There’s more to CapEx but I could write an entire book.

It’s fine for users or grandmas to get Mac Studios, but that’s not the target audience.
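The straight-line cost spreading mentioned above works out like this, as a minimal sketch (all dollar figures and the lifespan are made-up assumptions, not anyone's real books):

```python
# Hypothetical straight-line depreciation: a $4,000 workstation expensed
# evenly over 4 years with an assumed $400 salvage value.
cost = 4000.0
salvage = 400.0
useful_life_years = 4

annual_expense = (cost - salvage) / useful_life_years
print(annual_expense)  # 900.0 per year hits the books, not $4,000 up front
```

An off-the-books mid-life drive purchase doesn't fit that schedule, which is part of why variances get refused.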
 

Sopel

macrumors member
Nov 30, 2021
41
85
Photo editing - only if you're an actual professional
Video editing - ok, maybe
Video encoding - no actual benchmarks for software encoders, so no idea; hardware encoding useless
Sound editing - don't see why
Coding - only if you target Apple/web, otherwise just issues upon issues
Compilation - only for small projects
Modelling/CAD - no way, needs a good GPU with fp64 and good ray tracing
Gamedev - only if you target mobile iOS
Chess - terrible performance for the price
Web browsing - anything would do
Streaming - what would you stream anyway?
Machine learning on GPU - CUDA is just miles ahead
Machine learning on CPU - why? The Neural Engine comes with big usability issues anyway
Gaming - not really, Apple doesn't like gamers
High-RAM workloads - usually if you do need a lot of RAM, you need more than Apple offers plus an option to expand
Accounting - Windows + Office
Databases - not an M1 issue, but good luck with the performance of the SSD with full syncs


So basically you don't need it if you're not already fully converted to Apple land.
 
  • Love
Reactions: jagolden

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
Normal business stuff, office apps. Mail clients.
Java applets (machine control/logging)
VM's (Windows and Linux)
Coding/monitoring.
remote to work location.

And then there's the general web stuff one does at home.
Chess, and I don't care about the performance as I normally play with speed chess settings and I absolutely abhor benchmarking.
 