
telequest

macrumors regular
Feb 1, 2010
185
43
NJ
I think for the point of the Pro discussion, it's about how much the workstation is able to improve the productivity of the user vs using a typical computer. A 'professional' is typically highly skilled and well compensated for that skill. Therefore, if they are regularly sitting around waiting for a computer to render, compile or even just resize images, there is an opportunity cost in terms of availability for additional work.

Quality demands increase over time, however, so the workstation needs some headroom and / or expandability to avoid becoming a bottleneck. It's not just about processing power either - ergonomics are also important. Trying to edit video on a 13" laptop via a trackpad would put a dent in most people's productivity. Interoperability with industry-standard pipelines is a must, too.

Over time more and more types of tasks can be processed so fast by a typical machine that an 'Air' laptop or iPad is sufficient. A professional writer could probably get by with an iBook G3 (if only writing). That writer is of course still a professional by all other definitions of their job - it's just they don't require a big box under their desk with a 1000W PSU in it. Apple are likely champing at the bit for the day a 27" iMac can replace the Mac Pro for virtually everyone, and they can discontinue it forever.

Productivity-focussed machines typically put functionality first and other considerations second. The 'big truck' that Steve referred to.

'Functionality' might be considered as:
  • Sustained high performance
  • Effective and quiet cooling
  • Reliability
  • Quantity and variety of ports / interfaces
  • Adaptability
  • Ergonomics (e.g. multiple large screens, full-size keyboard and mouse etc.)

'Nice to haves':
  • Slim / sleek form-factor
  • Low weight
  • Visual simplicity
  • Compact
  • Low cost
It would be great if your post here could put the whole "who is a pro user?" discussions to rest finally. I'm old enough to have gone thru a series of Power Macs before the Mac Pro arrived. It really is all about who needs the *power* to do what they want or need to do (along with reliability and flexibility) so they can be productive and achieve their goals. Who cares if they get paid to do it, or if they have some kind of institutional credential?
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Yeah, it essentially boils down to: is the computer the limiting factor, or you?

If you're regularly waiting for the computer to finish tasks (even momentarily, but constantly), then you'd benefit from a more powerful computer. This may be a Mac Pro, or just a specced-up iMac or MBP. If you're using it to generate income, it may even pay for itself if it allows you / your company to complete more jobs per month.
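The "pays for itself" point is easy to sanity-check with a quick back-of-the-envelope calculation. All figures below (machine cost, hours saved, billable rate) are illustrative assumptions, not numbers from the thread:

```python
# Back-of-the-envelope: when does a faster machine pay for itself?
# All numbers are illustrative assumptions, not real pricing.

def payback_months(machine_cost, hours_saved_per_month, billable_rate):
    """Months until the extra billable hours cover the machine's cost."""
    monthly_gain = hours_saved_per_month * billable_rate
    return machine_cost / monthly_gain

# Assume a $6,000 workstation saves 10 hours/month of waiting,
# and the user bills $100/hour.
months = payback_months(6000, 10, 100)
print(f"Pays for itself in {months:.0f} months")  # 6 months
```

Under those assumptions the machine breaks even in half a year; tweak the inputs for your own situation.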

If the computer is generally waiting on you, you're probably fine with what you have. This includes professionals who work in areas (e.g. music composition) that require a lot of skill, but not a great deal of processing power. That's not to say of course that people shouldn't spend money on whatever they feel like.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
It would be great if your post here could put the whole "who is a pro user?" discussions to rest finally. I'm old enough to have gone thru a series of Power Macs before the Mac Pro arrived. It really is all about who needs the *power* to do what they want or need to do (along with reliability and flexibility) so they can be productive and achieve their goals. Who cares if they get paid to do it, or if they have some kind of institutional credential?

The question of whether "Professional" is defined merely by whether someone gets paid or not is important, because it's the core of an argument that goes along the lines of "you shouldn't need an upgradable computer, you should just buy a whole new ~10k computer every 3 years, because it's a 'professional' computer, and if you're a 'professional', you're using it to make money, and therefore it's just a cost of business so you should just afford it".

Whereas the small-computer fetishists can get everything apart from their cosmetic jollies from a larger slot-based machine that they can choose not to upgrade, the non-upgrade brigade are literally arguing in favour of a machine that excludes those who need to be able to reconfigure their systems post-purchase.

We've seen what happens when Apple makes those non-upgradable appliance machines - they cost just as much as, if not more than, the bigger slotboxes, because Apple culturally considers smallness a premium feature on which to charge customers. And it destroys their presence in the industries they're supposed to target, because Professional Practice values customisation, serviceability, and reconfigurability as a safety feature against unforeseen industry and technology changes in desktop hardware - and that isn't something Apple can change, because it's a platonic ideal that said industries like, not some burden they labour under.
 
Last edited:

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Do you guys think the iMac 27 and Mac Pro will be based on an M1 architecture or an M2 architecture?
M1 for the iMac, for sure. I expect the only reason the 27" hasn't been released yet is that there are supply chain constraints and Apple are prioritising the M1 Pro / Max chips for their laptops. This is their volume market, and possibly has higher margins. It's also possible the iMac is getting a new 30" LCD with FaceID or something, and there are delays with that as well / instead.

Whether the Mac Pro gets M1 or M2 depends when it actually comes out. If soon, M1, if 2023, M2. The iPhone will always be the priority, so the MP's 'mega chip' derivative will likely be the last to emerge in each generation, as it is furthest removed from the source.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
The question of whether "Professional" is defined merely by whether someone gets paid or not is important, because it's the core of an argument that goes along the lines of "you shouldn't need an upgradable computer, you should just buy a whole new ~10k computer every 3 years, because it's a 'professional' computer, and if you're a 'professional', you're using it to make money, and therefore it's just a cost of business so you should just afford it".

Yeah, the argument that because a company can afford something, it therefore doesn't care about value for money is a bit silly. Fair enough if it's ILM, but if a freelancer wishes to sell a machine after 3 years, they want a good resale value; alternatively, if it could be put to good use as e.g. a rendering box, then things like GPU upgrades are relevant. Sure, the entire platform ages technologically, but upgrading a 3-year-old GPU can still provide a big boost.

Whereas the small-computer fetishists can get everything apart from their cosmetic jollies from a larger slot-based machine that they can choose not to upgrade, the non-upgrade brigade are literally arguing in favour of a machine that excludes those who need to be able to reconfigure their systems post-purchase.

Not being able to reconfigure a machine later typically also means a higher up-front cost, as you can't take advantage of prices falling over time. Like many companies, Apple love to rinse customers with RAM, SSD etc. prices; they just have greater opportunity to do so when the parts are proprietary.

We've seen what happens when Apple makes those machines - they cost just as much, if not more than the bigger slotboxes, because Apple culturally considers smallness a premium feature on which to charge customers, and it destroys their presence in the industries they're supposed to target, because Professional Practice values customisation, serviceability, and reconfigurability in desktop hardware, and that isn't something that Apple can change, because it's a platonic ideal that said industries like, not some burden they labour under.

Yes, please, no more SFF Mac Pro machines. And I say that as someone with a G4 Cube on my shelf (and a space waiting for a 6,1 some day).
 

mikas

macrumors 6502a
Sep 14, 2017
898
648
Finland
No more SFF Pros for me either. I wouldn't buy any of those ever again..

:rolleyes:
 

mikas

macrumors 6502a
Sep 14, 2017
898
648
Finland
Seriously, the best I can hope for from Apple at this time is a "Beast Canyon"-alike (mentioned quite a few times earlier by some insightful members).

A Mac Cube Tube it would be. All SoC and all soldered in, but one PCIe slot for a real GPU.

It probably will be called a Mac Cube Tube Pro Max.

On second thought, though, I predict they will ruin it by making it a proprietary-only slot. Maybe a derivative of MPX or something. Tinier, slimmer, notcompatiblier.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
This sounds like a terrible idea (no offence - I realise there's limited options here). Using the PCIe bus, they'll have no inherent advantage (unified memory etc.) over other GPUs. And as far as I'm aware, Apple's GPUs are nothing revolutionary in themselves.

An Apple PCIe GPU would also only be used in one computer that sells in tiny numbers. How are they supposed to counter the colossal investment Nvidia and AMD put into their designs, year on year? I guess in a captive market they won't need to be better than the competition, just 'good enough', but it seems like a long-term hassle for Apple when they could just use an off-the-shelf AMD GPU for this one model.

Unified memory and an SoC design come with thermal constraints. They'd lose the unified memory boost, but gain a much higher thermal ceiling they could use to produce really big and hot GPUs. Big and hot GPUs aren't what you want in a laptop, but in a desktop workstation it's not a big deal.

That said - you're also right that it's probably going to be hard to compete with AMD and especially Nvidia even with a higher thermal ceiling. So they could just keep selling AMD (and maybe even Nvidia) MPX modules coupled with an ARM CPU. If they can't beat them in the dGPU case, there is no reason to compete.

And as the Barefeats benchmarks above show, there are plenty of reasons to support AMD eGPUs even on existing hardware. Even the Radeon 6900 makes a pretty compelling case as an eGPU on M1 Max hardware. Apple seems to be trying to pretend their mobile GPUs are so fast they make for great desktop-class performers. But that is clearly not true.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Seriously, the best I can hope for from Apple at this time is a "Beast Canyon"-alike (mentioned quite a few times earlier by some insightful members).

A Mac Cube Tube it would be. All SoC and all soldered in, but one PCIe slot for a real GPU.

It probably will be called a Mac Cube Tube Pro Max.

On second thought, though, I predict they will ruin it by making it a proprietary-only slot. Maybe a derivative of MPX or something. Tinier, slimmer, notcompatiblier.
I predict that a compact AS MP, should it exist, will have no PCIe slots whatsoever. Does seem dangerously close to a 6,1 though, a machine they loudly proclaimed they'd learned their lessons from not that long ago. I suppose a compact AS machine wouldn't lead to the 'thermal corner' they were in before, as it essentially has a single heat source for a large heatsink / fan to deal with, plus a CPU / GPU roadmap that is now under their control (hopefully).

I really can't see them making multiple MP form factors though. If the rumours do pan out, it looks like the AS MP will be a compact desktop-based machine that relies on multiple M1 Max SoCs with 'sufficient' GPU grunt. If there is an Intel MP revision, it may be because the first AS MP can't quite get there GPU-wise. This would be a bit worrying; the discrete GPU competition isn't standing still - at what point will they be competitive?
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Even the Radeon 6900 makes a pretty compelling case as an eGPU on M1 Max hardware.
That's the thing though - the M1 Max doesn't support eGPUs. The RX6900XT results were using an Intel MBP.

Edit - I assume you were saying that reading between the lines, the ability to use a big Radeon with an AS SoC would be well worth it (if future designs have this capability).

The thing is, all other AS chips are essentially expansions of the SoC in the iPhone (their core product). A version of AS with a lot of PCIe lanes is a bit of a departure from this, which further raises costs on a niche product. Obviously, Apple can afford it, but they may not want to.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
That's the thing though - the M1 Max doesn't support eGPUs. The RX6900XT results were using an Intel MBP.

Edit - I assume you were saying that reading between the lines, the ability to use a big Radeon with an AS SoC would be well worth it (if facilitated by future designs).

Right. An M1 Max MacBook Pro still has a lot to gain by pairing with an eGPU like the 6900, and will get even more out of an eventual 7900 or 7800. The narrative has been that the M1 Max is so fast you don't need an eGPU anymore. But that doesn't seem to be true.

Probably one thing Apple is considering is that eGPU adoption was rather low. But it does seem like between desktops and laptops we're headed for a GPU performance wall again. And Apple's blockading of Nvidia probably didn't help with eGPUs (nor did the rush on GPUs).
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
Probably one thing Apple is considering is that eGPU adoption was rather low.
Not surprising. The eGPU boxes themselves cost hundreds of pounds / dollars, before you've put a card in. If you're doing it for gaming, you may as well spend a bit more and build a PC - games run much faster on Windows anyway.

For professional uses, using a Mac mini or MBP as the CPU isn't ideal if performance is key.

The later 27" iMacs and the iMac Pro have pretty powerful CPUs. I believe there's a performance penalty for routing the display traffic back to the host machine (rather than connecting a monitor directly to the eGPU), which means their 5K screens couldn't take advantage. That wouldn't be a problem if just using the eGPUs for Redshift or Resolve though.

The 6,1 didn't support eGPUs without hacks, and only had TB2 in any case.

Basically, eGPUs are an expensive and compromised workaround - the proper solution is a tower with space for expansion, as made by everyone else. No need to reinvent the wheel, just because Apple really didn't want to make that type of machine.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Not surprising. The eGPU boxes themselves cost hundreds of pounds / dollars, before you've put a card in. If you're doing it for gaming, you may as well spend a bit more and build a PC - games run much faster on Windows anyway.

I hinted at it in my original reply, but it's also not helpful when Apple is blocking the largest GPU vendor with the fastest GPUs from entering the platform. Because of that, Apple had to debut eGPU support with the Radeon 580, which was not the most compelling.

If Apple allowed their laptops to be paired with a 3080 or 3090, current GPU prices aside, that would get a lot more people excited. AMD has been catching up, but it's been slow. Apple hitching their wagon to AMD seems like a clear part of the problem here.

Even for the Mac Pro, AMD's options aren't stellar for some configurations and they would have been helped by having Nvidia options. (Although Apple's pricing doesn't help either.)

In a way, Apple has solved the AMD problem by hitching their wagon to no one. But that may not be sustainable either.
 

flowrider

macrumors 604
Nov 23, 2012
7,323
3,003
They don’t currently, but who knows in future.
OK, let me state this more clearly - Will an M CPU in an expandable machine (like a real Mac Pro) with PCI slots support an external GPU in a PCI slot or any other card in a PCI slot❓

Lou
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
But, does that mean the M CPUs wont support external GPUs❓

They don’t currently, but who knows in future.

OK, let me state this more clearly - Will an M CPU in an expandable machine (like a real Mac Pro) with PCI slots support an external GPU in a PCI slot or any other card in a PCI slot❓

State it as clearly as you like, but until Apple themselves make a clear statement on discrete GPUs (whether first-party from Apple or third-party from AMD / Nvidia), no one can say; but I would expect the full spectrum of the Mac Pro lineup to be previewed and highlighted at WWDC 2022...

As to "any other card in a PCI slot"; obviously there will still be audio I/O cards, video I/O cards, network cards, RAID cards, etc....
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
State it as clearly as you like, but until Apple themselves make a clear statement on discrete GPUs (whether first-party from Apple or third-party from AMD / Nvidia), no one can say; but I would expect the full spectrum of the Mac Pro lineup to be previewed and highlighted at WWDC 2022...

As to "any other card in a PCI slot"; obviously there will still be audio I/O cards, video I/O cards, network cards, RAID cards, etc....

I wouldn't bet on Apple Silicon supporting AMD GPUs. It's possible - the Metal APIs allow it in theory, and the platform APIs still have all the hooks for discrete GPUs and eGPUs - so Apple could do it if they wanted; they've left the door open. But it doesn't seem like they're very cozy with AMD or Nvidia right now.

The only possible explanation I've noticed is that Apple Silicon Macs use the iPad frame buffer, not the traditional Mac frame buffer. The best explanation would be that the iPad frame buffer doesn't support anything but Apple's integrated GPUs. That could change in the future, but for now it is what it is.
 

iDron

macrumors regular
Apr 6, 2010
219
252
I wouldn't bet on Apple Silicon supporting AMD GPUs. It's possible - the Metal APIs allow it in theory, and the platform APIs still have all the hooks for discrete GPUs and eGPUs - so Apple could do it if they wanted; they've left the door open. But it doesn't seem like they're very cozy with AMD or Nvidia right now.
The rumored M1 Max Quad would have about twice the CPU performance of the current Mac Pros, but at ~40 TFLOPS its GPU would still fall short of the highest configuration of the current Mac Pro, which reaches ~45 TFLOPS with two Radeon Pro W6900X. Having lower graphics performance than the Intel predecessor won't be acceptable.

So, either Apple steps up their GPU game, or they will have to use an SoC GPU + (optional) dedicated GPU solution.
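For anyone who wants to check the arithmetic behind that comparison, here is a quick sketch using commonly cited FP32 throughput figures (the per-chip numbers are approximate, and the "M1 Max Quad" is only a rumor):

```python
# Rough FP32 throughput comparison (approximate, commonly cited figures):
# Radeon Pro W6900X ~22.2 TFLOPS each; M1 Max GPU ~10.4 TFLOPS.
w6900x_tflops = 22.2
m1_max_tflops = 10.4

dual_w6900x = 2 * w6900x_tflops   # top current Mac Pro config, ~44.4 TFLOPS
m1_max_quad = 4 * m1_max_tflops   # rumored quad SoC, ~41.6 TFLOPS

print(f"Dual W6900X: {dual_w6900x:.1f} TFLOPS")
print(f"M1 Max Quad: {m1_max_quad:.1f} TFLOPS")
```

So even a quad SoC would land slightly below the dual-W6900X configuration on raw FP32 throughput, which is the gap the post is pointing at.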
 

Melbourne Park

macrumors 65816

Lou

That review is unbalanced IMO.

Firstly, the 6900 eGPU test doesn't perform much better than the M in the software speed tests, of which there are only a couple. Then it pitches the M against games - and the games are far from native to the Mac. And then there is the only actual piece of productivity software tested - the new version of DaVinci Resolve, version 17. That test would have meant more if they'd used 17.1, which was designed to work on the M. 17 isn't, and 17.1 wasn't initially available. 17 is very far from having any native capability on the M, just like the games that were played.

Multi-core Intel CPUs often throttle back too - did the review test for that? It happens in iMacs with large numbers of Intel cores, and it's worse in notebooks. There's no evidence given here of how long these tests were run for. Real Mac users have found that under heavy CPU loads, their promised Intel CPU performance didn't last long. Perhaps not surprising when those Intel processors were still built on a 14 nm process; Apple's M chips are 5 nm. A smaller process means cooler running, which means less throttling back due to overheating.

And then there is the cost of the 6900 - which is half the cost of the entire MacBook Pro M1 Max. And with that MacBook - heck - what about size and battery performance? eGPUs are huge. For the next test, why not run native software and test the battery life, including powering the eGPU? Testers can manipulate things at their discretion. It's important for us to see through that.

If Apple saw this test, they'd likely drop any inclination to put in graphics card slots - why have a platform where adding a GPU is going to rob Apple of so much profit? The GPU companies have now learnt they can simply limit production and get increased profits. What makes people think GPU companies will change their new profit-making oligopolies? The only way forward for computer makers is to do it themselves - build fast GPU performance in-house. And Apple is now doing that. Intel may eventually follow too. Then GPU prices will come down, I guess.

While it would be nice to have spare slots in an M-powered Mac Pro, it's also quite possible the cost and performance benefits might be quite limited.
 
Last edited:

richinaus

macrumors 68020
Oct 26, 2014
2,432
2,186
That review is unbalanced IMO.

Firstly, the 6900 eGPU test doesn't perform much better than the M in the software speed tests, of which there are only a couple. Then it pitches the M against games - and the games are far from native to the Mac. And then there is the only actual piece of productivity software tested - the new version of DaVinci Resolve, version 17. That test would have meant more if they'd used 17.1, which is designed to work on the M. 17 isn't. It's very far from having any native capability on the M, just like the games that were played.

And then there is the cost of the 6900 - which is half the cost of the entire MacBook Pro M1 Max. And with that MacBook - heck - what about size and battery performance? eGPUs are huge. For the next test, why not run native software and test the battery life, including powering the eGPU? Testers can manipulate things at their discretion. It's important for us to see through that.

If Apple saw this test, they'd likely drop any inclination to put in graphics card slots - why have a platform where adding a GPU is going to rob Apple of so much profit? The GPU companies have now learnt they can simply limit production and get increased profits. What makes people think GPU companies will change their new profit-making oligopolies? The only way forward for computer makers is to do it themselves - build fast GPU performance in-house. And Apple is now doing that. Intel may eventually follow too. Then GPU prices will come down, I guess.

While it would be nice to have spare slots in an M-powered Mac Pro, it's also quite possible the cost and performance benefits might be quite limited.
I tend to agree with this, and fully expect Apple to release something like the Cube / trashcan. It’s all they [Apple] have ever wanted in a desktop and now have the ability to do so without relying on others.

Yes, the power may not be as good as with dual 6900s, but I think they will get to a point where it is good enough for 99.5% of users. That 0.5% will just have to go with PCs.

In addition, I believe they will actually gain more pro desktop users going this route rather than with the current designs, because for studios like mine [and many other small design studios] the current Mac Pro is overkill, whereas the 14" MBP I have [maxed] is about 20-30% off where I need it to be. Obviously this is my use case, but I do push the computers reasonably hard, so I can imagine others are in a similar position.

A quiet, cool desktop with double the M1 Max would be absolute perfection for my studio. As long as the RAM and SSD can be upgraded, we would be all good.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
And then there is the only actual piece of productivity software tested - the new version of DaVinci Resolve, version 17. That test would have meant more if they'd used 17.1, which was designed to work on the M. 17 isn't, and 17.1 wasn't initially available. 17 is very far from having any native capability on the M. Just like the games that were played.

It's something Barefeats should clarify, but Blackmagic themselves refer to the product as a whole as DaVinci Resolve 17. The review doesn't state which exact version was used, but the name of all versions of the product is "DaVinci Resolve 17" - for example, the official product name for Resolve 17.4 is still "DaVinci Resolve 17". It's like how Mac OS X's name didn't mean you were literally running Mac OS 10.0.

They didn't explicitly state it was 17.0 specifically, and that doesn't preclude 17.1 from being the version - but they do need to clarify. 17.0 is so old, though, that I'd be kind of surprised if for some reason they specifically downloaded that version instead of the latest.

Then it pitches the M against games - and the games are far from native to the Mac.

The games he tested were Mac native. Both are Metal optimized. They weren't M1 native... but....

Rosetta doesn't seem to be the whole issue here. You can tell because the M1 Max completely trounces the Intel MacBook Pro with its built-in GPU. So even with Rosetta, Apple Silicon can still do pretty well. It's only when the Radeon 6900 shows up that the tables flip. It's possible the M1 Max is CPU-constrained because of Rosetta, but clearly it's not as simple as games-bad.

And honestly? If M1 Max has trouble performing with games that is still perfectly valid performance data. Just maybe for something not everyone cares about. But still a valid test.
 
Last edited:

Melbourne Park

macrumors 65816
...

And honestly? If M1 Max has trouble performing with games that is still perfectly valid performance data. Just maybe for something not everyone cares about. But still a valid test.
All good points.

However, it's known that the M can play some games very well. I've read the thread on Apple's failure to have a gaming notebook with the M machines. Having read the whole thread, it appears that the economic case for software vendors to effectively port major PC-based games to the M processor is highly marginal. Basically, Apple can't handle AAA gaming. The thread stopped in early November ...

So games are a real test for the M, IMO. And if one wants to test games performance, then use one of the few that can run on the M platform. Apple makes heaps of money from the simple games sold for iPhones and iPads, but of the big games that people buy costly hardware for, very few will run on an M processor, and that situation is unlikely to change.

Using games on an M Mac as a performance comparison is a setup against the M processor.

I'll quote the second-to-last post of the thread; first, one key line from it:

"Apple can‘t do AAA gaming because they don’t have the people to make Macs do AAA gaming"

Apple has done Jack squat in 3 decades to address the issue that Macs are a poor platform for gaming on computers.

Don’t feel bad for criticizing a multi trillion dollar corporation - Apple have the resources to fix this.

We can blame devs for not optimizing the slim pickings of native games on Macs, but the truth is Apple has not met devs halfway in support, and in some cases has made it less enticing to make games for Macs:
- dropped support for the popular and widely used OpenGL and Vulkan in favor of Metal.
- When Apple updates macOS, devs sometimes have to update their games to work properly (or at all), and devs can't update their games indefinitely, especially on a platform that takes time and resources to develop for, for little return.

What is worse?

Apple have convinced many Mac gamers this is fine, and they either accept or defend/excuse Apple.

If Mac gamers don’t demand better, Apple doesn’t bother to do better.


Apple like to say they want gaming and its important to Apple, but Apple thinks mobile, cloud and Apple Arcade for children is gaming and tries to sell that to Mac gamers hard instead.
This is actually insulting when one looks at it.


Apple can‘t do AAA gaming because they don’t have the people to make Macs do AAA gaming.


case in point.

Sony and Microsoft/Xbox were not gaming household names in the beginning, BUT they employed the right people, talent and know-how from across the games industry to create their respective gaming brands/platforms… because Sony and Microsoft KNEW they themselves didn't know gaming.
Apple however in their hubris think they do.


e.g. Seamus Blackley and Ed Fries, game developers who worked in the games industry, were responsible for the creation of the Xbox console, because Microsoft wanted to get into the industry but did not have the credentials to do it and take on Sony's PlayStation.

Blackley and Fries took the lead in the creation of the Xbox and went against the grain of Microsoft corporate thinking. Bill Gates even complained about the console not having Windows the way he wanted it, and needed to be reminded that this was not how to appeal to console gamers and beat Sony; he had to relent to those who knew more about gaming… the rest is history.



It is the same for Apple: sure, Apple may make good video editing software, design efficient chips, etc… but Apple does NOT know AAA gaming, and can't do AAA gaming until they employ those who know AAA gaming and change the culture within the company.


A dedicated AAA gaming division within Apple, headed and led by the Seamus Blackleys and Ed Frieses of the gaming industry - people with a passion for, understanding of, and connection to AAA gaming and core gamers - is what it will take for Apple to make Macs a gaming platform contender, NOT its current philosophy of what it thinks gaming is.

The thread:


And there is some software that runs quickly on the M processor and also runs on Intel. Why not include such software in such a comparison?
 
Last edited: