
GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
Ideally, the point of discussion: even though Windows may have better graphics in the short term or near future, what makes you still think the Mac is a better choice if both OSes can work for you? Of course, other discussion is welcome.
What exactly do you want to do? Play games? Then macOS is the wrong choice in the first place. HPC/research? Then there has never been an alternative to Nvidia, as price is irrelevant in those scenarios. Heavy video editing, color grading, etc.? Renting performance in the cloud is the way to go if you're not running jobs 24 hours a day, 365 days a year.

For your typical scenario at home, doing some photo processing, cutting a few videos, and maybe even playing some lightweight arcade games, there's no difference between AMD and Nvidia... and most likely anything that Apple comes up with.

Your biggest problem with Apple Silicon will be the lack of software you can actually use. It'll be more like a Chromebook running iPad-type apps, which could be fine for the masses. It will be locked down to the App Store to load anything anyway.
And why do you think that? Developers can compile for ARM, and you can compile anything open source for ARM as well. I've been developing for x86 and ARM for many years, and the problem only appears when using third-party libraries that don't support both architectures. That support won't be there from the very beginning, but everything else will still work.

Homebrew has already started porting to 11.0, including ARM support; nothing will change. One difference is that arm64 executables on 11.0 must be properly code signed. A simple local ad-hoc signature will do, so this shouldn't be a problem.
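To make the "just recompile" point concrete, here's a minimal sketch (hypothetical file and names) of how the same C source builds for either architecture, with a preprocessor guard only where CPU-specific code is unavoidable:

```c
/* portable.c -- hypothetical example: the same source builds for both
 * x86-64 and arm64 with a plain recompile; arch-specific code only
 * appears where the developer explicitly opted into it. */
#include <stdio.h>

int main(void) {
#if defined(__arm64__) || defined(__aarch64__)
    puts("arm64 build (Apple Silicon)");
#elif defined(__x86_64__)
    puts("x86-64 build (Intel)");
#else
    puts("some other architecture");
#endif
    return 0;
}
```

On macOS, `clang -arch x86_64 -arch arm64 portable.c -o portable` produces a universal binary, and a local ad-hoc signature is just `codesign -s - portable` (as I understand it, the Big Sur toolchain ad-hoc signs arm64 output automatically anyway).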
 

jerryk

macrumors 604
Nov 3, 2011
7,421
4,208
SF Bay Area
....

If you love games, you need a PC in my opinion. If you are like me, and you love both games and the Mac user experience, you buy a mid-range Mac for the computing experience + a "cheap" gaming PC to play the "real" games.

I agree with this. I am typing this on a Windows 10 system with an i9-9900K processor, 64 GB of RAM, 2 TB of NVMe storage, and an RTX 2070. I have around $1100 in this from buying on sale.

The only game I have run recently is FS 2020, and it runs smooth as butter at high settings.
 

Woochoo

macrumors 6502a
Oct 12, 2014
551
511
At the same time, the RTX 3070 has the same TDP as the RTX 2070 Super while doubling the number of shader cores running at the same clock. It seems to me that Nvidia has doubled the width of their vector ALUs, going from 512-bit (16-wide ALU) to 1024-bit (32-wide ALU). If this is indeed the case, I am wondering what the difference in ALU utilization is between Turing and Ampere. Wider vector units are usually more difficult to keep occupied efficiently.

True, I was referring to the 3080, which is the one I remembered, as they said it doubled the performance compared to the 2080. The 3090's TDP makes much more sense than the 3080's, as it has way more GDDR6X (14 GB more), which consumes more as well, and way more cores in every area (CUDA, Tensor, and RT).
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
True, I was referring to the 3080, which is the one I remembered, as they said it doubled the performance compared to the 2080. The 3090's TDP makes much more sense than the 3080's, as it has way more GDDR6X (14 GB more), which consumes more as well, and way more cores in every area (CUDA, Tensor, and RT).

Looking at specs alone, the 3070 also offers twice the shader cores and FLOPS compared to the 2070 Super, so yeah, the "double the performance" claims apply across the line. I would be very curious indeed to see some actual benchmarks.
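For what it's worth, a quick back-of-the-envelope check of that claim from the announced specs (core counts and boost clocks as published by Nvidia; theoretical peak only, not a benchmark):

```c
/* flops_check.c -- rough sanity check of the "double the performance"
 * claim from announced specs (approximate figures, not benchmarks). */
#include <stdio.h>

/* Theoretical FP32 throughput: cores * clock * 2 ops (FMA = mul + add). */
static double tflops(int cuda_cores, double boost_ghz) {
    return cuda_cores * boost_ghz * 2.0 / 1000.0; /* GHz -> TFLOPS */
}

int main(void) {
    double rtx2070s = tflops(2560, 1.77); /* RTX 2070 Super: ~9.1 TFLOPS  */
    double rtx3070  = tflops(5888, 1.73); /* RTX 3070:      ~20.4 TFLOPS */
    printf("2070 Super: %.1f TFLOPS\n", rtx2070s);
    printf("3070:       %.1f TFLOPS\n", rtx3070);
    printf("ratio:      %.2fx\n", rtx3070 / rtx2070s); /* ~2.2x on paper */
    return 0;
}
```

On paper that's roughly 2.2x, which matches the marketing; how well those wider ALUs are actually utilized is exactly the open question.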
 
  • Like
Reactions: jerryk

MisterMe

macrumors G4
Jul 17, 2002
10,709
69
USA
...

The truth is, the change from ARM to x86 really enriched Apple's system. It allowed them to take advantage of x86 more easily (which they did). The system got slightly more popular, and this popularization brought more developers and more software.

...
The truth is that Apple did not change from ARM to x86. Apple switched from PowerPC to x86.
 
  • Like
Reactions: Dranix and cardfan

CWallace

macrumors G5
Aug 17, 2007
12,525
11,542
Seattle, WA
If you love games, you need a PC in my opinion. If you are like me, and you love both games and the Mac user experience, you buy a mid-range Mac for the computing experience + a "cheap" gaming PC to play the "real" games.

This.

I got off the iMac GPU upgrade cycle in 2018 and bought an Alienware tower that sits under my desk (so I don't have to see it) and connects to an ASUS QHD gaming display via DisplayPort and to my 2017 iMac 5K via HDMI. I paid about the same as the resale loss of selling my 2017 iMac and buying a 2019 iMac, and my display supports 165 Hz and far higher graphics detail, whereas the iMac is locked at 60 Hz in Boot Camp.

My GTX 1070 is fine for Diablo 3 and Overwatch (the only two games I play), and if I ever do need more, it's an easy GPU card swap.
 

Joelist

macrumors 6502
Jan 28, 2014
463
373
Illinois
Staying on topic, we need to see what the GPU block on Apple Silicon for the Mac will look like. Remember that right now, on the iPhone and iPad, the GPU piece is extremely powerful - and in a VERY constrained environment in terms of both power and thermals. Let's see what they have up their sleeve for a less constrained environment.
 
  • Like
Reactions: CWallace

Joe Dohn

macrumors 6502a
Jul 6, 2020
840
748
The truth is that Apple did not change from ARM to x86. Apple switched from PowerPC to x86.

True, PowerPC was a RISC processor too; my mistake. But it doesn't change the fact that transitioning to x86 helped create a much richer and more useful system for users.
 

dnewkirk

macrumors newbie
Jun 8, 2015
26
22
Los Angeles
Apple has also absorbed entire GPU teams from AMD (and possibly, in part, from Nvidia). They have extensive experience from both a design and a software/compute perspective (they originated OpenCL and Metal). I would expect their initial GPUs to be competitive in both laptop and desktop form factors, with the exception perhaps of the high end. If the key game engines are designed to take advantage of Apple's APIs and silicon, the game selection shouldn't take a huge hit long term. If the difference in Apple Silicon performance is substantial, Macs may carve out a larger user base, which may help increase interest in supporting Apple silicon, leading to an even better situation than at present. Lots of ifs, but it's not inconceivable. Also, as many have noted, Apple has gone about this transition extremely well thus far, which again helps encourage developers to make the jump and increases application support (including games) from day one.
 
  • Like
Reactions: 2Stepfan

thisismyusername

macrumors 6502
Nov 1, 2015
476
729
Ideally, the point of discussion: even though Windows may have better graphics in the short term or near future, what makes you still think the Mac is a better choice if both OSes can work for you? Of course, other discussion is welcome.

Nothing changes in the Mac world with these new Nvidia cards, because those are basically for gamers, and gamers aren't gaming on a Mac. They're building their own PCs even if they prefer to use Macs for non-gaming purposes.

It will be locked down to the App Store to load anything anyway.

It amazes me that some here still think this. Apple has made it very clear they have no intention of doing that. Plus, the type of CPU architecture has absolutely nothing to do with where users can and can't get their apps. If Apple really wanted to lock Macs down to the App Store, they would have done that years ago.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
True, PowerPC was a RISC processor too; my mistake. But it doesn't change the fact that transitioning to x86 helped create a much richer and more useful system for users.

Yeah, because Intel CPUs were faster and better. And it is also true that using x86 CPUs simplified porting of popular software to Macs (PowerPC was very different, after all). But then again, the ARM64 that Apple uses now is pretty much binary compatible with x86-64, so porting to Apple Silicon is trivial most of the time.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
...at any rate neither of these GPUs is very interesting for Macs because of their extreme power consumption.

Plus the fact that Nvidia GPUs are not supported on macOS at all...

The Mac Pro has a 1000-watt power supply, though it might not have the 12-pin connector.

None of the PSUs out there have a 12-pin connector. Nvidia ships an adapter (two 8-pin to one 12-pin) with every 3000-series GPU that needs it...

But it's a moot point altogether with regard to any 3000-series GPU being used on a Mac, because Nvidia GPUs are not supported on the Mac platform...

If you are a gamer, you're always gonna be better served by buying a PC with the latest and strongest GPU from Nvidia or AMD for about $1500...

How can you build a PC with the "latest & greatest" for 1500 bucks when the "latest & greatest" costs that much itself...?!? The Nvidia GeForce RTX 3090 is a $1500 GPU...

Nothing changes in the Mac world with these new Nvidia cards, because those are basically for gamers, and gamers aren't gaming on a Mac. They're building their own PCs even if they prefer to use Macs for non-gaming purposes.

Finally, a voice of reason...!
 

AutisticGuy

macrumors member
Feb 1, 2018
97
176
Come on. Let's get a serious discussion here.

Yes, you can still virtualize systems and run Rosetta to run Intel apps. However, as far as we know, virtualization will only be available for ARM, and Rosetta will not run 32-bit software. And we have PLENTY of it to run, especially old games.

So, it's not that Apple is lying: the problem is that they are overly optimistic, thinking users won't miss x86 / Windows software. But they are really underestimating how some software might be wanted and/or necessary.

Another issue: do you really think Nvidia, AMD, and Intel will just sit down and wait for Apple to steal their share? Of course not. As we speak, they are probably moving toward faster processors and graphics cards. So Apple's advantage (if any) may well diminish in the long run.

Intel hasn't been sitting down and waiting for Apple (or AMD) to steal their share. They can't successfully shrink their die and simultaneously manufacture it with satisfactory yields. And while AMD now has a superior design, it took them years to catch up. And again, while AMD is showing performance gains that are good for x86, they aren't going to be able to compete with Apple's performance gains and performance per watt.

While Nvidia hit a major home run with the release of these video cards, they aren't exactly a diversified company. Their success depends on x86. Everything seems to be pointing to the possibility of x86 ending. It looks like it's on its last legs.

With the handwriting potentially on the wall, ARM is going to be a more compelling future, and software companies will flock to Apple if it can gain some market share and show inherent performance advantages.
 

MisterMe

macrumors G4
Jul 17, 2002
10,709
69
USA
Yeah, because Intel CPUs were faster and better. And it is also true that using x86 CPUs simplified porting of popular software to Macs (PowerPC was very different after all). But then again ARM64 that Apple uses now is pretty much binary compatible with x86-64, so porting to Apple Silicon is trivial most of the time.
You are conflating several issues. Binary refers to the machine code level. ARM is a relatively modern RISC; x86 is an evolution of ancient CISC. They are not "pretty much binary compatible" by any stretch of the imagination. As for byte order, x86 is little-endian, whereas ARM is now bi-endian, meaning the byte order can be either big-endian or little-endian. This has to do with data handling, not program code.
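To illustrate the byte-order point, a tiny sketch that prints the byte order a program runs under. Both macOS targets come out little-endian in practice: Apple configures its ARM64 cores little-endian even though the architecture is bi-endian capable.

```c
/* endian_probe.c -- prints the byte order the CPU is running in.
 * Apple runs ARM64 little-endian, the same as x86, even though the
 * ARM architecture itself is bi-endian capable. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t x = 0x01020304;
    const unsigned char *p = (const unsigned char *)&x;
    /* Little-endian stores the least significant byte first. */
    printf("%s-endian\n", (p[0] == 0x04) ? "little" : "big");
    return 0;
}
```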

As for porting software, this has to do with the APIs and frameworks used by the OS. After NeXT dropped its hardware line, Steve Jobs ran OpenSTEP on a white-box PC. When he moved over to Apple, Apple transitioned its OS to the OpenSTEP-based Mac OS X and adopted the NeXT development system, which eventually evolved into Xcode. Jobs ran Mac OS X on his white-box PC. The first Hackintosh sat on Steve Jobs's desk.

From the development of Mac OS X 10.0 to the current development of macOS 11.0, and OpenSTEP before them, Mac OS X/macOS has run on multiple processor families. Apple's stated reason for this is to ensure that processor dependencies do not creep into the code. This means that software written in high-level languages against macOS frameworks can be ported with a "simple recompile" to any supported platform. If you code part of your project in assembly language, then that part must be rewritten in the assembly language of the target, or in a high-level language supported by Xcode.

x86 does not affect the ease of the port if you use a high-level language. There are processors whose assembly language is easier to code in than x86's.
 
  • Like
Reactions: LinkRS

MrX8503

macrumors 68020
Sep 19, 2010
2,293
1,615
Come on. Let's get a serious discussion here.

Yes, you can still virtualize systems and run Rosetta to run Intel apps. However, as far as we know, virtualization will only be available for ARM, and Rosetta will not run 32-bit software. And we have PLENTY of it to run, especially old games.

So, it's not that Apple is lying: the problem is that they are overly optimistic, thinking users won't miss x86 / Windows software. But they are really underestimating how some software might be wanted and/or necessary.

Another issue: do you really think Nvidia, AMD, and Intel will just sit down and wait for Apple to steal their share? Of course not. As we speak, they are probably moving toward faster processors and graphics cards. So Apple's advantage (if any) may well diminish in the long run.

The majority of users won't even notice the difference during the transition. Intel had a decade to turn this around and develop faster processors; do you think they've suddenly figured it out?
 

burgerrecords

macrumors regular
Jun 21, 2020
222
106
Intel had a decade to turn this around and develop faster processors

For desktops, Intel processors are fast with very little downside.
They were more expensive than they are now due to a lack of competition over the past decade; they have now been forced to be competitive. Nvidia desktop GPUs are also fast. AMD is there too for CPUs and getting there with GPUs.

Until Apple demonstrates something otherwise, the opportunity and benefits of ARM/Apple Silicon are almost entirely related to portables. The possibility of a portable that's fast like a desktop is very appealing.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
You are conflating several issues. Binary refers to the machine code level.

Apologies, I should have been more specific. When I talk about binary compatibility, I mean data. Basic data sizes, alignment constraints, formats, etc. are pretty much identical between ARM64 and x86-64, which in turn means that fundamental algorithmic invariants stay unchanged no matter which target you compile for. This is not the case, say, for PowerPC vs. x86, or even x86 vs. x86-64. In fact, porting from x86-64 to ARM64 is probably easier than from x86 to x86-64.
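A small sketch of what I mean: these compile-time checks of sizes, alignment, and struct padding pass unchanged on x86-64 and ARM64 macOS (both LP64 platforms), whereas on 32-bit x86 or PowerPC several of them would fail.

```c
/* layout_check.c -- a few data-layout invariants shared by x86-64
 * and ARM64 (both LP64 on macOS/Linux); compiles and passes
 * unchanged on either architecture. */
#include <assert.h>     /* static_assert (C11) */
#include <stdalign.h>   /* alignof (C11) */

struct pair { int a; double b; };  /* 4 bytes + 4 padding + 8 bytes */

static_assert(sizeof(int) == 4, "int is 32-bit");
static_assert(sizeof(long) == 8, "long is 64-bit (LP64)");
static_assert(sizeof(void *) == 8, "pointers are 64-bit");
static_assert(alignof(double) == 8, "double is 8-byte aligned");
static_assert(sizeof(struct pair) == 16, "same struct padding on both");

int main(void) { return 0; }
```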

There are some caveats, of course. Aside from the obvious ones (such as inline assembly or CPU-specific features), a big one concerns parallel programming. Apple CPUs are weakly ordered, which means that your multithreaded code might contain a data-race bug that triggers on ARM but not on Intel.
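A contrived sketch of the kind of bug I mean (hypothetical flag/payload names): the writer's two plain stores may be observed out of order on a weakly ordered ARM core, while x86's stronger store ordering lets the broken code appear to work. The fix is C11 atomics with release/acquire ordering on the flag.

```c
/* race_sketch.c -- illustrates a latent data race that weakly ordered
 * CPUs (like Apple's) can expose. This code is WRONG on purpose; both
 * accesses to `ready` need C11 atomics with release/acquire ordering. */
#include <pthread.h>
#include <stdio.h>

static int payload = 0;
static int ready = 0;   /* should be _Atomic int with explicit ordering */

static void *writer(void *arg) {
    (void)arg;
    payload = 42;       /* store 1 */
    ready = 1;          /* store 2: may become visible BEFORE store 1 on ARM */
    return NULL;
}

static void *reader(void *arg) {
    (void)arg;
    while (!ready) { }  /* busy-wait; also UB without atomics */
    /* On x86-64 this reliably prints 42; on ARM it may print 0. */
    printf("payload = %d\n", payload);
    return NULL;
}

int main(void) {
    pthread_t w, r;
    pthread_create(&r, NULL, reader, NULL);
    pthread_create(&w, NULL, writer, NULL);
    pthread_join(w, NULL);
    pthread_join(r, NULL);
    return 0;
}
```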

As for porting software, this has to do with the APIs and frameworks used by the OS.

Not really relevant here, since the API is identical. Again, some caveats: latent bugs that have to do with platform-specific assumptions not guaranteed by the API (e.g., page size, CPU timer granularity, etc.).
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Come on. Let's get a serious discussion here.

Yes, you can still virtualize systems and run Rosetta to run Intel apps. However, as far as we know, virtualization will only be available for ARM, and Rosetta will not run 32-bit software. And we have PLENTY of it to run, especially old games.

First, Apple started notifying folks 3-4 years ago that 32-bit macOS apps were being phased out. Developers knew that Carbon (a decent chunk of 32-bit legacy apps) was a dead end long before that. 32-bit apps were dead whether Apple moved to ARM or not, the same way 32-bit kernel booting went 5-7 years ago.

Second, virtualization isn't only available for ARM. Apple has had hypervisor support in macOS since at least 2017:

"...
April 17, 2017By Veertu Team
Today, Veertu Labs is thrilled to pull back the curtain on three Anka toolsets, our latest endeavors geared ..."


This has been deployed at wide scale in production instances for years.

https://www.macstadium.com/anka



macOS 11 is going to add more virtualization support, but the notion that this is brand new, or Apple Silicon-only, is on super thin ice. Apple Silicon for the Mac is probably going to add either features or more performance for the IOMMU (VT-d in Intel terms) through the standard macOS hypervisor API. Kernel extensions that don't move to System Extensions (which depend on IOMMU mapping) are being kicked out of the kernel on x86-64 also. They are going to be just as "dead" as 32-bit apps are. None of this is particularly due to the instruction set of the CPU. It is Apple not wanting to be permanently saddled with legacy baggage they don't want to deal with in the future.

What is "new" is Apple punting on any options other than virtualization on Apple Silicon. Decent chance there is no "open" boot environment (i.e., No BIOS clone , no EFI , no UEFI ). That is it is straight iBoot ( which was present on T2 Macs to some small shim to boot macOS on internal or external APFS volume and that is it).

So, it's not that Apple is lying: the problem is that they are overly optimistic, thinking users won't miss x86 / Windows software. But they are really underestimating how some software might be wanted and/or necessary.

It isn't really a matter of whether there are some (any) users who will miss it; it is a matter of how many. Look hard enough and you can probably find someone who'd like to run MacPaint or HyperCard 1.0.

Apple knows, probably better than most end users, how many folks download the Windows drivers for Apple hardware. The primary location for those is Apple's own servers, so the number of downloads isn't that hard to get. It's also not hard to get how many folks download the macOS system updates and upgrades.

Underestimating? Probably not. Assigning some outsized proportion of importance to them? Again, probably not.


2020 isn't 2005. The Mac user base is much larger now: up around 100M, versus something with one less digit 15 years ago. The inflow momentum from the iPhone is also not even in the same zip code as what flowed in from the iPod. For the x86 transition, Apple was a "follower". For the ARM transition, Apple is more of a leader into this zone (even though Windows has had two or more cracks at Windows on ARM, there isn't much there). There is a far larger base of iOS-on-ARM software than there is a Windows-on-ARM software ecosystem. The inertia on ARM is either iOS or Android; it is not Windows.

Another issue is: do you really think Nvidia,AMD and Intel will really sit down and wait for Apple to steal their share? Of course not. As we speak, they are probably moving towards faster processors and graphics cards. So, Apple's advantage (if any) might as well diminish on the long run.

Yeah, but as already said, they aren't looking for "max performance"; they are looking for perf per watt. They are far, far more interested in moving "laptops" up into the "desktops" performance space than in some huge leap over current desktop performance. There is a slide from WWDC that made that quite explicit. (Not really guessing there.)

They aren't trying to win the biggest-ever super-duper FLOPS race. They are just trying to make Macs "better" (probably in the case-design direction Apple has already been driving toward, e.g., the one-port-wonder MacBook).
For laptops, Apple is most assuredly out to steal all of those other players' shares. Period. And since they get to make the parts-selection call for those laptops, they will quite probably select all their own stuff 2-3 years from now (if not sooner).

There are some desktop corner cases I don't think Apple is going to drive too deep into, because the unit volume is too low. Some Thunderbolt external PCIe enclosure cases are the same way. GPUs on a removable card Apple will probably leave to AMD (and Intel, if they get their act together, don't burn bridges with Apple, and put in the work). The embedded (soldered to the logic board) work is going to be harder to get, but probably possible in a couple of product instances.

To bring it back to this thread's topic, though: in order to keep up with Nvidia's 30xx, 40xx, and 50xx series, Apple will probably need to lean on AMD and/or Intel to cover that. At the upper end of that range, the Mac GPU card volume is way too low for Apple to mess with their own stuff. Lots of work to do and not much payoff for the work.
 
  • Like
Reactions: 2Stepfan

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
I'm surprised that no one is bringing up the power draw of the 30xx series. People always harped on AMD for being "hot and loud", but now there's nothing but silence about the fact that the lowest-power card announced draws damn near the amount of power their high-end card did two gens ago. Not even a peep about the absurd 350 watts (and more with board-partner cards) that the 3090 draws?

Isn't it a pastime to roast Intel over making nuclear-furnace processors, despite the fact that they still hold the performance crown in games? What makes Nvidia different?

We haven't even seen independent benchmarks yet; let's not forget that either. All we have is Nvidia's word.

And for $600.00? Wow. Are we forgetting that the 1070 cost $380 at launch? Have we already been suckered into accepting a nearly 60% markup in just 3-4 years?

Apple aside, the whole ass-kissing over Nvidia just puts a bug up my ass. Apple will be fine, I'm sure, but there's no way they're gonna be putting a 350 W graphics processor in anything.
 

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
.......
Intel hasn't been sitting down and waiting for Apple (or AMD) to steal their share. They can't successfully shrink their die and simultaneously manufacture it with satisfactory yields. And while AMD now has a superior design, it took them years to catch up. And again, while AMD is showing performance gains that are good for x86, they aren't going to be able to compete with Apple's performance gains and performance per watt.
......

I don't know if Intel has been "sitting down", or standing still, or otherwise, but I do know that in 2016 Intel had over 80% of the CPU market share, and that right now, in September 2020, over 80% of the desktop CPU market share is with AMD. I would say that Intel has not progressed well enough in performance to prevent that from happening. With the Ryzen 4000 laptop CPUs/APUs, there is a pretty good chance the same will happen in laptops. Was Intel standing still? In the sense of technology, YES. Financially, no; they were making Wall Street happy. And then it became obvious that not only was Intel standing still, they were losing out to AMD. And then Apple pulled the plug on Intel CPUs for the Mac line.

Intel has stood still, and will continue to stand still, until one of two things happens: either upper management starts to understand what is going on and makes the investment and commitment to get their process to the point at which their CPUs can become competitive, or they start to go to outside silicon foundries with better processes. Continually fiddling around with clock speeds, hyperthreading, and core counts is not progress; it is standing still. Until Intel can bring process improvements to market, they ARE standing still. When that will happen is unknown, but it isn't happening any time soon.
 