I ordered the 64GB first, then cancelled and went with 32GB instead.

My thinking is that if 32GB isn't enough, then chances are the M1 Max won't be enough either, and a standard 32GB/1TB M1 Max will be easier to sell in a year's time when the Mac Pro comes out.
 
Wanted 64GB, but settled for 32. For several reasons:
  • Based on reports, the M1 MacBook Airs with 8GB handle most tasks VERY well. My father got a 16GB model; he runs a VM and multiple "office"-type apps at the same time and it has yet to bog down. I figure double that RAM should do me for most tasks.
  • I'm cheap.
  • The CFO (wife) would kill me, right in the face.
  • My damn kids insist they want to go to college one day.
  • My current 16GB 2019 work MacBook works VERY well for most tasks. My personal machine will be used for more intensive things, but I'm gambling that the 32GB will do fine.
 
I guess, basically (without knowing much about this stuff), I want to know whether Parallels will look and work the same way it did on my Intel machine.

Thanks for the help
No. On x86 you can run other x86 OSes in virtualization, such as Windows for "Intel". On an Apple SoC you'll have to run Windows for ARM.

Virtualization only passes the host CPU architecture through to the guest via a hypervisor, so guest CPU operations run at near-native speed. Running x86 guests on an Apple SoC requires emulating the CPU, and that comes with a large speed penalty.
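To see the difference concretely, here's a tiny sketch (plain Python, nothing Parallels-specific; the wording of the messages is my own) that reports which CPU architecture the guest OS actually sees:

```python
# Run inside a guest OS to see which CPU architecture the hypervisor exposed.
# On an Apple SoC, a virtualized guest reports arm64/aarch64; seeing x86_64
# there would mean the CPU is being emulated, with the speed penalty above.
import platform

arch = platform.machine()
if arch in ("arm64", "aarch64"):
    print(f"{arch}: ARM guest on an ARM host -> virtualization, near-native speed")
elif arch in ("x86_64", "AMD64"):
    print(f"{arch}: x86 guest -> on an Apple SoC this means CPU emulation")
else:
    print(f"other architecture: {arch}")
```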
 
Wanted 64GB, but settled for 32. For several reasons:
  • Based on reports, the M1 MacBook Airs with 8GB handle most tasks VERY well. My father got a 16GB model; he runs a VM and multiple "office"-type apps at the same time and it has yet to bog down. I figure double that RAM should do me for most tasks.
  • I'm cheap.
  • The CFO (wife) would kill me, right in the face.
  • My damn kids insist they want to go to college one day.
  • My current 16GB 2019 work MacBook works VERY well for most tasks. My personal machine will be used for more intensive things, but I'm gambling that the 32GB will do fine.

It's only $400 difference. Just skip the occasional Starbucks or cook at home instead of dining out for a few weeks.
 
It's only $400 difference. Just skip the occasional Starbucks.
Lol that's kind of how I did the math in my head. I can live without Starbucks coffee and other impulse snacks when I have it at home 👌

64GB for me.
 
It's only $400 difference. Just skip the occasional Starbucks or cook at home instead of dining out for a few weeks.
Already doing those things because I'm cheap. That said, you are spot on: $400 extra is not much in the scheme of a machine I'll be keeping for at least 5 years.
 
I guess, basically (without knowing much about this stuff), I want to know whether Parallels will look and work the same way it did on my Intel machine.

Thanks for the help

I am running it on an M1 MBA with 8GB, and Parallels/Windows on ARM runs, but slowly; Parallels likes a lot of memory. I ran it on a 2018 MBP with 16GB and an i9 and it was plenty fast, about that of a decent Windows laptop like an HP with a Ryzen chip. I ordered a base 14" w/16GB to see if it runs acceptably fast to use PowerBI. If it doesn't, I'll return the MBP, as the MBA does everything else I need, and use my 2018 to run PowerBI.
 
I'm just getting the base 14" 16GB.

$400 to double to 32 GB is crazy. I don't do anything that intensive (light Photoshop, LR CC, some 1080p (maybe at some point 4K) video editing). Not worth $400 extra when it will never really be put to good use.
 
I'm just getting the base 14" 16GB.

$400 to double to 32 GB is crazy. I don't do anything that intensive (light Photoshop, LR CC, some 1080p (maybe at some point 4K) video editing). Not worth $400 extra when it will never really be put to good use.
Well, that depends.
The situation now is the same as when Apple offered 4GB, 8GB, and 16GB of RAM. Back then, some said, "I don't need 16, as 8 is good enough," etc.

That was about 8 years ago.
 
I ordered 64 so I can laugh in the face of people with inferior MacBook Pros.

I am even laser cutting huge decals for the back of my MacBook Pro:

"Back off, I have 64 GBs of RAM", "Go big or go home, I have 64 GBs of RAM", "32 GB MacBook Pros are Pathetic"
 
I'm actually working with large datasets. I'm a back-end developer, and at any given time I'm working on multiple websites that each have databases several GBs in size behind them.

The memory pressure graph on the 8GB 13" M1 MBP I trialed was red almost the entire time and on some days I was generating nearly 1TB of SSD writes from swapping. That surprised me because I wouldn't have realized how hard it was working without running monitoring applications.
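If anyone wants to spot-check the same thing on their own machine, something like this rough sketch is all I mean by "monitoring" (it assumes the third-party psutil package, not any particular monitoring app):

```python
# Rough spot-check of RAM and swap usage from a script.
# Assumes the third-party psutil package (pip install psutil).
import psutil

vm = psutil.virtual_memory()
swap = psutil.swap_memory()

print(f"RAM:  {vm.percent:.0f}% of {vm.total / 2**30:.1f} GiB in use")
print(f"Swap: {swap.used / 2**30:.2f} GiB in use "
      f"({swap.sout / 2**30:.2f} GiB swapped out cumulatively)")

# Watching the swapped-out total climb over a workday is how the heavy
# SSD writes show up.
```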

I suppose had I compared performance against a 13" 16GB M1 MBP instead of a 32GB i7, that extra memory may have yielded significant benefits, so I'm not saying 8GB is all you need. I just think most people are stuck in a 2009 mindset and are greatly overestimating how much RAM they need.

That said, if the cost isn't an issue to you, there's no harm in buying a little extra just in case.
It's all in one's definition of "large" and one's use case. 32 GB of double precision numbers still fills 32 GB....
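Back-of-the-envelope, just to put a number on it (NumPy assumed here, purely illustrative):

```python
# How many double-precision values fit in 32 GB of RAM?
import numpy as np

bytes_per_double = np.dtype(np.float64).itemsize    # 8 bytes
ram = 32 * 2**30                                     # 32 GiB

n = ram // bytes_per_double
print(f"{n:,} doubles")                              # ~4.3 billion values

side = int(n ** 0.5)
print(f"i.e. one dense {side:,} x {side:,} matrix")  # 65,536 x 65,536
```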
 
It's all in one's definition of "large" and one's use case. 32 GB of double precision numbers still fills 32 GB....
The size of the dataset is usually unimportant to me; how it’s accessed is the critical property. If you have to jump randomly all over a big dataset, you might want to spend some time figuring out how to restructure it or your code.

Sometimes that’s not possible, and in that case 32GB vs 64GB is probably irrelevant. To me a “large dataset” is a few TB minimum. Rent some time in the cloud if time is of the essence or just crunch away on your personal machine if it’s a pet project.
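Here's a toy illustration of what I mean about access patterns (NumPy assumed, sizes kept small enough to run on anything): the gather below touches exactly the same data both times, but the shuffled index order defeats the cache and prefetcher.

```python
# Same data, same amount of work, very different access pattern.
import time
import numpy as np

data = np.arange(50_000_000, dtype=np.float64)      # ~400 MB array

in_order = np.arange(len(data))                     # sequential indices
shuffled = np.random.permutation(len(data))         # same indices, random order

for name, idx in (("sequential", in_order), ("random", shuffled)):
    t0 = time.perf_counter()
    total = data[idx].sum()
    print(f"{name:>10}: {time.perf_counter() - t0:.2f}s  (sum = {total:.4g})")
```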
 
The size of the dataset is usually unimportant to me; how it’s accessed is the critical property. If you have to jump randomly all over a big dataset, you might want to spend some time figuring out how to restructure it or your code.

Sometimes that’s not possible, and in that case 32GB vs 64GB is probably irrelevant. To me a “large dataset” is a few TB minimum. Rent some time in the cloud if time is of the essence or just crunch away on your personal machine if it’s a pet project.

The "cloud" isn't something I use. The "cloud" is "someone else's computer" with slow access. These sorts of datasets aren't databases, but single problems and processes, so its all in physical RAM in an MPI parallel job. Everyone has their own use case. Problems expand to fit the available resources. What was a Grand Challenge problem 10 years ago isn't anymore.

It's nice to be able to run small problems on a local system before moving to a cluster.
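For what it's worth, the shape of those jobs is roughly this sketch (mpi4py assumed purely for illustration; the real codes are obviously much bigger): each rank keeps its slice of the problem in physical RAM, and results get combined across ranks.

```python
# Minimal sketch: the whole problem lives in physical RAM, split across ranks.
# Assumes mpi4py; run with e.g.  mpirun -n 4 python sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 10_000_000               # ~80 MB of doubles per rank (toy size)
local = np.random.rand(n_local)    # this rank's slice of the problem

local_sum = local.sum()            # some local computation
global_sum = comm.allreduce(local_sum, op=MPI.SUM)

if rank == 0:
    gib = size * n_local * 8 / 2**30
    print(f"{size} ranks, ~{gib:.2f} GiB resident in total, sum = {global_sum:.4e}")
```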
 
It's all in one's definition of "large" and one's use case. 32 GB of double precision numbers still fills 32 GB....

And to be totally honest, I don't think I do anything impressive.

My real point was that everyone goes around saying 8GB is barely enough to keep the lights on because that's what everyone else keeps saying. "32 is the new 16" is replacing "16 is the new 8" as a catchphrase as if hardware was locked to 2009 specifications and software continues on its own timeline and never gets optimized.

My wife has an 8GB Intel MBA. I needed an alternate machine to compile a build in Xcode, so I borrowed hers. It far exceeded my expectations.

FWIW, I still ordered a 32GB 16" because the extra RAM wasn't a significant cost to me, but I would have been plenty happy on 16GB if my budget was tighter.
 
I did go with the 32GB/512GB now, and it's the standard 8-core M1 Pro CPU.
Delivery in the middle of November, sadly :(
 