
How much RAM do I need?

  • 16 GB

    Votes: 41 (47.7%)
  • 32 GB (+€480)

    Votes: 33 (38.4%)
  • 64 GB (+€960)

    Votes: 12 (14.0%)

  • Total voters: 86

Camille R

macrumors newbie
Original poster
Nov 22, 2019
2
0
Montpellier, FRANCE ☀️
Hi,

I want to buy the new 16" MBP i9, but I don't know how much RAM to get.
The apps I use daily are: Firefox, Slack, Ableton Live, Spotify, VS Code (I'm a Rails dev)...
Currently, my MacBook has 16 GB of RAM and I'm never limited.
Should I stay at 16 GB, or should I buy 32 GB for the future?

Thanks!
 
Load up everything you usually have going at any given time. Then open Activity Monitor and check your memory usage. Make your decision based on that while also factoring in how long you plan to keep this machine.
 

Here is what I get with my typical usage:
[Attachment: Capture d'écran 2019-11-22 à 12.06.24.png — Activity Monitor memory usage screenshot]
 
Interested in this, as I'm having the same dilemma myself: 16GB or 32GB. I currently have a 13" MBP with 8GB RAM, which seems fine, so I reckon 16GB would be fine, but the comments on this thread will be enlightening. Mostly :)
 
It really depends on how long you want to keep it, how likely your work is to change, and how much you're willing to spend. Going against what most people suggest with 16 GB, I say 32 GB if it's within your budget. If not, don't stretch and also don't worry about it as you'll most likely be just fine.

I'm also a developer and just ordered my 16" with 32 GB. Right now I'm on a 15" with 16 GB, and while macOS does a great job handling memory when you're getting close to the limit, I've hit the ceiling many times when running a typical setup, i.e. Visual Studio Code, an iOS Simulator, Docker, a few Node processes, and all the stuff that comes with that, like Slack, Discord, a bunch of Safari tabs, Spotify, Messages, Things, iTerm2. It's only ever been a problem a few times, but I figured I'll have this next MacBook for a good few years and that the extra would make my life easier.

Each to their own, though.
Can you add extra RAM at a later date? Either yourself, or via an Apple Store?

No, you can't add it later so you need to decide when you're buying the MacBook.
 
Honestly, the base machine will be enough. But it's your money. I'm a Node.js developer and I would pick the base machine, but since I'm planning to get into FCP video editing I'm bumping the specs; otherwise I wouldn't bother and would spend the rest of the money on something else.
 
I vote 32GB.

@casperes1996 also brings up a good point, why the i9 with your usage?

Well, I am facing the same decision, and when I ran Activity Monitor I got the same results you did, showing about 14GB of 16GB being used, so to me that was close enough to want to upgrade to 32GB.

Also, I'm thinking about an i9 for two reasons: I do want the 1TB SSD, and the other is resale value/trade-in. By the time I upgrade again, the i7 will seem ancient because the i9 will have been mainstream for a while.
 

You will always get faster compile times with a better CPU or more cores (and a generally snappier computer), but more RAM will only yield performance if your workflow utilises it. In theory even many simple tasks can utilise 8 cores.
 

Keep in mind macOS will allocate more memory than it strictly needs for caching purposes, and if you have plenty to spare it also won't clear out memory once it is done with it until that memory is needed for something else, so just reading the top "memory used" number doesn't tell you much. If you look at RAM the traditional way, only consider app memory + wired. Or just look at the pressure graph :)
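To put numbers on that, here's a quick sketch (the figures are made up for illustration, not real Activity Monitor output) of how the headline "memory used" figure can overstate things compared to the traditional app memory + wired reading:

```python
# Hypothetical Activity Monitor readings, in GB. The headline number
# also counts compressed and cached pages the OS can give back on
# demand, so "app memory + wired" is the more honest traditional measure.
readings = {
    "app_memory": 9.5,    # memory actively used by applications
    "wired": 2.1,         # kernel memory that can never be paged out
    "compressed": 1.8,    # pages compressed to avoid swapping
    "cached_files": 2.4,  # file cache, reclaimable on demand
}

headline_used = sum(readings.values())                    # what the top number suggests
traditional_used = readings["app_memory"] + readings["wired"]

print(f"headline: {headline_used:.1f} GB, traditional: {traditional_used:.1f} GB")
```

Same machine, but the traditional reading is several GB lower, which is why the pressure graph is the better signal.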


Right, but the i9 won't become mainstream. A new i3/i5/i7/i9 lineup will come along, with the performance of the current i9 in the new i7. I assume you didn't mean literally the same i9, so that's just pedantic. The point I want to get to is that the i9 at that stage will also seem old, aging pretty much as quickly as the i7. Multi-threaded performance may stay more on par with future products, but energy consumption and heat generation will be very high compared to the processors of the future, and future chips will come with new instructions and Intel Xe. Case in point: buying a faster chip for future-proofing has historically never paid off, as the advancement in processor tech over a year is worth more than the money you'd spend now on the more expensive chip.

And I say that as someone who would buy the i9 config myself if I were to get one now ;P.

Compiling is actually mostly limited by storage speed. Linking (a step in compilation) is most of the time single-threaded. Some parts of the compilation workflow will benefit from more cores, yes, but most of what we wait for when we compile won't benefit greatly. Plus you rarely compile all your components. That only happens if you pull in a significant set of git changes from others that fundamentally changes things. Normally your build chain is intelligent enough to only compile modified modules and relink them, and again, the linker is mostly single-threaded. So you wind up with storage speed being the biggest factor in compile times, by far. At least for day-to-day compiles. Compile the whole Linux kernel with all the modules and it's a different situation.
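A toy illustration of why day-to-day builds touch so little: a make-style tool compares timestamps and recompiles only stale modules, while the (largely single-threaded) link step runs every time. The module names and timestamps are invented for the sketch:

```python
# Toy incremental build: recompile only modules whose source is newer
# than their object file; the link step runs every time regardless.
def plan_build(sources, objects):
    """sources/objects map module name -> last-modified timestamp."""
    stale = [m for m, t in sources.items()
             if m not in objects or t > objects[m]]
    # Only stale modules get compiled (parallelisable); linking always happens.
    return {"compile": sorted(stale), "link": True}

sources = {"app": 100, "net": 250, "ui": 300}
objects = {"app": 200, "net": 200, "ui": 200}   # everything last built at t=200

print(plan_build(sources, objects))
# only "net" and "ui" changed after the last build, so only they recompile
```

Even with eight cores, the parallel part here is two small compiles; the serial link and the disk I/O dominate, which is the point above.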
 
I am inclined to disagree. I find updates of Microsoft products, notably Office, becoming more and more memory hungry. Running massive spreadsheets in Excel, I have gone all the way, as 32 GB is not going to cut it. Also, 4K streaming through Safari or Google does not seem that memory efficient. I have always used the maxim of buying the best I can and then keeping it for 7 or 8 years. Then it is retired as a backup in case my latest love affair ever fails, which is important when meeting client deliverables.

Not sure my approach aids resale value, but I have written the machine off on the books by then and it just provides a reliability backup.

Microsoft's philosophy always seems to be to add bells and whistles that suck up memory and processor time, and to make shiny things to attract the young kids.
 

Respectfully, what I said wasn't opinion but fact.
Microsoft products may well eat up a lot of memory; I don't use them. You may also very well have hit your RAM ceiling. But in no way does that contradict what I stated.
 

If you actually use 32 GB, that is a really big spreadsheet. But remember that Excel showing it is using 32 GB of memory is completely different from Excel needing 32 GB.

The operating system will let processes keep all the memory they ever touch, unless other processes or threads need it. It is a "just in case you might need this again" optimization. So if a process opens a big file and allocates a big buffer to read it in, that memory stays allocated to the process even though the process is no longer using it. If some other process or thread needs it, the OS will reallocate some of that memory.

This is why you need to check memory pressure. That shows whether the OS is spending a lot of time allocating and deallocating memory between processes and threads. If it goes yellow or red, there is an issue. If it stays low and green, you have enough memory.
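As a rough sketch of that idea (the thresholds below are invented for the example, not Apple's actual heuristics), pressure is about how hard the OS is working to reclaim memory, not how much is nominally in use:

```python
# Illustrative only: classify memory pressure from the fraction of free
# memory and swapping activity. Thresholds are made up; macOS uses its
# own internal heuristics.
def pressure(free_fraction, swapouts_per_sec):
    if free_fraction < 0.05 or swapouts_per_sec > 100:
        return "red"      # actively thrashing: more RAM would genuinely help
    if free_fraction < 0.15 or swapouts_per_sec > 0:
        return "yellow"   # reclaiming often: nearing the limit
    return "green"        # plenty of headroom, whatever "used" says

print(pressure(0.40, 0))    # green
print(pressure(0.10, 5))    # yellow
print(pressure(0.02, 300))  # red
```

Note the middle case: most memory is "used", but with no sustained swapping the machine is only at yellow, which matches the advice to watch the graph rather than the used number.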
 

I use the App Store versions of the Microsoft Office suite every day, all day, on my current MBP 13, which is an i5 with 8GB RAM. I do tend to find that they run a little slow at times; whether this is memory related or the applications themselves, I'm not sure. I was about to order a base model MBP 16 and decided at the final moment to go with 32GB RAM as the only upgrade. I'm sure that 16GB would be fine right now, but I went with 32GB for future-proofing peace of mind, as I intend to keep this one for 5 years, and as I wasn't budget constrained (for once!) I thought, why not.
 
I'm looking at my memory usage right now. Outlook is using 1.9 GB of memory. Why? I don't have a bunch of emails open. No mail app should require this much memory. I'd bet my bottom dollar that Microsoft does this on purpose on Macs.
 

This was very helpful, thanks! I looked further into this, read the support doc on Apple's website about Activity Monitor, and am now more informed. My app + wired memory is 13GB of the 16GB I have installed. With only 3GB not being used, I would say upgrading to 32GB makes sense.


If the i7 is sticking around, that would change things. I guess I was not thinking clearly: the i9 is a new addition to the "i" family, and I was thinking more along the lines of a different chip/marketing name like "Intel Sonic", whereas the Core "i" series would be outdated from a marketing perspective.
 

Intel may eventually move to a different naming scheme, leaving the "Core i" branding behind. Frankly, I think Apple will have left Intel chips before that happens. The Core i naming has stuck around for a long time now, and there's nothing in the cards suggesting a marketing shake-up in the near to mid term, only in their graphics venture, going from Gen graphics to Xe graphics.
The only reason Core i9 was invented as a naming tier was to justify a price bump and a hotter chip on the mainstream platform to compete with AMD's offerings, while still allowing Intel to sell i3s at the lower end without moving the whole stack along.

My current Mail.app instance is sitting at 113.8MB of virtual memory allocation...
 