
Eric40962005

macrumors newbie
Sep 23, 2021
14
8
Anyone else here still confused by what the OP means? Games and rendering apps are still applications. Why do you have three types of RAM in your math?

Sorry, OP, it seems like you mean well.

I’m still laughing at him trying to bring a 3-year-old workstation GPU into his argument. No offense to him, but he has no idea what he’s talking about, or how the architecture works. And it’s not Intel RAM.
 

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Here, I'll go slow.
Traditionally, prior Intel PCs/Macs had two main types of memory (not counting PRAM/cache/etc.):

RAM (CPU usage)
VRAM (GPU usage)

M1 (Apple Silicon) Macs have one type of main memory (not counting PRAM/cache/etc.):

RAM (CPU and GPU usage)

Laters...
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Here, I'll go slow.
Traditionally, prior Intel PCs/Macs had two main types of memory (not counting PRAM/cache/etc.):

RAM (CPU usage)
VRAM (GPU usage)

M1 (Apple Silicon) Macs have one type of main memory (not counting PRAM/cache/etc.):

RAM (CPU and GPU usage)

Laters...
And before VRAM, PCs/Macs had one type of memory. None of that means you get double the RAM.
 

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
Here, I'll go slow.
Traditionally, prior Intel PCs/Macs had two main types of memory (not counting PRAM/cache/etc.):

RAM (CPU usage)
VRAM (GPU usage)

M1 (Apple Silicon) Macs have one type of main memory (not counting PRAM/cache/etc.):

RAM (CPU and GPU usage)

Laters...

It’s called unified memory. It also makes it so you can’t run external GPUs on this architecture. Please don’t just ramble off stuff you read online if you don’t actually know how it works. It’s TSMC silicon… lol
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Here, I'll go slow.
Traditionally, prior Intel PCs/Macs had two main types of memory (not counting PRAM/cache/etc.):

RAM (CPU usage)
VRAM (GPU usage)

M1 (Apple Silicon) Macs have one type of main memory (not counting PRAM/cache/etc.):

RAM (CPU and GPU usage)

Laters...
Yes, most people are well aware that for a couple of decades, most personal computers with a powerful GPU have needed a special type of memory for the GPU. (Note: it's not actually main memory; it's special-purpose.)

The question people need you to answer before they can take you seriously is this: How does UMA reduce the need for main (system) memory? I will try to go slow for you...

Let's think about the base models of the last Intel 16" MBP (2019 model year) and the current M1 Pro 16" MBP (2021):

2019: i7-9750H + 16GB DDR4-2666 (41.8 GB/s), Radeon Pro 5300M + 4GB GDDR6 (192 GB/s)
2021: M1 Pro + 16GB LPDDR5 (200 GB/s)

Both have 16GB system RAM. Both actually have UMA GPUs, in fact! The i7-9750H includes an integrated GPU which has unified full-performance access to theoretically all 16GB of system RAM.

So how does any of this add up to 16GB on the M1 Pro being better than the 16GB on the i7 MBP? It's the same amount of memory. The purpose of the extra 4GB attached to the Radeon Pro 5300M is to provide higher bandwidth local memory to the GPU, which is the part of the system which needs bandwidth the most. Having this extra memory on the side doesn't somehow halve the effective capacity of the main 16GB.
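For anyone who wants to verify this themselves, here is a minimal Swift sketch using Metal's standard device queries; on an M1 Pro it should report unified memory and a working set close to system RAM, while a discrete GPU reports a working set roughly the size of its VRAM:

```swift
import Metal

// Query the default GPU. On Apple Silicon, hasUnifiedMemory is true
// (one pool shared by CPU and GPU); on a discrete GPU such as the
// Radeon Pro 5300M it is false, and the recommended working set
// roughly reflects the card's VRAM rather than system RAM.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU:", device.name)
    print("Unified memory:", device.hasUnifiedMemory)
    let gib = Double(device.recommendedMaxWorkingSetSize) / Double(1 << 30)
    print("Recommended max working set:", gib, "GiB")
}
```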
 

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Now let's say you put two machines side by side:

One, a PC using that SILLY NVIDIA card for $10,000 at Newegg, with 32GB of RAM

One, an M1 Pro with 32GB of RAM

You would have:

[Image: AppleVersusIntel.jpg]


n.b. I used the SILLY NVIDIA card as a reference for a 32GB GPU card; I think the most a Titan has is 12GB. I don't want to get into why PC+NVIDIA gaming doesn't/can't use more at this point, but a workstation can if you're working with large graphical data sets that a normal PC gamer just couldn't use... so maybe some other day ;)
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
It’s called unified memory. It also makes it so you can’t run external gpus on this architecture. Please just don’t ramble off stuff you read online, if you don’t actually know how it works. It’s tsmc silicon…. Lol
UMA is not why external GPUs don't work. That's just Apple's choice. I doubt there's anything blocking it at the hardware level; it's just software. They don't want to support drivers for non-Apple GPUs now that they think Apple GPUs are good enough.
 
  • Like
Reactions: MacCheetah3

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Now let's say you put two machines side by side:

One, a PC using that SILLY NVIDIA card for $10,000 at Newegg, with 32GB of RAM

One, an M1 Pro with 32GB of RAM

You would have:

[Image: AppleVersusIntel.jpg]


n.b. I used the SILLY NVIDIA card as a reference for a 32GB GPU card; I think the most a Titan has is 12GB. I don't want to get into why PC+NVIDIA gaming doesn't/can't use more at this point, but a workstation can if you're working with large graphical data sets that a normal PC gamer just couldn't use... so maybe some other day ;)

No, it doesn’t mean that at all. The CPU needs to share that RAM, so in the M1 case you have HALF the RAM of the “silly nvidia card” case. It’s not just the GPU that accesses memory, after all.
 

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
Now let's say you put two machines side by side:

One, a PC using that SILLY NVIDIA card for $10,000 at Newegg, with 32GB of RAM

One, an M1 Pro with 32GB of RAM

You would have:

[Image: AppleVersusIntel.jpg]


n.b. I used the SILLY NVIDIA card as a reference for a 32GB GPU card; I think the most a Titan has is 12GB. I don't want to get into why PC+NVIDIA gaming doesn't/can't use more at this point, but a workstation can if you're working with large graphical data sets that a normal PC gamer just couldn't use... so maybe some other day ;)

So much wrong here. Then why does a 3090 have 24GB of VRAM? I can saturate that whole thing with one game. And they don’t use Titan cards anymore…
 

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Geez, now you want to throw unified memory into the discussion?

Some can't even understand the difference between classic RAM vs VRAM!

I am outta here, better off dead at the CORE...
 

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
UMA is not why external GPUs don't work. That's just Apple's choice. I doubt there's anything blocking it on the hardware level, it's just software - they don't want to support drivers for non-Apple GPUs now that they think Apple GPUs are good enough.

As far as I understand it, they are blocking it at the hardware level. It’s not just software. But regardless, they are focusing on the average consumer and leaving the pros to the PC world. Not that they ever really catered to the pros anyway.
 
  • Sad
Reactions: Shirasaki

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
Geez, now you want to throw unified memory into the discussion?

Some can't even understand the difference between classic RAM vs VRAM!

I am outta here, better off dead at the CORE...

What you started this thread about is unified memory… are you ok?
 
  • Haha
Reactions: Shirasaki

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Right, I design CPUs, but I'm the one that doesn't understand the difference.

I said "some don't understand"; don't take it personally, I respect people.
I am not taking any of this personally; this seems like a discussion.

What you started this thread about is unified memory… are you ok?

I tried to use older memory terms to make things easier to understand,
because PC memory (as far as my knowledge goes) is described using simple terms like RAM and Memory
(instead of, like, SRAM, DRAM, ECC, non-ECC, DDR, DDR2, ... DDR4, etc.).

But whatever, my song is always, "Mister, Mister, Mister, miss the point..."
You can choose to miss the point as much as you want, and pick things apart, instead of trying to understand.

I understand people do that all the time, 50/50: half of people try to understand and half pick apart. It's ALL GOOD...

:p
 

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
I said "some don't understand"; don't take it personally, I respect people.
I am not taking any of this personally; this seems like a discussion.



I tried to use older memory terms to make things easier to understand,
because PC memory (as far as my knowledge goes) is described using simple terms like RAM and Memory
(instead of, like, SRAM, DRAM, ECC, non-ECC, DDR, DDR2, ... DDR4, etc.).

But whatever, my song is always, "Mister, Mister, Mister, miss the point..."
You can choose to miss the point as much as you want, and pick things apart, instead of trying to understand.

I understand people do that all the time, 50/50: half of people try to understand and half pick apart. It's ALL GOOD...

:p

I’ve tried to read your original post like 5 times now… I don’t think there is a point in there TO miss.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Some can't even understand the difference between classic RAM vs VRAM!

Most people here understand this difference. This concept has been common with computers for a long time and it’s really not that complicated.

UMA has also been around for a while; the iPhone uses it, for example. The difference here is that now a high-performance notebook is using it. So many of us here are familiar with both traditional RAM setups and Apple’s UMA.

Nothing you have said means “How Apple M1 RAM = 2x RAM,” and that’s where we are all confused. That statement just isn’t true, and your reasoning for it doesn’t make any sense.
 

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Here, I'll try a different route.

I have an app I created in Xcode that generates 3D graphical image scenes.

I built this app using my Intel MBP 15" (2019) w/16GB of RAM and a Radeon that only has 3GB of VRAM.

When running my app from Xcode, I could only load 3GB of graphical data. That's ALL, 3GB, or else it would CRASH.
The app on Intel was using about 3GB of RAM and 3GB of VRAM; load any MORE into VRAM and the app would crash.

With "Unified Memory":

I now have an M1 Pro MBP 16" (2021) w/32GB of unified RAM.

When running my app from Xcode, I can NOW LOAD 24GB of graphical data and render an insanely complex scene.

In THIS USE CASE, the 32GB RAM on my M1 (2021), compared to the 16GB RAM + 3GB VRAM on my Intel (2019):

Lets me load 8x (8 times) the amount of IMAGE DATA (24GB vs 3GB) that I could with the Intel.

In addition, EVEN IF the Intel MBP had 32GB of RAM, I STILL could only use 3GB of VRAM.

In this case, 32GB of unified RAM versus 16GB of RAM (+3GB of VRAM) means you get 8x the VRAM!
You would need a PC with 32GB of RAM and a GPU with 32GB of VRAM to do what I am doing on my M1 Pro w/32GB of unified RAM.

Maybe clearer?
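A rough sketch of what that looks like in code, for the curious. This is illustrative only, not the actual app: the 24GB figure is the post's number, and a real renderer would check device.maxBufferLength and split the scene across several buffers:

```swift
import Metal

// Illustrative only: one large allocation in shared storage. On Apple
// Silicon, .storageModeShared memory is visible to the CPU and the GPU
// alike, so capacity is bounded by system RAM, not by a VRAM pool.
let device = MTLCreateSystemDefaultDevice()!
let sceneBytes = 24 * 1024 * 1024 * 1024  // hypothetical 24GB scene

// makeBuffer returns nil if the request exceeds device.maxBufferLength,
// which is why a real renderer would split the scene across buffers.
if let scene = device.makeBuffer(length: sceneBytes, options: .storageModeShared) {
    let cpuView = scene.contents()  // the CPU writes geometry/textures here...
    _ = cpuView                     // ...and the GPU reads the very same pages.
}
```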
 
Last edited:

Populus

macrumors 603
Aug 24, 2012
5,944
8,414
Spain, Europe
I’ve tried to understand your first post as well. I personally think that either you’re wrong (or confused) and have a really robust self-confidence, or you’re trolling. I’m not saying you are trolling, mind you, but I think you’re wrong.

If M1 Macs share the same pool of memory between the CPU and the GPU (just like the old Intels with integrated graphics), then the memory the GPU is using is memory the system doesn’t have. It’s not “just double the RAM” but rather the same RAM shared between the GPU and the CPU.

Like others have said, the perceived performance increase comes from the fast NVMe drives that allow super-fast swapping (something that happens a lot on 8GB machines).
 
  • Like
Reactions: rezwits

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Here, I'll try a different route.

I have an app I created in Xcode that generates 3D graphical image scenes.

I built this app using my Intel MBP 15" (2019) w/16GB of RAM and a Radeon that only has 3GB of VRAM.

When running my app from Xcode, I could only load 3GB of graphical data. That's ALL, 3GB, or else it would CRASH.
The app on Intel was using about 3GB of RAM and 3GB of VRAM; load any MORE into VRAM and the app would crash.
This is not how using VRAM on GPUs actually works. When you try to use more VRAM than your discrete GPU has, what happens is something akin to swapping: video drivers will frequently copy data between system memory and GPU memory. This hurts GPU performance, but it does not cause a crash.
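To make that concrete, here is an illustrative Swift sketch of the Metal storage modes involved; the .managed mode is the one where the driver does the copying described above (buffer names and sizes here are arbitrary):

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!

// .private: resident in VRAM on a discrete GPU; fastest for the GPU.
let privateBuf = device.makeBuffer(length: 4096, options: .storageModePrivate)

// .managed (Intel Macs with discrete GPUs): the driver keeps a copy in
// system RAM and a copy in VRAM and synchronizes them as needed. This
// is the copying described above; oversubscribing VRAM degrades
// performance rather than crashing the app.
let managedBuf = device.makeBuffer(length: 4096, options: .storageModeManaged)

// .shared: a single allocation visible to both CPU and GPU. On Apple
// Silicon this is the natural mode, since all memory is unified.
let sharedBuf = device.makeBuffer(length: 4096, options: .storageModeShared)

_ = (privateBuf, managedBuf, sharedBuf)
```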

With "Unified Memory":

I now have an M1 Pro MBP 16" (2021) w/32GB of unified RAM.

When running my app from Xcode, I can NOW LOAD 24GB of graphical data and render an insanely complex scene.

In THIS USE CASE, the 32GB RAM on my M1 (2021), compared to the 16GB RAM + 3GB VRAM on my Intel (2019):

Lets me load 8x (8 times) the amount of IMAGE DATA (24GB vs 3GB) that I could with the Intel.

In addition, EVEN IF the Intel MBP had 32GB of RAM, I STILL could only use 3GB of VRAM.
You don't understand this technology.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Here, I'll try a different route.

I have an app I created in Xcode that generates 3D graphical image scenes.

I built this app using my Intel MBP 15" (2019) w/16GB of RAM and a Radeon that only has 3GB of VRAM.

When running my app from Xcode, I could only load 3GB of graphical data. That's ALL, 3GB, or else it would CRASH.
The app on Intel was using about 3GB of RAM and 3GB of VRAM; load any MORE into VRAM and the app would crash.

With "Unified Memory":

I now have an M1 Pro MBP 16" (2021) w/32GB of unified RAM.

When running my app from Xcode, I can NOW LOAD 24GB of graphical data and render an insanely complex scene.

In THIS USE CASE, the 32GB RAM on my M1 (2021), compared to the 16GB RAM + 3GB VRAM on my Intel (2019):

Lets me load 8x (8 times) the amount of IMAGE DATA (24GB vs 3GB) that I could with the Intel.

In addition, EVEN IF the Intel MBP had 32GB of RAM, I STILL could only use 3GB of VRAM.

In this case, 32GB of unified RAM versus 16GB of RAM (+3GB of VRAM) means you get 8x the VRAM!
You would need a PC with 32GB of RAM and a GPU with 32GB of VRAM to do what I am doing on my M1 Pro w/32GB of unified RAM.

I think most of us can agree that the M1 family of processors is pretty cool. But I would stop trying to explain yourself here, because there are people here who are much more knowledgeable than you on this.
 

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
Here, I'll try a different route.

I have an app I created in Xcode that generates 3D graphical image scenes.

I built this app using my Intel MBP 15" (2019) w/16GB of RAM and a Radeon that only has 3GB of VRAM.

When running my app from Xcode, I could only load 3GB of graphical data. That's ALL, 3GB, or else it would CRASH.
The app on Intel was using about 3GB of RAM and 3GB of VRAM; load any MORE into VRAM and the app would crash.

With "Unified Memory":

I now have an M1 Pro MBP 16" (2021) w/32GB of unified RAM.

When running my app from Xcode, I can NOW LOAD 24GB of graphical data and render an insanely complex scene.

In THIS USE CASE, the 32GB RAM on my M1 (2021), compared to the 16GB RAM + 3GB VRAM on my Intel (2019):

Lets me load 8x (8 times) the amount of IMAGE DATA (24GB vs 3GB) that I could with the Intel.

In addition, EVEN IF the Intel MBP had 32GB of RAM, I STILL could only use 3GB of VRAM.

In this case, 32GB of unified RAM versus 16GB of RAM (+3GB of VRAM) means you get 8x the VRAM!
You would need a PC with 32GB of RAM and a GPU with 32GB of VRAM to do what I am doing on my M1 Pro w/32GB of unified RAM.

Maybe clearer?

Yeah, but if you are doing rendering that requires that much VRAM, you aren’t gonna be using a laptop lol
 
  • Love
Reactions: rezwits

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Yeah, but if you are doing rendering that requires that much VRAM, you aren’t gonna be using a laptop lol
YES, THAT'S the WHOLE great part! I am visualizing and manipulating 24GB of VRAM in REAL TIME, LIVE, on a LAPTOP!!

NO swapping, NO re-rendering, LIVE editing, nothing but RAW UNLIMITED (well, limited to 24GB of GPU VRAM-ish) POWER!!!

Alright, I guess it's hard to understand, or just hard for me to explain... I only understand because

I HAVE the two laptops SIDE by SIDE and can actually WITNESS THIS, and it's ASTONISHING!

n.b. Maybe you could ask: why does one GPU have 2GB and another 8GB? And why would you even need more VRAM (GPU memory)?
IDK, maybe that question will help indirectly...
 
Last edited:
  • Angry
Reactions: Shirasaki

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
As far as I understand it, they are blocking it at the hardware level. It’s not just software. But regardless, they are focusing on the average consumer and leaving the pros to the PC world. Not that they ever really catered to the pros anyway.
No, there's no hardware blocking. How could there be? eGPUs are just PCIe devices. Apple supports generic external PCIe through Thunderbolt, so there's a hardware path by which software (a GPU driver) can communicate with an eGPU. There's nothing special about GPU PCIe traffic; it's all just PCIe MRd and MWr packets, the same as for any other PCIe device.

Apple would've had to build some kind of traffic-analysis hardware to actually block this at the hardware level, and why would they? What would be the point? All they need to do if they want to actively block eGPUs is not write and ship the required software layers. (And as far as motivations go, that's exactly what they'd do if they weren't actively trying to block eGPUs but simply didn't want to put effort into supporting them, so you can't even tell what their intentions are.)

As for consumer vs. pro, arguably they are more focused on pros than before, but they're doing it the Apple way. Which means that when pro software is optimized to fully take advantage of their system architecture, things are great, and when it isn't, not so much.
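A small sketch that illustrates the point, assuming macOS and plain Metal: enumerating GPUs is an ordinary software operation, and on Intel Macs an attached eGPU simply shows up in the list as a removable device:

```swift
import Metal

// MTLCopyAllDevices() lists every GPU macOS has a driver for. On an
// Intel Mac an eGPU appears here with isRemovable == true; on Apple
// Silicon only Apple's own GPU is listed, because no third-party
// driver stack ships. That is a software decision, not a bus block.
for device in MTLCopyAllDevices() {
    print(device.name,
          "| removable (eGPU):", device.isRemovable,
          "| unified memory:", device.hasUnifiedMemory)
}
```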
 
  • Like
Reactions: rezwits

rezwits

macrumors 6502a
Original poster
Jul 10, 2007
838
436
Las Vegas
Here is another example. Why does a top-of-the-line NVIDIA Titan have 12GB of VRAM?

This is because if you have a PC with 16GB of RAM, you can load up a 3D scene in an app or game.
The app and the OS will take up 4GB of RAM at a minimum, leaving 12GB for your 3D data, which you can swap/edit/load into the 12GB of the NVIDIA GPU...

16GB RAM + 12GB VRAM

Hmm, maybe this can help too, IDK.
Why not just put 16GB on the NVIDIA GPU? And why does MORE GPU RAM even matter? Ugh.
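That budget, written out as a quick sketch (the 4GB OS-and-app reserve is this post's assumption, not a fixed rule):

```swift
import Foundation

// Sketch of the memory budget above. The 4GB reserve for the OS and
// the app itself is this post's assumption; real overhead varies.
let totalRAM = ProcessInfo.processInfo.physicalMemory   // bytes
let reserve: UInt64 = 4 * 1024 * 1024 * 1024            // assumed OS + app
let dataBudget = totalRAM > reserve ? totalRAM - reserve : 0
print("Left for 3D data:", dataBudget / (1 << 30), "GiB")  // 12 on a 16GB PC
```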
 

Eric40962005

macrumors newbie
Sep 23, 2021
14
8
Here is another example. Why does a top-of-the-line NVIDIA Titan have 12GB of VRAM?

This is because if you have a PC with 16GB of RAM, you can load up a 3D scene in an app or game.
The app and the OS will take up 4GB of RAM at a minimum, leaving 12GB for your 3D data, which you can swap/edit/load into the 12GB of the NVIDIA GPU...

16GB RAM + 12GB VRAM

Hmm, maybe this can help too, IDK.
Why not just put 16GB on the NVIDIA GPU? And why does MORE GPU RAM even matter? Ugh.

Why do you keep repeating that the top card they have only has 12GB of VRAM? That’s dead wrong. Just stop.
 
  • Like
Reactions: rezwits