There was a post from mlykke earlier in this thread demonstrating exactly that. This graph compares the multi-core performance of MySQL 5.6 with that of the previous version, 5.5. MariaDB is a fork of MySQL.

View attachment 949679

The improvement after the software update shown in the Max Tech video came from the processing being offloaded to the GPU, which is much faster at that type of work. You wouldn't be able to do the same for the CPU, since the raw performance difference is around 9-10% - so that is the highest difference you would see in 100% CPU-bound loads. And you won't even see those 9-10%, because the i9 can't sustain a turbo clock as high as the i7's, meaning it runs with more cores but at a lower speed per core. Unless there is something really broken in Geekbench and similar CPU benchmarks, nothing can overcome the raw possible performance difference of the processor.
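A rough back-of-the-envelope illustration of that cap (the sustained all-core clocks below are assumptions for the sake of the example, not measured values):

```python
# Back-of-the-envelope sketch: ideal multi-core throughput ~ cores x sustained clock.
# The sustained all-core clocks are hypothetical, chosen only for illustration.
i7_cores, i7_sustained_ghz = 8, 4.3
i9_cores, i9_sustained_ghz = 10, 3.8   # assumed lower because of the iMac's thermal limits

i7_throughput = i7_cores * i7_sustained_ghz   # 34.4 "core-GHz"
i9_throughput = i9_cores * i9_sustained_ghz   # 38.0 "core-GHz"
print(f"Ideal multi-core advantage for the i9: {i9_throughput / i7_throughput - 1:.1%}")
# ~10.5% with these assumed clocks; any extra throttling eats directly into that margin.
```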

The MySQL example was not on a system where more cores were running slower. It was on a system where the cores could run at their max potential 24/7, making the only difference the number of cores. The limiting factor on the iMac is usually not the software but the overall raw performance of the i9, which is limited by the cooling system in the iMac.

So instead of "feeling" like it might be better, look at the facts.
 
Instead of making a pointless statement, come with some data to support your point. So far I have a ton of reviews and benchmarks that support my claims.

The people in the i9 camp will never understand. I seriously think that because the number 10 is higher than 8 and the number 9 is higher than 7, no matter what you tell them, show them or prove to them, they will never be convinced. They are willing to pay any amount of money for the highest CPU upgrade, regardless of whether it's a performance increase or not. It's just funny to see how the clueless or uneducated agree that the person who paid for the upgrade "made the right choice" when they don't even know what the right choice is.
 
The people in the i9 camp will never understand. I seriously think that because the number 10 is higher than 8 and the number 9 is higher than 7, no matter what you tell them, show them or prove to them, they will never be convinced. They are willing to pay any amount of money for the highest CPU upgrade, regardless of whether it's a performance increase or not. It's just funny to see how the clueless or uneducated agree that the person who paid for the upgrade "made the right choice" when they don't even know what the right choice is.

I do see a bit of "I feel like the i9 must be better even though the facts say otherwise", but I really don't want it to be an us vs. them thing. I'm just trying to discuss something I find interesting and hoping to give people some insight they might not otherwise have. I'm not trying to be arrogant, but I could easily buy a couple of well-specced Mac Pros with XDR displays without worrying about my budget. I'm the type that really pushes my machine a lot, so if the i9 actually provided any real benefit in the order of a 10-20% performance increase, I would be all over it. But the fact is that the i9 is a waste of money since it doesn't provide any benefit. Sure, there might be some very specific niche situation where it can give you a tiny edge - but so far nobody has given any example of this being the case for the use cases people have mentioned. I would also argue that if that tiny edge in a very specific (so far unknown) example makes a difference to your job or finances, then you would be much better off getting a Mac Pro or similar machine, which would be able to provide a lot more performance.

In the end it's a computer and we're supposed to enjoy them. I just don't want people to waste money on a pointless upgrade. :)
 
  • Like
Reactions: BuCkDoG
I do see a bit of "I feel like the i9 must be better even though the facts say otherwise", but I really don't want it to be an us vs. them thing. I'm just trying to discuss something I find interesting and hoping to give people some insight they might not otherwise have. I'm not trying to be arrogant, but I could easily buy a couple of well-specced Mac Pros with XDR displays without worrying about my budget. I'm the type that really pushes my machine a lot, so if the i9 actually provided any real benefit in the order of a 10-20% performance increase, I would be all over it. But the fact is that the i9 is a waste of money since it doesn't provide any benefit. Sure, there might be some very specific niche situation where it can give you a tiny edge - but so far nobody has given any example of this being the case for the use cases people have mentioned. I would also argue that if that tiny edge in a very specific (so far unknown) example makes a difference to your job or finances, then you would be much better off getting a Mac Pro or similar machine, which would be able to provide a lot more performance.

In the end it's a computer and we're supposed to enjoy them. I just don't want people to waste money on a pointless upgrade. :)
Oh I completely agree with you for sure. I’m not trying to say it’s an us-versus-them camp either, but the people trying to justify the i9 have 0 facts and 0 proof that it’s not only better but worth the upgrade.
 
Instead of making a pointless statement, come with some data to support your point. So far I have a ton of reviews and benchmarks that support my claims.

Oh I completely agree with you for sure. I’m not trying to say it’s an us-versus-them camp either, but the people trying to justify the i9 have 0 facts and 0 proof that it’s not only better but worth the upgrade.

What kind of information would you need to see that you'd consider "facts" and "proof" that would get you to change your mind?

Below is my 10-core i9 utilization (according to Activity Monitor) while I'm working with four instances of a program that are each doing different things. And yes, it's beneficial to my workflow to have four instances at once – they generate unique outputs that are critical to my work. I could probably use around six instances honestly...

View attachment 949891


Hate to burst your bubble, you seem to get off on telling people they're wasting their money...
 
What kind of information would you need to see that you'd consider "facts" and "proof" that would get you to change your mind?

Below is my 10-core i9 utilization (according to Activity Monitor) while I'm working with four instances of a program that are each doing different things. And yes, it's beneficial to my workflow to have four instances at once – they generate unique outputs that are critical to my work. I could probably use around six instances honestly...

View attachment 949891

Hate to burst your bubble, you seem to get off on telling people they're wasting their money...
That's impressive that an iMac can do this today. This was only possible on even more expensive workstations just 2 years ago.
 
What kind of information would you need to see that you'd consider "facts" and "proof" that would get you to change your mind?

Below is my 10-core i9 utilization (according to Activity Monitor) while I'm working with four instances of a program that are each doing different things. And yes, it's beneficial to my workflow to have four instances at once – they generate unique outputs that are critical to my work. I could probably use around six instances honestly...

View attachment 949891

Hate to burst your bubble, you seem to get off on telling people they're wasting their money...
It has nothing to do with “getting off”, but clearly you are just triggered by the truth and the facts: literally 99.99999% of people should not waste their money. Here are my facts and proof from an extremely reliable source. The facts within this video cover the vast majority of users purchasing this computer. Enjoy getting your bubble burst.
 
But people tend to forget that when you actually do video editing, 3D modelling and similar heavy tasks, you spend 95% of your time on editing, finding clips, deciding on the story etc. - all things where the CPU is only used lightly. So you spend 8 hours editing a video, but then people worry about saving 15 seconds on a video export which takes 8 minutes.

That hit me in the gut so hard lol. Who are you to speak such wisdom and truth??
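For what it's worth, the quoted numbers work out like this (figures taken straight from the quote above):

```python
# Quick arithmetic on the quoted figures: 15 seconds saved on an 8-minute export,
# set against an 8-hour editing session.
export_saving_s = 15
export_s = 8 * 60
session_s = 8 * 3600

print(f"{export_saving_s / export_s:.1%} of the export time")        # ~3.1%
print(f"{export_saving_s / session_s:.3%} of the whole working day")  # ~0.052%
```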
 
  • Haha
Reactions: pldelisle
The improvement after the software update shown in the Max Tech video came from the processing being offloaded to the GPU, which is much faster at that type of work. You wouldn't be able to do the same for the CPU, since the raw performance difference is around 9-10% - so that is the highest difference you would see in 100% CPU-bound loads. And you won't even see those 9-10%, because the i9 can't sustain a turbo clock as high as the i7's, meaning it runs with more cores but at a lower speed per core. Unless there is something really broken in Geekbench and similar CPU benchmarks, nothing can overcome the raw possible performance difference of the processor.

The MySQL example was not on a system where more cores were running slower. It was on a system where the cores could run at their max potential 24/7, making the only difference the number of cores. The limiting factor on the iMac is usually not the software but the overall raw performance of the i9, which is limited by the cooling system in the iMac.

So instead of "feeling" like it might be better, look at the facts.

I think you are missing the main point of my repost of the graph that you originally posted, which is the difference in multi-core performance between MySQL 5.6 and (not much) older versions of the same software.

As to your other point, more cores are always going to run somewhat slower than a single core if they are on the same CPU die.
 
What kind of information would you need to see that you'd consider "facts" and "proof" that would get you to change your mind?

Below is my 10-core i9 utilization (according to Activity Monitor) while I'm working with four instances of a program that are each doing different things. And yes, it's beneficial to my workflow to have four instances at once – they generate unique outputs that are critical to my work. I could probably use around six instances honestly...

View attachment 949891

Hate to burst your bubble, you seem to get off on telling people they're wasting their money...

But you're not proving anything. You're just showing that all cores are running at close to max load. But if each individual core overall performs slower than on the i7, which has a higher base clock and maintains a higher boost clock, then the i7 might be able to perform the same tasks equally fast.
So your screenshot shows regular behaviour but says nothing about the overall performance or the speed at which the processes run. Nobody has at any point claimed that you can't max out the cores in the i9 - just that the thermal limitations of the iMac force them to run at a slower overall speed than the i7, which makes the performance more or less identical.

And again, if you enjoy your i9 then that's great. But it still doesn't mean that the i7 couldn't have done the same.
 
  • Like
Reactions: BuCkDoG
I think you are missing the main point of my repost of the graph that you originally posted, which is the difference in multi-core performance between MySQL 5.6 and (not much) older versions of the same software.

As to your other point, more cores are always going to run somewhat slower than a single core if they are on the same CPU die.

I fully got your point. What I'm saying is that hoping to see a similar performance increase from future software updates is not realistic on the 2020 iMac. The overall limitation is not the software but the raw performance of the i9. If the software is the limitation, then at the very best you will see a 9-10% performance increase with a future update, IF the current performance limitation is the result of software not being optimized for 10 cores/20 threads and the load is strictly CPU-bound. This would also require the software companies to see a benefit in optimizing to squeeze out a few more percent on the 10-core machine, which is quite a niche situation for most software. Some high-end software might think along those lines, but most software is more likely to focus on features than on a tiny performance increase.
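To put a number on the "optimized for 10 cores" caveat, here is a tiny Amdahl's-law sketch; the parallel fractions are illustrative assumptions, not measurements of any real application:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the parallel fraction.
def amdahl_speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

for p in (0.50, 0.90, 0.99):
    gain = amdahl_speedup(p, 10) / amdahl_speedup(p, 8) - 1
    print(f"parallel fraction {p:.0%}: going from 8 to 10 cores -> {gain:+.1%}")
# Roughly +2%, +12% and +23% respectively: only a near-perfectly parallel,
# purely CPU-bound load gets anywhere near the naive +25%, even before thermals.
```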

But hey, maybe a Christmas miracle will happen and Apple will release a firmware update that overcomes the iMac's sub-optimal cooling of high-core-count CPUs such as the i9. ;)
 
  • Like
Reactions: pldelisle
Oh I completely agree with you for sure. I’m not trying to say either that it’s an us versus them camp but the people trying to justify the i9 have 0 facts and 0 proof that it’s not only better but worth the upgrade.

You are forgetting that the Geekbench multi-core score is 11% higher for the i9 than the i7. It's less than half of the 25% increase in core count, but it's not insignificant.
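Translating that into per-core terms (simple division of the two figures above, nothing more):

```python
# Implied per-core performance under full multi-core load, from the numbers in this post:
# the 10-core i9 scores ~11% higher in Geekbench multi-core than the 8-core i7.
i9_multi, i7_multi = 1.11, 1.00
per_core_ratio = (i9_multi / 10) / (i7_multi / 8)
print(f"Each fully loaded i9 core ~ {per_core_ratio:.0%} of a fully loaded i7 core")  # ~89%
```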
 
  • Like
Reactions: BuCkDoG
But people tend to forget that when you actually do video editing, 3D modelling and similar heavy tasks, you spend 95% of your time on editing, finding clips, deciding on the story etc. - all things where the CPU is only used lightly. So you spend 8 hours editing a video, but then people worry about saving 15 seconds on a video export which takes 8 minutes.
That hit me in the gut so hard lol. Who are you to speak such wisdom and truth??
This is sooooooooooo true !!!!!!!
 
I fully got your point. What I'm saying is that hoping to see a similar performance increase from future software updates is not realistic on the 2020 iMac. The overall limitation is not the software but the raw performance of the i9. If the software is the limitation, then at the very best you will see a 9-10% performance increase with a future update, IF the current performance limitation is the result of software not being optimized for 10 cores/20 threads and the load is strictly CPU-bound. This would also require the software companies to see a benefit in optimizing to squeeze out a few more percent on the 10-core machine, which is quite a niche situation for most software. Some high-end software might think along those lines, but most software is more likely to focus on features than on a tiny performance increase.

But hey, maybe a Christmas miracle will happen and Apple will release a firmware update that overcomes the iMac's sub-optimal cooling of high-core-count CPUs such as the i9. ;)

I agree that even the best multi-core-optimized CPU load is unlikely to exceed the Geekbench multi-core delta. But 10% is not nothing.
 
But you're not proving anything. You're just showing that all cores are running at close to max load. But if each individual core overall performs slower than on the i7, which has a higher base clock and maintains a higher boost clock, then the i7 might be able to perform the same tasks equally fast.
So your screenshot shows regular behaviour but says nothing about the overall performance or the speed at which the processes run. Nobody has at any point claimed that you can't max out the cores in the i9 - just that the thermal limitations of the iMac force them to run at a slower overall speed than the i7, which makes the performance more or less identical.

And again, if you enjoy your i9 then that's great. But it still doesn't mean that the i7 couldn't have done the same.
While there is some truth in your statement, I don't personally think that a slightly higher frequency (what, 200 MHz more?) on 8 cores would perform any better for this specific kind of workload, since all cores seem to be maxed out the whole time.

Yes, there are software optimizations to be made when more cores are involved. I have a good background in parallel computing. There is an inflection point between core count and speedup. The more you split the workload, the more overhead you have on the OS kernel side and the software side. I have experimented enough with libraries like MPI and CUDA to know that the worst part of parallel computing is the overhead of creating a thread/process and allocating CPU time to it. Inter-process communication (IPC) is also incredibly important. Some workloads just don't parallelize because there would be too much overhead sending/receiving the data to be processed across processes. Threads use shared memory, but the context needs to fit that paradigm too.
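A minimal sketch of that overhead point, assuming a toy workload where the per-item work is far too small to be worth shipping between processes:

```python
# When the per-item work is tiny, process spawning and IPC (pickling data back and
# forth) can cost more than the parallelism saves.
import time
from multiprocessing import Pool

def tiny_task(x: int) -> int:
    return x * x  # almost no work per item, so the overhead dominates

if __name__ == "__main__":
    data = list(range(200_000))

    t0 = time.perf_counter()
    serial = [tiny_task(x) for x in data]
    t1 = time.perf_counter()

    with Pool(processes=10) as pool:
        parallel = pool.map(tiny_task, data)
    t2 = time.perf_counter()

    print(f"serial:     {t1 - t0:.3f}s")
    print(f"10 workers: {t2 - t1:.3f}s  (often slower: the work is too fine-grained)")
```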

For well-optimized workloads, or computations that can be manually distributed over different processes, the more cores the better. For the MapReduce paradigm, the more cores the better (this is what makes big data analytics so fast today). This kind of workload won't gain from 200 MHz more on each core. If you have a million images to preprocess (ImageNet, for instance), preprocessing them on 10 cores will always be faster than on 8 cores at a slightly higher frequency. I don't have any experiment to prove it, because obviously I don't have two iMacs with the i7 and i9 chips, but my feeling is that 2 extra cores at 4 GHz can process a lot more than 2 fewer cores at 200 MHz more.
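And the counterpart sketch for the coarse-grained case (a stand-in CPU-bound loop instead of real image preprocessing), where the overhead is amortized and throughput tracks the worker count, thermals permitting:

```python
# With coarse-grained, CPU-bound chunks (think batches of images), the spawn/IPC
# overhead is amortized and the wall time shrinks roughly with the worker count.
import time
from multiprocessing import Pool

def preprocess_chunk(chunk: range) -> int:
    acc = 0
    for i in chunk:          # stand-in for real per-image preprocessing work
        acc += (i * i) % 1_000_003
    return acc

if __name__ == "__main__":
    chunks = [range(n, n + 200_000) for n in range(0, 8_000_000, 200_000)]  # 40 chunks
    for workers in (1, 8, 10):
        t0 = time.perf_counter()
        with Pool(processes=workers) as pool:
            pool.map(preprocess_chunk, chunks)
        print(f"{workers:2d} workers: {time.perf_counter() - t0:.2f}s")
# On a 10-core machine the 10-worker run should edge out the 8-worker run;
# on fewer physical cores the extra workers just wait their turn.
```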

Benchmarks aren't real life. Benchmarks are synthetic. Real-world use cases always take precedence over synthetic tests.

And even if we aren't talking about distributed/parallel workloads, more cores and threads mean that more programs can get a time slot on the CPU, reducing the overall execution time in a multi-app scenario. 250 processes requiring CPU time on 10 cores will always be served more efficiently than 250 processes requiring CPU time on 8 cores.
 
  • Like
Reactions: mlykke
While there is some truth in your statement, I don't personally think that a slightly higher frequency (what, 200 MHz more?) on 8 cores would perform any better for this specific kind of workload, since all cores seem to be maxed out the whole time.

Yes, there are software optimizations to be made when more cores are involved. I have a good background in parallel computing. There is an inflection point between core count and speedup. The more you split the workload, the more overhead you have on the OS kernel side and the software side. I have experimented enough with libraries like MPI and CUDA to know that the worst part of parallel computing is the overhead of creating a thread/process and allocating CPU time to it. Inter-process communication (IPC) is also incredibly important. Some workloads just don't parallelize because there would be too much overhead sending/receiving the data to be processed across processes. Threads use shared memory, but the context needs to fit that paradigm too.

For well-optimized workloads, or computations that can be manually distributed over different processes, the more cores the better. For the MapReduce paradigm, the more cores the better (this is what makes big data analytics so fast today). This kind of workload won't gain from 200 MHz more on each core. If you have a million images to preprocess (ImageNet, for instance), preprocessing them on 10 cores will always be faster than on 8 cores at a slightly higher frequency.

I agree with most of your points, although I'm still not too convinced about the i9 vs the i7, since we still haven't seen any tests really showing a clear difference between the two. So it's still a theoretical possibility. But it would be awesome to see some real-world examples of this in action.
 
You are forgetting that the Geekbench multi-core score is 11% higher for the i9 than the i7. It's less than half of the 25% increase in core count, but it's not insignificant.
I fully agree, as I’ve seen the benchmarks via MaxTech.
 
So it's still a theoretical possibility. But it would be awesome to see some real-world examples of this in action.
Totally correct.

But none of the guys you see on YouTube can do it, because none of them understand parallel computing principles. They only understand how to click the stupid "BENCHMARK NOW" button. None of them even know what data analytics is or how it works, or how software development works. When you have a test suite of thousands of unit and integration tests to run prior to merging your code into a higher-level branch, having more cores to spread the load across obviously means a lower execution time.
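As a hedged illustration of that last point (the test layout and the run_tests.sh wrapper are made-up placeholders, not anyone's real setup):

```python
# Fan independent test shards out across the available cores; with thousands of
# self-contained tests, wall time drops roughly with the number of workers.
import subprocess
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def run_shard(test_file: str) -> int:
    # Each shard runs in its own process, so shards scale with core count as long
    # as they don't fight over shared resources (databases, ports, temp dirs, ...).
    return subprocess.run(["./run_tests.sh", test_file]).returncode

if __name__ == "__main__":
    shards = sorted(str(p) for p in Path("tests").glob("test_*.py"))
    with ProcessPoolExecutor(max_workers=10) as pool:  # 10 on the i9, 8 on the i7
        failed = sum(1 for rc in pool.map(run_shard, shards) if rc != 0)
    print(f"{len(shards)} shards run, {failed} failed")
```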
 
It has nothing to do with “getting off”, but clearly you are just triggered by the truth and the facts: literally 99.99999% of people should not waste their money. Here are my facts and proof from an extremely reliable source. The facts within this video cover the vast majority of users purchasing this computer. Enjoy getting your bubble burst.

I don't disagree that in 99%+ of cases the 10c i9 is not worth the $400 upgrade (and is often slower in single-threaded tasks).

But if each individual core overall performs slower than on the i7, which has a higher base clock and maintains a higher boost clock, then the i7 might be able to perform the same tasks equally fast.

So your screenshot shows regular behaviour but says nothing about the overall performance or the speed at which the processes run. Nobody has at any point claimed that you can't max out the cores in the i9 - just that the thermal limitations of the iMac force them to run at a slower overall speed than the i7, which makes the performance more or less identical.

I unfortunately don't have the exact machine with only the CPU swapped for the 8c i7 to compare this to, but here is more of the same work with Intel Power Gadget running to show some additional info (avg. clock speed of ~4.2 GHz)... And I don't know enough about CPUs to know what “equivalent clock speed” the 8c would have to maintain in order to match this performance for the processes I'm running. Plus, I don't know what else I don't know...


[Attachment: Intel Power Gadget screenshot]
 
It has nothing to do with “getting off”, but clearly you are just triggered by the truth and the facts: literally 99.99999% of people should not waste their money. Here are my facts and proof from an extremely reliable source. The facts within this video cover the vast majority of users purchasing this computer. Enjoy getting your bubble burst.

I don’t think that video actually proves very much. There are a bunch of benchmarks where the i9 machine is significantly faster than the i7, but those results don’t support his point of view, so he just dismisses them as “oh, that’s the GPU doing that” - which is probably true, but he has not proven it. Until somebody tests an i9 against an i7 with the same GPU, the results are inconclusive.
I have also wondered if using the i9 with a lower-end GPU such as the 5500 XT would give it more room to breathe within the overall power and thermal budget of the machine for CPU-intensive tasks. But that would require testing 2 machines with the same CPU and different GPUs.
For the record, I bought the i7 because I had a budget to stick to and didn’t think the i9 was worth the extra money for me. I just don’t think Max Tech’s testing methodology is good enough to hold up as definitive proof in this case.
 
Look, for me it’s not actually a settled issue. Max Tech’s testing is absolutely helpful, but it isn’t definitive or comprehensive. The Xcode test he’s using isn’t as indicative as, say, building Firefox or something from source. And it’s very possible that those real-world tests would still show little to no difference — but what we’ve got right now isn’t even a proper A/B test, because the GPUs in the machines are different.
I’m most interested in virtualization and containerization loads. If I could get even one more VM from the i9 versus the i7, yeah, for me it’s worth it. But there haven’t been any tests to show that. I’d be happy to try to create a testing scenario if someone with an i7/5700XT/128GB of RAM wants to waste a few hours trying to answer that comparatively.

I will say I don’t think harping on someone who already made a purchase to try to make them feel bad for getting what they got is helpful. And I’m not talking about me — I’ve wasted $400 on far dumber things than a useless iMac upgrade. I spent $200 on multiple copies of the same vinyl record (color exclusives are a bitch) just this morning. I’m very comfortable with my life choices and how I spend my money, and I don’t need validation or affirmation that I’ve spent it the right way. But it’s not helpful, and it comes across as belittling and, frankly, rude to try to make people feel bad for how they’ve spent their money on something they have saved for and are excited about. You can say you don’t think it’s worth the extra money without being a jerk about it.
 