> Well, let's just agree to disagree. I don't think my definition of "bursty" workloads will match up with yours then. I've had all of the fanless 12" MacBooks, as I don't doubt you have as well, and both of us are very well aware of the performance level of those machines. They were supposed to be "bursty" as well, but they didn't perform very well.

Uh, this is not just my definition of bursty. If you read technical articles about CPUs, this is exactly what they are talking about. If you make up your own definition, then you will be confused when people talk about the subject.
> Well, they might have said Intel if they meant Intel. Not qualifying their claim should make it inclusive of the best PC chip available.
> I think the numbers may line up fine, as explained above. But we'll have a better idea as other CPU benchmark results become available.
> Anyway, this is just a matter of curiosity, what Apple was really comparing to.

Taken from Apple's site:
Testing conducted by Apple in October 2020 using preproduction 13‑inch MacBook Pro systems with Apple M1 chip and 16GB of RAM. Multithreaded performance measured using select industry‑standard benchmarks. Comparison made against latest‑generation high‑performance notebooks commercially available at the time of testing. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
> A render test, 15 minutes old

This test is not accurate, as the M1 would use the 4 low-power cores to process a 1080p60 file. To see a real difference, it's better to use a 4K 60 file, with effects. That will make the MBP 2014 choke.
> So it looks like it is not only limited for Intel

Why do you conclude it was limited to Intel based on what you quote from Apple?
> Why do you conclude it was limited to Intel based on what you quote from Apple?

But I said NOT?
> Live benchmarking

Guy says his MBA M1 gets way too hot
> I don't think you can say benchmark scores are a matter of percentage... because the higher you scale, the harder it is for you to see gains unless you switch to a whole new architecture. I'd take the absolute score difference, but maybe that's just me. Also, that's not to mention the iPhone 12 Pro does thermal throttle quite often. It heats up very easily... and I encounter this every day because I do video calls with my wife on a regular basis. The phone heats up like its life depends on it. So the chip is clearly not meant to be in this body, or else Apple is so obsessed with looking good in benchmarks that they have scaled the chip too high. It probably doesn't hurt to scale it down a little.

Well yeah, that's another thing. The Intel power figures drastically increase for not much performance. Discrete GPUs too. Apple's chips just seem more efficient overall. Let's say they double the high-performance cores of the current M1. That's a negligible difference compared to the chips in the current 16". Barely registers. A 4x scale-up of GPU cores would be a bigger difference, but we're still way under the current chips. The thing about Apple's architecture is that every bit of it is designed to use as little energy as possible. When doing everyday tasks the battery will be amazing; it will drop when stressing it, for sure, but I don't think it will be nearly the disparity from best to worst case that Intel has. Intel CPUs and AMD GPUs are fundamentally just desktop chips at limited TDPs. Apple is the opposite.
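To put the quoted "absolute vs. percentage" point in concrete terms, here is a tiny sketch with made-up scores (placeholder numbers only, not real benchmark results):

```python
# Placeholder scores, purely illustrative -- not measured results.
def score_gap(a, b):
    """Return the absolute and percentage difference between two benchmark scores."""
    return b - a, (b - a) / a * 100

# The same 200-point gain shrinks as a percentage when the baseline is higher,
# which is why an absolute score difference can be the fairer comparison.
print(score_gap(1000, 1200))  # (200, 20.0) -> +20% at a low baseline
print(score_gap(5000, 5200))  # (200, 4.0)  -> +4%  at a high baseline
```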
And it's unfortunate that Geekbench is the only benchmark we can use to compare the iPhone and MacBook. The Air and the Pro don't look any different in Geekbench either; the difference only shows up once Cinebench comes into play.
But anyways, yes, the current 16" is not great in worst-case. It really can drain in an hour flat. The GPU is not 50W, though. At max load, it's about 55 - 60W (I have the 5500M), and the CPU, while rated at 45W, can draw 120W when Turbo-boosting.
In regular use, it does last about 7-8 hours thanks to the massive battery, but do anything slightly more intensive, like... say... "watching a YouTube video", and that figure drops pretty rapidly.
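For what it's worth, a rough back-of-the-envelope on those runtimes, assuming the 16-inch's 100Wh battery (the wattages are just ballpark figures based on the numbers above, not measurements):

```python
# Rough runtime estimate: battery capacity divided by average system draw.
# 100Wh is the 16-inch MacBook Pro's battery capacity; the draws below are
# ballpark assumptions from the discussion above, not measured values.
BATTERY_WH = 100

def runtime_hours(avg_draw_watts):
    return BATTERY_WH / avg_draw_watts

print(f"{runtime_hours(13):.1f} h")   # ~7.7 h at a light-use average of ~13W
print(f"{runtime_hours(100):.1f} h")  # ~1.0 h with the GPU near 60W and a boosting CPU
```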
So I've had enough of loud fans and short battery run time. But I'm not going to fall into the "fanless" trap of the old days again. Even if the fan in the 13" Pro is useless, at least it'll help when ambient temps get toasty. Last summer in Cali, I learned that no device I own can work at full speed at 110°F. The iPad Pro and iPhone hung on for dear life, and the 16" didn't even bother. I'm hoping the M1 Pro will do better next summer.
> Guy says his MBA M1 gets way too hot

He is doing benchmarking while streaming using OBS on the same machine? If the M1 stays cool without a fan under that load, this is no longer science, but magic.
> But I said NOT?

Sorry, I see what you mean now!
> He is doing benchmarking while streaming using OBS on the same machine? If the M1 stays cool without a fan under that load, this is no longer science, but magic.

It hurts watching this. He doesn't know what he is doing. He runs 3-4 tests at the same time. He doesn't understand the numbers.
BTW, I don't think his benchmarking is representative as OBS is cooking the CPU.
> People are misinterpreting Apple's claim via vagueness, I suspect. Apple claims they have the fastest single-core performance. It's listed right on their product page. They also claim vastly improved multi-core performance. Both of these claims are with respect to "the latest laptop PC CPUs". People latch onto the "fastest" aspect and apply it to multi-core as well, when Apple never made that claim.

Yeah, hard to guess what the benchmarks might have been, and what form they were run in.
We don’t know if this is for all benchmark programs.
> I saw where you called the benchmarks amateurish lol. Give the guy a break. He probably is an amateur! Everyone and their grandma is making a benchmarking vid right now.

I don't mind that, but he didn't listen to comments... but agreed, I was a little harsh. I wasn't irritated that someone who hadn't touched a Mac in 10 years was running 4-5 benchmarks at the same time and didn't understand a bit of what was happening, but that he wasn't listening to the input of users who tried to give him sound and friendly advice.
> I saw where you called the benchmarks amateurish lol. Give the guy a break. He probably is an amateur! Everyone and their grandma is making a benchmarking vid right now.

That's exactly why I'm not watching. I tried watching a couple and they were awful. Either they didn't know what they were doing with the tests or else they didn't know how to hold a frickin' camera, zooming in and out and moving all over the place. Made me want to puke.
> This test is not accurate, as the M1 would use the 4 low-power cores to process a 1080p60 file. To see a real difference, it's better to use a 4K 60 file, with effects. That will make the MBP 2014 choke.

Are you saying 4K 60fps videos are faster to export than a 1080p 60fps video?
> We were originally talking about the claim that the M1 is twice as fast as the latest and greatest PC chip. @M1 Processor thought that claim showed they were comparing to Intel, not AMD. I'm not so sure.

Just one thing: Apple said that it's faster than 98 percent of PC laptops sold in the last year. The actual share of laptops sold now that are more powerful than the M1 will be higher than two percent. So Apple admits that some PC laptop chips are faster. What are the fastest PC laptop chips? The 35-45W Renoir chips. So by Apple's own statement the comparison chip is not Renoir, but some 14nm Intel chip, as the 10nm chips don't reach higher than 28W.
The claim was actually quite vague, as you say: "At just 10 watts (the thermal envelope of a MacBook Air), M1 delivers up to 2x the CPU performance of the PC chip." I think that claim is relative to the power, so up to 2x at a particular power draw, probably 10W. I don't see any reason that couldn't be about the 4800U or a similar chip.
It's the claim that followed that one that I think would be a tougher comparison with a chip like the 4800U: "And M1 can match the peak performance of the PC chip while using just a quarter of the power." That suggests a TDP of maybe 40W at peak power. The footnote explains that multiple benchmarks were used, so I'd guess some kind of composite was used.
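Working that out explicitly (a sketch of the arithmetic, assuming the M1 side of the comparison is the Air's 10W figure, since Apple doesn't say):

```python
# Arithmetic implied by Apple's two claims, assuming the M1 figure is the
# MacBook Air's ~10W thermal envelope (an assumption, not stated by Apple).
m1_power_w = 10

# "...while using just a quarter of the power" -> the PC chip's peak draw
pc_chip_peak_w = m1_power_w * 4   # ~40W, i.e. a 35-45W-class mobile part

# "up to 2x the CPU performance of the PC chip", read as both chips at ~10W,
# would mean roughly double the performance per watt at that point.
perf_ratio_at_10w = 2.0

print(pc_chip_peak_w, perf_ratio_at_10w)   # 40 2.0
```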
Supposing they were comparing to the 4800U operating at 40W, the M1 would likely beat the 4800U by a little in Geekbench. It would lose by a larger margin in Cinebench R23. I don't think either of those tests was available in native form for the M1 in October, so who knows what they used and how. Supposing they used other benchmarks with potentially similarly mixed results, it still seems possible, pending further results, that they were comparing to a chip like the 4800U.
Well, they might have said Intel if they meant Intel. Not qualifying their claim should make it inclusive of the best PC chip available.
I think the numbers may line up fine, as explained above. But we'll have a better idea as other CPU benchmark results become available.
Anyway, this is just a matter of curiosity, what Apple was really comparing to.