Feel free to share the test data.

In which case you won't mind showing the working out, as my teachers used to say...

So let’s see the numbers.

Performing the same mouse-heavy task on both operating systems for any sustained length of time led to a consistent 20-40% difference in the amount of time required to complete it. And this was after a couple of practice rounds to adjust to the different acceleration curves.
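To spell out the arithmetic behind a figure like that: time the same task on each OS over several trials and compare the means. A trivial sketch in Python (the times below are made-up placeholders, not my actual measurements):

```python
from statistics import mean

# Hypothetical per-trial completion times for the same task, in seconds.
windows_times = [42.0, 44.5, 41.2]
macos_times = [55.3, 58.1, 54.0]

# Extra time required on macOS relative to the Windows baseline
slowdown = (mean(macos_times) - mean(windows_times)) / mean(windows_times)
print(f"{slowdown:.0%} more time required")  # ~31% with these placeholder numbers
```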

An even better test, I later discovered, is the classic.mouseaccuracy.com web-based test. With a good mouse (or even a trackpad), a score of 17 is doable in Windows. Good luck hitting that in OSX though. You can even try it yourself... unless you're too scared to know the objective truth.

This is a thread about Apple's 2005 announcement of the transition to Intel CPUs. Your grievances over pointer acceleration are topical here how exactly?

I believe it started on page 2, arising out of a discussion of computer productivity.

Are you arguing that there is a paradigm around pointer acceleration which should be quashed forthwith from pointer-based UIs? I can’t really come to another conclusion here, based on your persistent thesis.

Sigh. No. I never said that. Nobody did. Good acceleration is essential. Bad acceleration is counterproductive. No acceleration is even worse. I don't know how it is possible to be more clear.

This sounds like something particular to you that you've decided affects everyone. I'm reminded of the old saying that a poor craftsman blames the tools.

Does a poor craftsman blame a manual screwdriver for being slower than a power driver? No. In the same way, this is simply an example of a less efficient tool. Yes, it does work, and a good job can still be done with it. That job just takes longer to do.

...

I'll also go out on a limb and say that even if you, individually, can show a difference, that's not sufficient to make the sweeping generalizations you have made. All that means is that Windows mouse acceleration works better for you. If a handful, dozens, hundreds of users in controlled tests reach the same result, you would have a basis to make the claim of it being universally better.
...

It's glaringly and objectively obvious. I've already explained the test. It's replicable by anyone with a non-1:1 pointing device (that is to say, anything that isn't a touchscreen). I've shown the practical data and the theory behind it, but nobody has taken me up on presenting anything but conjecture and anecdotes to the contrary.

...
As to your archived link: first of all, I'm pretty sure that "Ballistics" is just the term that MS uses, or at least once used, for their mouse acceleration algorithms. At least that's how I read that page. Second, is that your support for saying there's "serious research" behind it? I'm not doubting that MS spent a lot of time developing it and didn't just do it blindly, but that page just describes how it works, not the WHY of it working the way it does. I'm sure the "why" information exists, but understandably Microsoft probably doesn't want the details publicly known (although I'd be really interested in seeing them). Do you really think Apple hasn't done UI studies to arrive at their current algorithms?

I think the "why" is that it allows users to operate pointing devices more precisely. It's really not more complicated than that. To answer your question, no, I don't think that Apple "studied" anything to arrive at what they have right now. They plucked some non-copyright-infringing default out of the bowls of the 1990s and torturously kept it alive.
 
I believe it started on page 2, arising out of a discussion of computer productivity.

Then it was an aside, not central to the discussion topic. I haven’t found any standouts on page 2, but that’s not really the point.


Sigh. No. I never said that. Nobody did. Good acceleration is essential. Bad acceleration is counterproductive. No acceleration is even worse. I don't know how it is possible to be more clear.

“Good” and “bad”, as you should already know, are subjective calls, not benchmarks. They cannot be quantified. And as many others aside from myself have observed here, what’s “good” for one person may be entirely “bad” for another.

This is not a hill upon which one should find it worth dying.
 
So no documented test then - just anecdotal observations?

It was very well documented and repeatable by anyone. Judging by the fact that nobody has been willing to show any data to the contrary (in this thread or any other, on any forum, in the 2+ decades that this has been a thing), I'm kind of thinking that it's more than "just anecdotal observations."

“Good” and “bad”, as you should already know, are subjective calls, not benchmarks. They cannot be quantified. And as many others aside from myself have observed here, what’s “good” for one person may be entirely “bad” for another.

You might be forgetting, however, that, in this context, they most definitely can be quantified and objectively measured. Almost anyone can do it, even without a lab coat 😉. I would say that when the difference is as significant as it is, you can call it what you want, but "good" and "bad" are not inaccurate descriptors.

This is not a hill upon which one should find it worth dying.

Until such a time as we find input devices that negate the need for mice, pointing sticks, trackpads, trackballs, and other such instruments, I would emphatically disagree.
 
It was very well documented and repeatable by anyone. Judging by the fact that nobody has been willing to show any data to the contrary (in this thread or any other, on any forum, in the 2+ decades that this has been a thing), I'm kind of thinking that it's more than "just anecdotal observations."

I’m sorry. As a rhetorical device, that’s not how an appeal to data works.

If one brings up conclusions drawn from data, then one must also supply the data to back one's thesis.

One does not state a thesis and then demand that others supply data to refute it.

You might be forgetting, however, that, in this context, they most definitely can be quantified and objectively measured. Almost anyone can do it, even without a lab coat 😉. I would say that when the difference is as significant as it is, you can call it what you want, but "good" and "bad" are not inaccurate descriptors.

Then I’m sure you can deliver on that past, established test data, as this is the thesis you — not others — presented.

Otherwise, we’re all gently trying to give you an out from this hole inside which you’re digging: let it go.
 
If one brings up conclusions drawn from data, then one must also supply the data to back one's thesis.
I did. It's hard to argue with a consistently and objectively replicable 20-40% difference. And while others (like the person who invented SmoothMouse, a utility whose primary purpose is to simulate the Windows mouse acceleration curve on OSX) may not have used such a precise measurement, the fact that wildly popular, widely used software existed to fill this void doesn't disprove it either.

Otherwise, we’re all gently trying to give you an out from this hole inside which you’re digging: let it go.
I'll present the same opportunity to you, especially as the self-proclaimed "B S Magnet." You and everyone else who has tried to challenge me on this point have failed to refute the hard facts of the matter, so I'll give you an opportunity to opt out of the hole that you've already dug.
 

Where? (and nope, one reference to a Microsoft-dot-com page, captured by Web Archive, fails to count as independent, peer-reviewable data).

It's hard to argue with a consistently and objectively replicable 20-40% difference.

You may want to supply from where you sourced those figures.


And while others (like the person who invented SmoothMouse, a utility whose primary purpose is to simulate the Windows mouse acceleration curve on OSX) may not have used such a precise measurement, the fact that wildly popular, widely used software existed to fill this void doesn't disprove it either.

Making an unfalsifiable point undermines your thesis.


I'll present the same opportunity to you, especially as the self-proclaimed "B S Magnet." You and everyone else who has tried to challenge me on this point have failed to refute the hard facts of the matter, so I'll give you an opportunity to opt out of the hole that you've already dug.

I invite you to read my bio to confirm what “B S Magnet” means. Cheers.

Also, hey: I think you dropped this — 🅻 — just after you sent your last post.

The content of your next reply determines whether I’ll continue to interact with you.
 
It was very well documented and repeatable by anyone.
You've just lost any shred of credibility in this discussion. This is a polite and considered forum where people can talk and share opinions, whereas you're just driving the bulldozer of "I'm always right" - despite having nothing to show in terms of facts and figures.
 
You feel like you got 'hoodwinked' because you only received two major OS releases. However, you have to understand that the release cycle for macOS changed. When I worked for Apple, typically, we did a release every 2-3 years, and that changed with the release of iOS. Another issue to understand is that hardware refreshes were also stagnant for long stretches. The G4 had been out since 1999, and the G5 didn’t hit until 2003. Fast forward, and you have Intel chips coming out year after year, so with the change in hardware revisions came a change in software release cycle. When I left the company in December of 2006, having been there since 1996, a lot had changed. I don’t think users got hoodwinked, but that’s my opinion. I certainly see other viewpoints, but the software industry in the early 2000s was a vastly different world than it is today.
 
And as someone who believes everyone deserves a voice, I agree with @Dronecatcher: you absolutely lost any shred of respect from me as well… sorry! This is a place where ideas should be what drives discussions and creativity… it's not a place for someone who wants to argue and put down ideas…
 
You feel like you got 'hoodwinked' because you only received two major OS releases. However, you have to understand that the release cycle for macOS changed. When I worked for Apple, typically, we did a release every 2-3 years, and that changed with the release of iOS.

It changed when Bernard Serlet was replaced by Craig Federighi in 2011, a few months before the once-per-annum major-update cadence began with Lion. Serlet’s approach was to iron out bugs and optimize the OS until it was stable and reliable; Federighi’s was to adopt a rolling release pattern, whether or not the outgoing OS build was in tip-top shape. Consequently, consumers got Lion, Yosemite, Catalina, and Ventura.

iOS, a discrete OS once removed and caged down from developer builds of Leopard, launched in early 2007, when Tiger was still the current Mac OS X build.

Another issue to understand is that hardware refreshes were also stagnant for long stretches. The G4 had been out since 1999, and the G5 didn’t hit until 2003. Fast forward, and you have Intel chips coming out year after year, so with the change in hardware revisions came a change in software release cycle.

I wish that were the case, but Intel’s Core iX series ran on Macs released between 2009 and 2020 — eleven years. It’s still shipping in non-Mac PCs in 2024. Core 2 Duo with 64-bit EFI — Penryn — found use for almost four years (the C2D MacBook remained on sale until 2012).


When I left the company in December of 2006, having been there since 1996, a lot had changed. I don’t think users got hoodwinked, but that’s my opinion. I certainly see other viewpoints, but the software industry in the early 2000s was a vastly different world than it is today.

That’s true. It is different today.
 
Bernard, LOL, you mean Bertrand, and he wasn't replaced by Craig, but by Scott Forstall, whom I actually worked with at NeXT when the companies merged. Those were fun times! One of the best things about working at NeXT was the ability to talk directly to the person in charge of the department instead of having to go through multiple committees. NeXT, and then Apple, is to this day the largest company that manages itself like a startup... I miss the environment!
 
Bernard, LOL, you mean Bertrand,

Yes, Bertrand. I’ve had a very long day.

and he wasn't replaced by Craig, but by Scott Forstall, whom I actually worked with at NeXT when the companies merged.

Just so we’re on the same page, I’m referring to the SVP of Mac Software Engineering — not the overall software SVP role (in which, as you noted, Federighi succeeded Forstall in 2012). Federighi indeed replaced Bertrand Serlet in March 2011 as overseer of the Mac/OS X portfolio.

Those were fun times!

I bet they were! :)

One of the best things about working at NeXT was the ability to talk directly to the person in charge of the department instead of having to go through multiple committees. NeXT, and then Apple, is to this day the largest company that manages itself like a startup... I miss the environment!

Sounds like it was a convivial working environment.
 
Sounds like it was a convivial working environment.
Avie is who I reported to until I left. My group was in charge of pro-software, aka Final Cut Studio, and the Video side. I wasn’t in a leadership role, but most of my team came from key grip, and some from the Macromedia group after they decided to stick with web. Apple bought their video group, which would become Final Cut Pro. If anyone ever gets an opportunity to work for Apple, I would recommend it – the best career decision I ever made.
 
I wish that were the case, but Intel’s Core iX series ran on Macs released between 2009 and 2020 — eleven years.
The iX has had thirteen generations (and several microarchitectures), with a new gen coming out every year or two. Despite a gen 2 Sandy Bridge and a gen 13 Raptor Lake both being called, say, i7, there’s a difference between those two buggers.

Core 2 Duo with 64-bit EFI — Penryn — found use for almost four years (the C2D MacBook remained on sale until 2012).
Apple chose to stick with it for the MB (until 2012), MBA and mini (both until 2011). Many others had moved on to iX by 2010.
 
You may want to supply from where you sourced those figures.
Like I said, it is independently measurable. I measured it myself (20-40% difference in time spent on identical tasks, or using classic.mouseaccuracy.com, if you want something more scientific). It's yours for the debunking.

Making an unfalsifiable point undermines your thesis.
It's not unfalsifiable at all. You or anyone are free to test for yourself and prove me wrong. The fact that you haven't shown your results indicates that you either already have and are afraid to admit that I was right, or that you're afraid that I will be right when you do.

The content of your next reply determines whether I’ll continue to interact with you.
Oh, how nice of you😆. The point stands whether you "continue to interact" or not, so I'll let the historical record of this thread speak for itself.

You've just lost any shred of credibility in this discussion. This is a polite and considered forum where people can talk and share opinions, whereas you're just driving the bulldozer of "I'm always right" - despite having nothing to show in terms of facts and figures.
That's certainly not my attitude. I've demonstrated the theory, the science behind it, and my easily repeatable results. Nobody has yet shown any hard numbers to disprove them, and the fact that I'm encouraging this open invitation for folks to do so would seem to strongly disprove the "I'm always right" theory. What the peanut gallery thinks of my credibility for standing up to the mob means little to me.
 
Circling back to the thread question:

I had spent thousands on a decked out G5 in 2004 and then a new PowerBook G4 in 2005, so I was pretty deep in on PowerPC hardware at the time. The announcement to switch to Intel almost seemed unbelievable at first... like a bit of a joke after all the years of the Apple camp’s Intel bashing.

I was on the fence (bordering non-acceptance) for some time. By late 2006 my mother was looking to buy a new Mac, so I helped her get started with the first-gen Core Duo polycarbonate MacBook, which was nothing but trouble. The MacBook had major overheating issues and was returned to Apple for a logic board replacement after months of intermittent shutting off and multiple Apple Service attempts. The replacement board did resolve the issue... but it was a pretty shaky start into Intel hardware for me to observe and to provide support for...

Needless to say I was not impressed with Apple’s Intel switch at that time. After a few lengthy years of sticking it out with my PowerBook G4 and Power Mac G5, I took the plunge and bought a new late-2008 aluminum MacBook Unibody.

This was a game changer for me. I loved the performance (at least 4x faster than the PBG4), I loved the form factor, the brightness of the LED backlight, the response of the keyboard, the solid glass trackpad, and multitouch gestures. I loved that I could add a Boot Camp partition and run Windows XP blazingly fast.

It was a delight to work on compared to the G4 and even outperformed my G5 for my work in Photoshop, InDesign, etc. and music production in Ableton Live and Pro Tools LE.

I have never let go of the PowerPCs - that original G5 is still running perfectly fine today, 20 years later! I love all the PowerPCs I have, including a recently resurrected Beige G3, but I also feel the same about my Intel Macs (of which I probably have well over 50). So not hoodwinked, just slow to accept.

Looking back on the past few years, I could say the same about Apple Silicon. I wasn’t thrilled at first but now I think it’s pretty cool that the Mac OS has been portable enough to traverse (at least) three major architectural changes.

I see it as the 3rd party developer’s responsibility to maintain software support as much as possible for the outdated architectures, even beyond when Apple officially drop support, but I know not all Devs would agree with me on that...
 
I've demonstrated the theory, the science behind it, and my easily repeatable results.
But you haven't. You've recalled in your experience a 20-40% increase in speed doing tasks. You haven't specified what tasks, what application(s), what OS or what hardware.

Even referring to your linked whac-a-mole online test is meaningless without any data relating to the hardware used.

I did the test and my results were abysmal - but without being able to boot my iMac into Windows my poor score means nothing.
 
I was on the fence (bordering non-acceptance) for some time. By late 2006 my mother was looking to buy a new Mac, so I helped her get started with the first-gen Core Duo polycarbonate MacBook, which was nothing but trouble. The MacBook had major overheating issues and was returned to Apple for a logic board replacement after months of intermittent shutting off and multiple Apple Service attempts. The replacement board did resolve the issue... but it was a pretty shaky start into Intel hardware for me to observe and to provide support for...
I think some might actually have forgotten how the Intel era began so shakily. The first generation MacBook Pros - the (2006, I believe) models built on the G4 15 and 17 inch design - were prone to severe heating and failure, which was a real irony considering that Apple's argument for the Intel transition was that they'd reached the end of the practicable thermal performance of the PPC architecture.

I still have a 15 and 17 inch first-gen MBP, and they both get too hot to use on the lap - and the 17-inch crashes and freezes if used for more than about 30 minutes due to heat.
 
I think some might actually have forgotten how the Intel era began so shakily.
It was a bold business move - “if you can’t beat ‘em, join ‘em”. A gamble which had a rough start but paid off in a huge way.

Imagine how things could have turned out if Apple insisted on sticking with PowerPC for the sake of refusing to change... would they still be around today? The best thing they have done (for innovation) is to finally bring it all in-house but that has its cons in breaking industry standards and becoming a closed loop of its own.

Apple consumers really are a fickle bunch :oops:

:apple:
 
I think by the point the thermal and performance limitations of the PPC architecture were becoming obvious - which I suspect was well before the Intel transition took place - Apple were getting pretty desperate for a solution. They'd run into heat-related failures trying to push hardware and performance, and the only way to avoid needing a PPC alternative was to step well outside their physical design language.

Doing this with the G5 desktop wasn't so bad, but I can't see how they could have avoided a switch to Intel for laptops. Even so, while my G5 DP 2.0 wiped the floor with the Intel-based PCs we also had for video editing work, the first MBP was pretty woeful in performance and still got hot.

We do forget though that systems engineering at that point in time was fairly poor in general, and I think the switch to Intel was in part at least due to a need for a predictable architecture roadmap around which product planning/design/implementation could then be mapped out too. As such, I doubt Apple would still be around if they hadn't switched. Or at least not in the same form it is.

A lot of users complained a great deal, because yes, Apple users are a fickle lot indeed. But the really cool thing often not considered much is that the MacOS and application landscape remained almost entirely stable across that transition and beyond, so that those of us who saw the platform as simply a tool to get a job done were not disrupted at all. That seems to me to be a big success story - and one largely repeated with the AS transition too.
 
I think some might actually have forgotten how the Intel era began so shakily. The first generation MacBook Pros - the (2006, I believe) models built on the G4 15 and 17 inch design - were prone to severe heating and failure,
Memory activated - I had one of the 15" first gen MBPs as I mentioned earlier in the thread ... and as you talk about it I recall that it was too hot directly on the lap (especially if playing games) but if I had a blanket on my lap it would mess with airflow and make it worse! So I used my wife's old wood-top lap desk as a buffer. I never had failures, but definitely know people who did.
 
Memory activated - I had one of the 15" first gen MBPs as I mentioned earlier in the thread ... and as you talk about it I recall that it was too hot directly on the lap (especially if playing games) but if I had a blanket on my lap it would mess with airflow and make it worse! So I used my wife's old wood-top lap desk as a buffer. I never had failures, but definitely know people who did.
Not to prolong my contributions unnecessarily, but as said elsewhere, we tend to forget that system engineering was pretty bad at that point, so it was far from unusual to find 'laptop coolers' for sale and in use. Certainly not just for Macs.

When I think how my M1 MBA stays almost entirely cold in use, I can't help but realize how times have changed!
 
It was a bold business move - “if you can’t beat ‘em, join ‘em”. A gamble which had a rough start but paid off in a huge way.

Imagine how things could have turned out if Apple insisted on sticking with PowerPC for the sake of refusing to change... would they still be around today? The best thing they have done (for innovation) is to finally bring it all in-house but that has its cons in breaking industry standards and becoming a closed loop of its own.

I do wonder this, and I think Apple would still be around today, as would Macs — namely, because iPod was still a strong revenue stream, and iPhone, by then under planning, if not early development, was in the pipeline. This alone would have avoided much of the software re-working for little-endian MIPS processing, maintaining RISC, big-endian code across the PPC Macs and ARM handhelds.

Moreover, PA Semi were already up and coming with their multi-core chip design for portable applications, and that would have relieved some of the engineering issues IBM and Freescale were facing.

And had Apple actually and truly been all-in, commitment-wise, on the AIM alliance they helped to forge during the Sculley era (which they weren’t, and probably hadn’t been since 1999), then this would have prompted better collaboration between IBM and Motorola/Freescale in chip design and deployment as the PPC750 was being superseded. But seeing as Apple had dropped their end of the alliance by picking one partner or the other to design an entire generation of CPUs — Motorola for the G4, IBM for the G5 — by 2004–05 this had yielded two very different PowerPC-based evolutionary branches which had no chance of being dovetailed back together.

So in that sense, I do conclude Apple doomed themselves to being thrown a lifesaver ring by Intel as a contingency for having weakened the AIM alliance by picking favourites as the G3 was being superseded. I do put this one squarely on Jobs’… capriciousness: whereas others might call this shrewdness, on this point, I don’t. Intel was his own CYA for being capricious on that picking-favourites move years earlier.
 
Not to prolong my contributions unnecessarily, but as said elsewhere, we tend to forget that system engineering was pretty bad at that point, so it was far from unusual to find 'laptop coolers' for sale and in use. Certainly not just for Macs.

So true! My 2005 laptop purchase (I was alternating years between Mac & PC laptop upgrades, 2000s were very expensive for me even with decent resale) was the Dell XPS M170 gaming laptop ... and that had a fan blowing out the BOTTOM ... which meant you couldn't put it on your lap, couldn't block that fan, had to be careful what surface you put it on, and so on.
When I think how my M1 MBA stays almost entirely cold in use, I can't help but realize how times have changed!
Same - I now have the M2 15" and even playing games it never really heats up. Amazing.
 