but they never got FireWire and Thunderbolt to take off, for instance. But the point is that Apple dared to try at a time when every other company was content to simply reinforce the status quo, and I respect Apple for that.
Yes, plenty of people still use USB-A peripherals, from flash drives to wired mice to wireless presentation remotes, because they still work. But then you think back to how long it took conference rooms to transition from VGA projectors to HDMI TVs, and it makes you wonder: where do you draw the line between continuing to support an existing technology and dropping it in favour of pushing for something newer and better?
Instead of running HDMI cables, why not set up Apple TVs or Miracast dongles in meeting rooms so people can project their screens wirelessly? USB-C drives do exist (I have made it a point to invest exclusively in them for a while now, and just purchased another Samsung T7). I use a Microsoft Bluetooth mouse that connects directly to my laptop, though plenty of my colleagues still use mice with receivers. The whole point of USB-C is that it is supposedly versatile enough to go all-in on, yet the new MBP got an HDMI port back for meeting purposes.
I am currently using an M1 MBA with two USB-C ports, and it's frankly a luxury after years of working with an iPad with a single Lightning port and optimising my workflow around wireless tech and minimal cables. Maybe I am the outlier here in that when Apple asks me to jump, I ask "how far", but this also goes back a decade to that fateful day when I decided I would embrace the Apple ecosystem in its entirety and not fight whatever changes Apple implemented, for better or for worse.
I don't have an answer to any of this. Without anyone pushing, it may well be that a decade from now we will still be using USB-A accessories, and maybe there's nothing wrong with that, but at what cost to progress?