My experience as a tech support person suggests that most Windows users can drive a Mac, but original Mac users somehow get lost in Windows when switched - Need I say more?
And it never once occurred to you that that's because Windows is so flipping obtuse for no appreciable reason, and thus a life-long Mac user might be confused as to why nothing on it makes any logical sense?
So to put it bluntly, if you want to learn how to use any computer properly, get a PC first.
I'm confused by the word "properly" there. A proper computer, in my book, wouldn't be obtuse. And yes, I did learn Windows before OS X, and a couple of Amigas before that, and a C64 before that, and a VIC-20 before that. Windows just plain sucked up until XP SP2 (Win98 and earlier were horrible, just horrible, crashing all the time for no discernible reason). XP SP2 and SP3 sucked "less" but still sucked for usability. Yeah, it had more games and gaming support. Just when I thought Windows couldn't possibly suck any harder, Windows Vista came out and proved me wrong.
Windows 7 was a huge improvement, but honestly, I've never enjoyed having to run malware checkers 24/7. They have a way of eating up CPU and especially hard drive performance right when you least want them to; I had to disable scans on my Windows game machine for that reason. Yet you're tempting fate if you don't run one. I would never feel truly safe doing banking and shopping on a Windows machine.
Windows 8's default interface was Vista all over again. Yes, Classic Shell restored functionality, but since when should you have to run third-party software to fix an operating system's basic behavior? Windows 10 is either 8's salvation or a hybrid monster behemoth, depending on how you look at it (WTF wants Metro apps on the Start menu, period? They make it enormous). And now Microsoft is collecting keylogging data and other things even if you turn off as many privacy-invading options as possible. That concerns me even more.
At least Windows gives you an understanding of how a computer works - I feel some flaming coming on, ha ha.
Windows 95/98/Me and DOS gave you an understanding of how some of the hardware add-ons interfaced back then, but I'm not seeing how Windows XP and beyond give you any idea of how a computer "works". Take assembly language programming (I've had two classes in it, as part of digital systems and interfacing hardware to the bus) if you want a real idea of how a computer works at the most fundamental level. An operating system only tells you, at most, how it does things. I don't call the Registry an education in anything but mind-numbing awfulness of design.
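Just to give a taste of what I mean by "fundamental" - here's a toy sketch of my own (assumes an x86 machine with GCC or Clang, nothing from any actual coursework), doing an addition with one explicit processor instruction instead of leaving everything to the compiler:

    #include <stdio.h>

    int main(void) {
        int a = 2, b = 3;
        int sum = a;  /* the compiler keeps this value in a CPU register */

        /* One x86 "addl" instruction: add b into the register holding sum.
           This is the level an assembly class works at - registers and
           individual instructions, not windows and menus. */
        __asm__("addl %1, %0" : "+r"(sum) : "r"(b));

        printf("%d + %d = %d\n", a, b, sum);
        return 0;
    }

Nothing about the Registry or the Start menu teaches you that an add is a single instruction operating on a register; a class like that (or even a bit of inline assembly) does.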
Linux can give you an idea of how things work at a more fundamental level, but even it has been moving away from requiring you to learn how to compile your own software and so on.
Yep, Linux is easier to use, I agree - I cannot argue with that. It's my preferred OS.
I've used Linux on and off for over two decades, and while it has evolved quite a LOT over that time, I have yet to find a "flavor" that can update itself to a newer version as consistently and flawlessly as OS X has managed. Repositories are truly mind-numbingly awful: you are entirely dependent on others keeping applications up to date, there's always lag, and setting most flavors up to even get at the things you actually want (e.g. DVD decoders) means doing even more by hand, since those are copyright no-nos for companies distributing Linux. Your "alternative" is to compile things yourself (not all that much fun over time). So unless something has radically changed in the past three years, I could/would never say Linux is easier to use unless you just want a web browser, and even then it might not stay up to date. And if you need Flash or something, forget about it - Linux support was dumped some time ago.