"No" right back atchya. They have not moved an inch forward when it comes to facilitating mass-adaptation.
Mass-adoption of operating environments typically comes about in a few known ways.
One, it happens through incredible budgeting on promotion (like, idk, licensing a major rock band’s song for a multi-million-dollar ad campaign to effect said ends), saturation-based distribution, and borderline monopolization of the market by making crucial software components dependent on the operating environment. Microsoft did that, and Google have also done that with their bread-and-butter advertising engine.
Another way is for an everyday apparatus — like hardware — to work its way into the daily lives of most in a slow, steady, organic (or in some commercial cases, coerced) progression. Take, for example, the familiarity of the Macintosh, as many folks had their first hands-on experience with a computer through their school’s computer lab during childhood. By the time OS X appeared, the Macintosh was 16 years old. That’s an entire generation coming of age around it. (Were I in the next room, I’d be replying here on a 16-year-old patched MacBook Pro — a laptop which has managed to find its way into my daily life over the span of, well, 16 years.) Adoption of OS X was built upon the organic awareness, adoption, and acceptance of Macintosh hardware and the user experience of Mac OS and Gossamer systems.
And a third way: rely, at least in part, on the previous two by burying the arcane, intimidating aspects of the operating system — the under-the-bonnet stuff — in a manner mostly shielded from the end user.
In other words, make it an appliance or an embedded system (like a TV or an automobile). Android OS and iOS/iPadOS are two approaches to this basic idea — the one built entirely atop Linux, the other atop Darwin/BSD, respectively.
In turn, there’s a boomerang effect: the development community which made Android possible adopts some of that commercial platform’s user-friendly, commercially funded UI/UX components into its own desktop environments, making for a more thorough, familiar user experience that appeals to a wider spread of users. (Several Linux flavours strive for this with their default DE picks and branding.)
Or, alternatively, aspects of the descendant UI/UX get pushed back upstream to the ancestor through a kind of reverse-adoption, even if doing so conflicts with the parent environment’s core mandate and use-purpose. [See: the iOS’ification of macOS, fka. Mac OS X, post-Snow Leopard, to make using a Mac feel less like using a Mac and more like using an iPhone. Or, if one can imagine this thought exercise: pull a person who was using a Macintosh in 1985 to the year 2010, through the creative licence of a time machine, and sit them in front of a Mac Pro running Snow Leopard, versus putting them in front of a Mac Studio in 2024 running Sequoia.]