
Sami13496

macrumors 6502a
Original poster
Jul 25, 2022
I’ve been wondering how Apple could release such a terrible-looking UI that goes against all good UI and UX design principles, and even looks like something from the early 2000s. It occurred to me that perhaps it was a deliberate move to divert attention away from the problems with their AI efforts. The strategy seems to have worked, since now everyone is only talking about the new design.

I predict that next year, when Apple finally gets its AI ready for release, they’ll also fix the UI appearance in iOS 27.
 
You're overthinking everything. The engineering effort behind Liquid Glass is massive; they can't just roll that out overnight. It's all part of a long-term plan they've had for the UX, and you can see small elements of it that have existed for a few years now.
They already knew a year ago that their AI was failing, and maybe that’s when they started developing this Windows Vista copy.
 
Not even slightly.

The need for a new design paradigm has probably been in the works for a long time. This had to happen now because of the 20th anniversary iPhone and, most importantly, because of the Vision product line. The 20th anniversary iPhone is a year and a half away, so they want as many apps as possible to adopt the new design language before then and for everyone to get used to it (it also gives Apple time to fully perfect it themselves).

We are quickly approaching the point where Apple hardware becomes invisible: either you're looking through the hardware with Vision products, or products like iPhone become bezel-less and converge on the "magic sheet of glass." Both of those future product categories will feel markedly different from current technology, and Apple are intent on revamping their software to complement the new hardware era.

Apple have been super clear in their design guidelines that the point of Liquid Glass is to be a distinct character in the design language ensemble that "sits on top of the content layer" -- in other words, it is NOT for the content layer itself; it should be applied selectively, in moderation, at the most important control points. What they didn't really emphasize out loud is that "the content layer" won't just be apps anymore; it will increasingly be the physical world. The question Liquid Glass answers is: how do you make the perfect AR 'material' that doesn't feel out of place when superimposed onto the real world and/or digital content, and that remains legible when you can't guarantee what the 'content layer' will be (i.e., the real world)?

Liquid Glass then is supposed to be the universal material and signal which tells the user "this is the thing I interact with to make stuff happen" whether it's the send button on their email app or a floating on/off switch that fades into view when you look at your lampshade with Vision AR glasses on 5-10 years from now.

That's why I'm so excited about it, this is the biggest teaser Apple have ever given us about what the next couple decades of their products will feel like.

(also: my bet is that the new lensing effect we see on the iOS 26 lock screen as you slide up to unlock will be uniformly applied around the very edge of the 20th anniversary iPhone's screen, where the bezel currently is on the 16 Pro -- to really reinforce the illusion that this device is a magic lens into the software realm)
 
Early 00s are back in fashion though.
 

This is the most accurate post I've seen about Liquid Glass. It seems a lot of people on this forum forget that Apple has a roadmap years into the future and that they aren't designing the future of iOS for iPhones 11-16. I'm sure there is a version of Liquid Glass running on an iPhone XX prototype that looks amazing, but we aren't going to see the full vision for a couple of years.
 
The decision to create a whole new design paradigm around what is a failure of a product (Vision Pro) is baffling. No one outside of Apple was saying "this looks incredible, make my phone look like this."

And saying it is designed for an upcoming phone doesn't make sense, because why didn't they also add split-screen multitasking to iOS for the future foldable? If we get a foldable in 2026 and a 20th anniversary phone in 2027, they seem to have their priorities backwards. Why do the visual stuff, which is easier, before working on the important stuff that is going to take longer to get right? This doesn't feel forward-thinking.

I don't think it was a distraction. I think Liquid Glass is just as poorly thought out as Apple Intelligence and just further proof that Apple needs new leadership.
 
Apple's leadership has been superb. Perfect, no. But clearly superb in comparison to the rest of big tech.

And AVP is no failure. It is a beautifully done new tech product direction, not a fad to be measured by instant sales like a hula hoop or a pet rock. It is primarily an example of a new tech direction, and as such it excels. AVP has a ways to go with software, as is expected with a new tech direction, but the hardware experience was excellent right out of the gate at v1.0.
 
Do you really think that a trillion-dollar project by a very successful tech company behaves as you suggest?
Well, apologies, but that’s my opinion. You’re of course free to disagree. If you look at the direction AAPL has taken recently due to AI and how many billions they could potentially lose because of it, I believe my theory is quite plausible.
 
You are talking about the AI disaster, but did you notice that they actually delivered some AI with iOS 26?

It's currently hidden in the system (probably because they don't trust it enough yet), but in the Shortcuts app there is actually a way to communicate with Apple's chatbot. And while we are far from GPT-4 level, it actually works well, and it's promising for the future (and for a future LLM-based Siri).

And the new Foundation Models framework seems like a big deal. Developers can access Apple's LLM locally, for free, from any app with a few lines of code. As a developer myself, this seems huge; it could enable a lot of new apps that would have been hard to build before.

While I was watching the WWDC keynote, I honestly thought almost the same thing: "well... nothing AI this year." But after watching the other sessions and playing with the beta myself, I changed my mind, and I feel like they actually woke up. To me it's almost a bigger AI update than iOS 18 was (honestly). Just a thought.
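For anyone curious what "a few lines of code" means here, this is roughly the shape of the Foundation Models API shown at WWDC 2025 -- a minimal sketch, assuming the beta API names (SystemLanguageModel, LanguageModelSession, respond(to:)) stay as announced; details may shift between betas, and it only runs on an Apple Intelligence-capable device on iOS/macOS 26:

```swift
import FoundationModels

@main
struct OnDeviceLLMDemo {
    static func main() async throws {
        // Make sure the on-device model is actually usable
        // (it can be unavailable: unsupported device, model
        // still downloading, Apple Intelligence disabled, etc.)
        guard case .available = SystemLanguageModel.default.availability else {
            print("On-device model unavailable on this device")
            return
        }

        // A session wraps the local model; the instructions
        // string steers its behavior for the whole session.
        let session = LanguageModelSession(
            instructions: "You are a concise assistant inside a notes app."
        )

        // Prompt the model. Everything runs locally:
        // no API key, no network call, no per-token cost.
        let response = try await session.respond(
            to: "Suggest three short titles for a note about WWDC 2025."
        )
        print(response.content)
    }
}
```

That's the whole integration for the basic case, which is why the framework could matter more than it seemed from the keynote alone.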
 
9to5Mac reported on a human evaluation test in which the server model's text responses were rated as good as ChatGPT-4o's, and the local model was rated better than Google's on-device Gemma model.
 
I believe they’re deliberately doing something worse now so that next year they can do something better and the contrast will be greater.
Wouldn’t that have been this year then? I mean, the backlash after Apple Intelligence was pretty bad, and they’re actually even getting sued about some of it.
Why would they get into an entirely new car just to crash it head-on on purpose?
The 26 platforms are the biggest thing since iOS 7 and macOS 11 combined. No way they intended to f up THIS big.
 
It still lacks general knowledge, the ability to save history between conversations, and web search.

That's why it's not at GPT-4 level to me (yet), but we'll get there at some point.
 
Interesting theory.

But in reality:
Something like Liquid Glass would take 3 to 5 years to bring about, so it's long been in the making and was likely planned long ago for a 2025/26 release.
Its timing has nothing to do with the AI mess.
And it was most likely timed to be available and fine-tuned for the iPhone that's being called "the anniversary edition," probably next year.
 
Superb for shareholders maybe. For users, not so much. For developers, not even a little.
I disagree. I have been a Mac user forever and IMO the last decade and a half has been the strongest yet for users.

As to AI, the world is [obviously] in its infancy. Anyone commenting that Apple missed AI fails to grasp the whole concept.
 