Just give us a MacBook iPad already, with both OS UI layouts: once it’s connected to the Magic Keyboard it switches to the macOS layout and drops its iPadOS layout.
In reply to: "Adding a mouse and keyboard to a system with large enough buttons to work with touch doesn’t require any UI changes at all. Adding touch to a system with tiny buttons that aren’t sized for touch is different. You’d have to make a bunch of UI alterations to make touch work well, such as increasing button sizes and padding, hiding more options in drop-downs and popups, etc. By the time you resize and rework the UI to actually make use of touch, you’d essentially have iPadOS. Again, iPadOS can easily accommodate trackpad and keyboard support because no UI changes need to be made for it to work well. With touch, you need to resize UI elements, add more padding around buttons, etc. to make it a worthwhile feature and a reasonable experience, and by the time you make all of those modifications, you end up with essentially iPadOS. Just look at Windows: every button is bigger to try to accommodate touch, and it wastes more screen real estate, hides more options in drop-downs, and can wind up looking rather goofy on a big monitor."

Adding touch to a MacBook does not assume that touch becomes the primary or only interaction method. It just becomes an additional method for cases where touch is easier or more natural. Hitting a button with touch is easy and we are used to doing it. Selecting specific menu items is probably something you would do with the trackpad or keyboard. Scrolling often feels better with touch. Gorilla arm is not a problem because you are never doing touch exclusively for extended periods of time.
But in order for it to be a sensible addition, the system has to be optimized for it. Look at all the changes Microsoft made to Windows to make it “touch optimized”. Modern Windows makes far less efficient use of screen real estate than macOS, or even than older Windows versions did, all because it needs to add touch affordances. They’re trying to strike a balance, but it looks a lot more like iPadOS than macOS in terms of sizing and hiding of advanced options. macOS should be allowed to be what it’s supposed to be, a system for devices with precise input methods, and iPadOS can be improved to share more of macOS’s functionality with all the touch optimizations you want. iPadOS could essentially become the touch-optimized version of macOS. And I think that’s kind of what’s happening.
On a system like an iPad with keyboard and mouse/trackpad I will switch between interaction methods frequently depending on what I am doing. In addition, this can reduce the risk of RSI as it means you are not always doing the same movements.
Also, if they could add support for the Pencil, it would give yet another optional input mode.
It doesn't have to be fully touch-optimized to be useful when touch is not the sole interaction mode. Users can switch modes as they interact, depending partly on the size of the target. I'll often use the Pencil for links when browsing on an iPad and use either my thumb or my mouse for scrolling.
In reply to: "The macOS UI would likely be changed to optimize it for touch. This would result in a ton of compromises that currently don’t have to be made."

Most of the UI changes are simply the size of buttons. Resolution scaling is already a thing. Bringing this to certain apps and optimizing it is not a difficult task. A simple button click is all that is needed to turn touch/no-touch mode on or off.
Window sizing buttons would have to be enlarged, with more padding in between, or would have to be replaced with a three-dot button like on iPadOS. Again, adding trackpad and keyboard support to an already touch-optimized system doesn’t really require any changes; adding touch interaction to a system that isn’t designed for it requires a lot of fundamental UI changes, and it would end up looking essentially like iPadOS.
The size of buttons is kind of a big deal. It’s not just a little change, as you make it seem. Look at Windows: everything is oversized, and it’s far less efficient with screen real estate.
In reply to: "I don’t see this happening. I could see the iPad Pro running macOS before this would ever happen. Maybe these rumors are getting mixed up — touchscreen Mac, much more expensive iPad — same product?"

That makes a lot more sense. A MacPad, if you will. And I would absolutely buy one.
I'm saying that all it takes is a touch-screen mode and a non-touch-screen mode that changes the size of the buttons. If you don't have it on, the buttons will be small; if you have it on, the buttons will be big. Some apps may need to make adjustments to accommodate this, but it's not a complex problem that should run into problems.
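To make that concrete, here is a minimal SwiftUI sketch of what such a toggle could look like: the same controls, just with larger targets, spacing, and padding when touch is on. The touchModeEnabled setting, the button names, and the specific sizes are all made up for this sketch; nothing here is an existing macOS feature.

import SwiftUI

// A hypothetical per-app "touch mode" toggle: same controls, larger targets
// and spacing when touch is enabled. Purely illustrative.
struct ToolbarButtons: View {
    // Hypothetical user setting; the key name is invented for this example.
    @AppStorage("touchModeEnabled") private var touchModeEnabled = false

    var body: some View {
        HStack(spacing: touchModeEnabled ? 16 : 8) {
            Button("Open") { print("open") }
            Button("Save") { print("save") }
            Button("Share") { print("share") }
        }
        // Roughly 44pt targets when touch is on (the size Apple's HIG suggests
        // for fingers) vs. compact cursor-sized controls when it is off.
        .controlSize(touchModeEnabled ? .large : .small)
        .frame(minHeight: touchModeEnabled ? 44 : 28)
        .padding(touchModeEnabled ? 12 : 6)
    }
}

Note that a toggle like this only scales sizes and spacing; it doesn't rearrange or hide anything, which is exactly the part the next reply argues is the hard bit.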
And how do you know it’s “not a complex problem that should run into problems”? There are a lot of potential issues you can run into when switching between UI element “modes” like that. It’s not the easy solution you make it sound like, not to mention the confusion created for users by having buttons resizing and rearranging themselves, and options hiding in drop-down menus in “touch mode”. This “touch mode” thing is Windows thinking, and it sucks…
It is not a problem unless somebody hardcoded an app to run at a very specific resolution that cannot be adapted. Resolution scaling is already a thing on both Mac and Windows. Changing the size of buttons is not a complex task unless you are running a unique, niche program that was designed by a student for a college project.
It’s a problem because options would have to be resized and rearranged to accommodate touch interaction. It’s not as simple as resolution scaling; scaling with no changes to the proportions of UI elements would be problematic, and parts of the UI would end up expanding beyond the screen.
You are right there, but I don’t agree that you have to redesign everything. I would never expect a touchscreen Mac to be a touch-first interface. Touch doesn’t need to be your only option for interaction. I’ve used Windows laptops with touch and I would frequently switch between touch, trackpad, and keyboard depending on the task. It’s just one more tool in the kit.
The traffic light windowing buttons are part of the heart and soul of macOS. They’re one of those distinctive elements that make macOS, well, macOS. A departure from that would be sad, I think.
As for the window stop-light buttons, I would love to see those redesigned anyway. Their functions make little sense and the colors don’t correspond to anything. It just seems like someone thought they looked a bit like sideways stoplights and went with that for no good reason. The Windows icons make more visual sense.
I agree that Microsoft took the wrong path when they tried to make their interface all things to all people. But if Apple does ever add touch to the Mac, I expect it'll also come with Pencil support, and it'll probably be just one laptop model, like a "MacBook Studio" (I suspect it would look like a thinner MS Surface Laptop Studio), plus one touch-enabled Studio Display with some funky easel-like stand. And they'd pitch these things toward developers of iOS and iPadOS apps, as well as illustrators and other artists. With this approach they could leave the macOS interface well enough alone and offer additional capabilities to a select interested audience. Or maybe they've just decided that that select audience isn't big enough to merit the effort.
And in the case of Windows, they absolutely have redesigned the whole system to try to accommodate touch since they adopted touch panels, and it doesn’t use space nearly as efficiently as even older versions of Windows did. Microsoft had to change their UI to optimize for touch even though it’s only used optionally; I don’t see how macOS wouldn’t have to be changed to accommodate touch as well.
I’ll grant you that it may not need to be quite on the level of iPadOS’s touch optimization, but it would still require dramatic changes to the UI and system that I think could make it less efficient with screen use and lose a lot of the elements that make macOS macOS and give it its charm.
Experience has shown this not to be true. When touch becomes an optional mode of input and control, everything starts being catered to it.
But just as Windows has proven, as soon as you add touch interaction to the system, things in the system will begin to be catered to it. People will complain that they have to use the stylus. Even if only third-party software begins changing its app UIs to accommodate touch, you’re bound to wind up in Windows’ predicament, where everything becomes more “touch-first” optimized, with cursor input as an afterthought. I think the better solution is to just continue to improve iPadOS; it’s already optimized for touch interaction. If both shared more of a common software base and shared more features, I think that would be the ideal solution, because then people could choose between a desktop OS or touch-optimized iPadOS. They could even make a hybrid clamshell device running iPadOS like you’re talking about.
Interface changes aren't a force of nature. Microsoft made the bad decision to just bolt touch onto their mouse-and-keyboard-centric OS. Apple steadfastly refused to make the iPad just a Mac with direct touch forced upon it, and by the time they added mouse and trackpad support, all iPad apps were properly made as touch-first. If they ever do add touch to the Mac, I don't think they're suddenly going to emulate Microsoft's terrible ideas.
The only way touch interaction would make sense on macOS is if they did add some level of touch optimization; otherwise it’s useless and adds extra cost for very limited use. At minimum you’d have to have some basic UI features like the traffic light buttons changed to accommodate touch, which would be a downgrade IMHO. And inevitably, whether Apple made any optimizations for touch or not, third-party software would. There’s already the threat of this happening since iPad apps can run on macOS, and many developers have started to make one app for both, which is generally good for the iPad; currently, most of the Mac versions of these apps are still somewhat more optimized for keyboard and mouse input. But add a touchscreen to the Mac, and why would developers take the extra time to make a UI for their app specifically geared around precise input methods when they could just reuse the same UI (to accommodate the new touch functionality on Macs) and bolt on keyboard and mouse interactions the way they are on iPadOS?
In reply to: "My shoulder muscles ache just thinking about having to reach to touch the screen of a laptop. No thank you."

Have you ever used an iPad in a keyboard holder? Similar experience, though you might not use touch as much on a touchscreen Mac. Don't make the mistake of assuming that because a device has touch available you are obliged to use it exclusively. It becomes just one more interaction method, like the trackpad and the keyboard. Some actions, like tapping buttons or scrolling the screen, feel right with touch; others, like editing text, work better with a keyboard. You soon start to switch between interaction methods dynamically. One benefit is less likelihood of RSI, as you are not repeating the same movements quite as much since you have options.
Could be something similar to HTML Responsive Web Design.
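In that spirit, here is a minimal, purely illustrative SwiftUI sketch of the hybrid idea from the opening post: swap layouts when a hardware keyboard attaches. GCKeyboard and its connect/disconnect notifications are real GameController APIs; DesktopStyleLayout, TouchStyleLayout, and the overall behavior are assumptions for the sake of the example, since no such hybrid device or OS behavior actually exists.

import SwiftUI
import Combine
import GameController

// Placeholder layouts standing in for the two UI variants.
struct DesktopStyleLayout: View {
    var body: some View { Text("Compact, pointer-first layout") }
}

struct TouchStyleLayout: View {
    var body: some View { Text("Spaced-out, touch-first layout") }
}

// Chooses a layout based on whether a hardware keyboard is attached,
// and switches when one connects or disconnects.
struct AdaptiveRootView: View {
    @State private var hasHardwareKeyboard = GCKeyboard.coalesced != nil

    var body: some View {
        Group {
            if hasHardwareKeyboard {
                DesktopStyleLayout()
            } else {
                TouchStyleLayout()
            }
        }
        .onReceive(NotificationCenter.default.publisher(for: .GCKeyboardDidConnect)) { _ in
            hasHardwareKeyboard = true
        }
        .onReceive(NotificationCenter.default.publisher(for: .GCKeyboardDidDisconnect)) { _ in
            hasHardwareKeyboard = false
        }
    }
}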