
Zendokan

macrumors 6502
Feb 17, 2011
324
172
Belgium
...

The list I gave was by no means a comprehensive list of technologies that couldn't be "future proofed" for, and while everything you wrote was opinion dressed as fact, I'll address two that I know to be factually incorrect for my usage:

Wrong, everything I wrote was based on my own personal experience and the modus operandi of my colleagues through the years working in professional environments (headquarters of several major Banks, Social Security Institutes, Insurance Companies, Federal Institutes, Public Transport, Energy Providers and 1 small company specialized in mobile app development for third parties).

Your two examples below qualify more as "everything you wrote was opinion dressed as fact"

"Unlocking your professional Mac with another electronic device is frowned upon" So how do you unlock your Mac? A keyboard is an electronic device (wired or wireless) that can be remotely monitored via a keylogger or interception of electrical signals. If someone is in the same office, it's very easy to glance over and see what the user is typing when they log in or unlock their Mac.

Unlocking my iMac with my Apple Watch has been a perfectly acceptable usage in my business, and more secure (and convenient) than typing in a password on a keyboard. When I have a sudden idea, it's nice to unlock the Mac without breaking my train of thought (even briefly) to type in a password. I'm looking forward to the added security and convenience of Touch ID, which is secure enough to authorize financial transactions.

Keyloggers are installed on the PC or Mac, not on the keyboard itself.
Also, the majority of keyboards I've encountered that were provided by a company for their employees are still wired.

"Sidecar...really? Maybe for a consumer or a prosumer when they are working in a hotel room" I guess you've never heard of a Wacom tablet, or seen the price tag. Sidecar lets photo editors run the full version of Photoshop on the Mac and use the Apple Pencil to retouch images on the iPad's screen, or create an image from scratch. Working for hours is much nicer when you can sit back in your chair and work naturally, as with paper and pencil or canvas and brush.

Again, the idea of "future proofing" assumes you know everything that's coming to the Mac and to your own workflow, and ignores the potential for drastic performance gains for the M series within the next couple of years.

Professionals who need a Wacom tablet buy one (if they are freelancers) or get one provided by their employer.
It's the same with software. I've never seen a full-time developer use VSCode as their primary IDE; they would buy or get a license for Visual Studio. Nor use Notepad++ and KDiff3 for XML manipulation, because there would be XMLSpy licenses available.

Professionals want tried-and-true solutions (hardware and software). Whether you are an individual (a freelancer working at BYOD companies) or an employee, the hardware and software really don't cost much at all compared with a person's daily rate.
It takes me personally about a week to earn back my MBP and legal software licenses, which is nothing compared to the hours I use them.




No, like I said before, future proofing means that I know there are rules to be followed concerning architecture within one type of ecosystem, so software (the OS and applications) written for the M5 will still run on a powerful M1.

At this moment we don't have enough data to know whether the release cadence of newer Apple Silicon processors will be 12 months or 18 months. So your M1 could last 6 years or 9 years before you need to buy the M6.
 

portland-dude

macrumors regular
Mar 16, 2021
119
177
I'm actually saving money to get an M1 Max 16" with 32GB of RAM and 1TB of storage, just for web browsing and watching Twitch.

Why, you may ask?

Like I said, future proofing. I'm sick and tired of buying these 200-euro laptops that last 2 months; I've said to myself "never again". My next laptop will be pricey but worth it.
So...you've been had by marketing? There is no such thing as future proofing. You think Apple will magically support your device longer because you spent more on it? Nope.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
It's well known that at the beginning of a console's lifecycle it is more powerful than commercially available, 'normal' computers. That was true for the PS2, and it's still true today.
That's not the case. Historically, consoles, when they are first released, have been equivalent in performance to midrange gaming PCs. What they provide vs. those gaming PC's is thus not more power, but lower cost.

For instance, when the Xbox Series X came out in 2020, PC World found they needed $1500 to build a PC with equivalent gaming performance (see link). That's a lot more than the $500 Xbox, of course, but $1500 still certainly qualifies as a "normal" computer.

To put it another way: When consoles are first released, they are typically more powerful than low-end gaming PCs, but less powerful than high-end ones.

 

AltecX

macrumors 6502a
Oct 28, 2016
550
1,391
Philly
Wrong, everything I wrote was based on my own personal experience and the modus operandi of my colleagues through the years working in professional environments (headquarters of several major Banks, Social Security Institutes, Insurance Companies, Federal Institutes, Public Transport, Energy Providers and 1 small company specialized in mobile app development for third parties).
No, you're wrong, that STILL just makes it your opinion based on your experience. It doesn't make it fact.

My friends and I all find Seinfeld to be vastly overrated and a horrible show. That does NOT make it FACT just because my environment reinforces my opinion; others outside my circle have had different experiences with it.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
There is no such thing as future proofing. You think Apple will magically support your device longer because you spent more on it? Nope.
While nothing is future-proof, you can make it ... future-resistant ;).

Typically you can run a supported OS (one with security updates) on the higher-end models for about 7-11 years after they're released (my 2014 MBP's support ends with Catalina, which is supported through fall 2022; and the late 2013 Mac Pro runs Monterey, supported through 2024). It is somewhat less for the lower-end models—so you actually do get longer support if you pay more.

But the latter isn't the most significant factor. Rather, what's important is that the lower-end models may have trouble with the newer OS's (and with newer software) even before support runs out. So if you'd like your machine to be more future-resistant, and to work well for several years, a higher-end model (depending on your use case) may be the better choice.
 

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
That's not the case. Historically, consoles, when they are first released, have been equivalent in performance to midrange gaming PCs. What they provide vs. those gaming PC's is thus not more power, but lower cost.

For instance, when the Xbox Series X came out in 2020, PC World found they needed $1500 to build a PC with equivalent gaming performance (see link). That's a lot more than the $500 Xbox, of course, but $1500 still certainly qualifies as a "normal" computer.

To put it another way: When consoles are first released, they are typically more powerful than low-end gaming PCs, but less powerful than high-end ones.


That hasn’t always been the case. If you take a slightly longer view of history, you’ll find I am right.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
That hasn’t always been the case. If you take a slightly longer view of history, you’ll find I am right.
I was taking a long view. This has been the case for quite some time.

Let's go back nearly two decades, to the November 2006 release of the PS3. Here's an article from IGN published in April 2007 (5 months after the PS3's release). They compared Oblivion running on the Xbox 360 (released in Nov. 2005), the PS3, and a gaming PC, and found that the PC was superior in the graphics category:

"Even adding the PS3 version to the mix the PC is still an easy winner in this category. Take a look at our HD video of Shivering Isles to see why. You have the option to crank the PC resolution up to 1600 by 1200 if your rig supports these settings. For the purposes of our side-by-side video we ran both games in 1280 x 720. At first glance it can be difficult to figure out which is which, but examine some still frames and you'll see that the PC has slightly more detail in the background of the outdoor environments."


Next we look at the Xbox One and PS4, both released in November 2013. Here's a Nov 2013 article (i.e., from the very same month those consoles were released) from Forbes clearly explaining that a good gaming PC is faster:

"However, while the new consoles are significant leaps ahead in terms of raw processing power and design from their predecessors, it may come as a surprise that PCs are already much more powerful and capable of producing even better graphics in games."


This is supported by these synthetic benchmarks from 2014:

Next we look at the XBox X and PS5, which were both released in Nov 2020. According to this Dec. 2020 article from Fossbytes:

"Although PS5 and Xbox Series X have been advertised as next-gen consoles, they do not run at a native 4K resolution. They use resolution upscaling to achieve such a high resolution. Meanwhile, a PC with RTX 3070 can deliver close to 60 to 100 FPS at true 4K resolution. According to Digital Foundry, the Xbox Series X maintains a dynamic resolution targeting 60 FPS while maintaining a minimum of 1440p resolution. The resolution might touch 4k in some instances, but folks at the Digital Foundry refused to label it as an absolute certainty."


In summary, your statement that "It's well known that at the beginning of a console's lifecycle it is more powerful than commercially available, 'normal' computers. That was true for the PS2, and it's still true today." doesn't hold up. It might have been true for the PS2 (that was so long ago I had trouble finding a clear comparison), but it certainly wasn't the case for anything after that. Not for gen 3. Not for gen 4. And not for the current gen. And, again, note that I took pains to find comparisons done right after the respective consoles were released, to give the consoles the greatest possible chance of besting a gaming PC.
 

jollydogfellow

macrumors member
May 8, 2022
30
42
I literally have no idea what you guys are talking about, but no power is too much power, unless we're talking about the Weimar Republic.

That hasn’t always been the case. If you take a slightly longer view of history, you’ll find I am right.

Eh, the N64 was, after all, a gimped SGI Indy; its CPU was not that different from MIPS's early-'90s offerings, which were already past their prime in the workstation market.

The Sony Playstation definitely wasn't "more powerful" than a high end workstation of the time (such as SGI's very own offerings).

The brilliance of both designs -- especially the PS1 -- was that they had the right stuff for games, which are a specialized application.

In general, however, I agree that consoles are often sold below cost for at least the first year of their lifespan, which allows the manufacturer to put expensive components in, knowing that the price will come down quickly.

Conversely, by the end of the PS1's lifespan Sony was basically printing money with every unit sold, with the R&D and the tooling already paid for.

While we are at it -- this must have occurred to Apple when they chose to control their silicon's lifecycle.
 

jollydogfellow

macrumors member
May 8, 2022
30
42
"Sidecar...really? Maybe for a consumer or a prosumer when they are working in a hotelroom" I guess you've never heard of a Wacom tablet, or seen the price tag. Sidecar lets photo editors run the full version of Photoshop on the Mac and use the Apple Pencil to retouch images on the iPad's screen, or create an image from scratch. Working for hours is much nicer when you can sit back in your chair and work naturally as with a paper and pencil or canvas and a brush.

Can I just say that people in general seem to be underestimating the sheer convenience of the macOS-iOS integration features -- particularly the ever-present "Apple is more expensive for no reason" crowd?
Handoff, Sidecar, you name it -- in my opinion, those are the killer applications.
The moment you buy an iPad, you'll want your next computer to be a Mac.
Halo effect squared.
 

StudioMacs

macrumors 65816
Apr 7, 2022
1,133
2,270
Can I just say that people in general seem to be underestimating the sheer convenience of the macOS-iOS integration features -- particularly the ever-present "Apple is more expensive for no reason" crowd?
Handoff, Sidecar, you name it -- in my opinion, those are the killer applications.
The moment you buy an iPad, you'll want your next computer to be a Mac.
Halo effect squared.
Once Apple adds a new feature that requires Bluetooth 5.2 or Wi-Fi 6E, they will limit support to Mac Studio 2024 or newer. Spending more for future proofing doesn’t make sense. At best we wind up with a slower processor than if we had spent half as much twice as often. At worst, we miss out on some of the most compelling Apple ecosystem features.
 

dandeco

macrumors 65816
Dec 5, 2008
1,253
1,050
Brockton, MA
It's also future proofing, which is never a bad idea.
Yeah, this is why I am actually considering getting a Mac Studio with the M1 Max chip, configured with a 1 TB SSD, to replace my 2012 quad-core i7 Mac Mini with 16 GB of RAM. Of course, I'm going to get it with the default 32 GB of unified memory, as that would be just suitable for my tastes. Also, there's a chance I may soon start shooting more 4K videos (my iPhone SE is capable of shooting it) for certain subjects.
 

jollydogfellow

macrumors member
May 8, 2022
30
42
Once Apple adds a new feature that requires Bluetooth 5.2 or Wi-Fi 6E, they will limit support to Mac Studio 2024 or newer. Spending more for future proofing doesn’t make sense. At best we wind up with a slower processor than if we had spent half as much twice as often. At worst, we miss out on some of the most compelling Apple ecosystem features.

Sorry, I quite literally don't understand.
What's your point?
Are you saying that the integration features are not useful and/or not worth the entry price?

Maybe, but subjectively, to me, they are.
In fact, I'm happy with my ancient MBA that doesn't support Sidecar because it still does a crapton of things with my iPad.
 

darngooddesign

macrumors P6
Jul 4, 2007
18,362
10,114
Atlanta, GA
Once Apple adds a new feature that requires Bluetooth 5.2 or Wi-Fi 6E, they will limit support to Mac Studio 2024 or newer. Spending more for future proofing doesn’t make sense. At best we wind up with a slower processor than if we had spent half as much twice as often. At worst, we miss out on some of the most compelling Apple ecosystem features.
This argument really depends on how significant the new feature is, what the future-proofing upgrade is, how expensive the future-proofing upgrade is, how much faster that next computer is, and how much you can get by selling/trading-in your computer.
 

StudioMacs

macrumors 65816
Apr 7, 2022
1,133
2,270
Sorry, I quite literally don't understand.
What's your point?
Are you saying that the integration features are not useful and/or not worth the entry price?

Maybe, but subjectively, to me, they are.
In fact, I'm happy with my ancient MBA that doesn't support Sidecar because it still does a crapton of things with my iPad.
I'm saying that "future proofing" requires that you know what the future will bring, otherwise you might miss out on great features that are independent of the processor.
 

StudioMacs

macrumors 65816
Apr 7, 2022
1,133
2,270
This argument really depends on how significant the new feature is, what the future-proofing upgrade is, how expensive the future-proofing upgrade is, how much faster that next computer is, and how much you can get by selling/trading-in your computer.
I agree, but usually you get more value from selling/trading a newer computer.
It's a better value to buy what you need now and then sell/trade to buy what you need in 2 or 3 years – instead of spending almost as much now on upgrades you think might future proof the machine for 6 years. That ignores every other advancement that will happen between now and then.
 

Zendokan

macrumors 6502
Feb 17, 2011
324
172
Belgium
No, you're wrong, that STILL just makes it your opinion based on your experience. It doesn't make it fact.

Myself and all of my friends find Seinfeld to be vastly overrated and a horrible show. That does NOT make it FACT just because my environment re-enforces my opinion, as others, outside of my circles have had other experiences with it.

Where did I say that what I stated were facts?

The text you replied to was my response to StudioMacs's accusation that, in a previous exchange between the two of us, I had presented my personal experience as fact, which I didn't, because I had stated that it was my personal experience and that of my colleagues.
He then counter-refuted only two of my four points, based solely on his own personal experience.

My opinion is my opinion and it is only based on my experience working (migration projects, department creation and policy setting) for the IT and Governance departments of:
- Major Banks that have a total market share of 78% in Brussels/Belgium and 34% of Western Europe
- Insurance Companies with a market share of 67% in Brussels/Belgium and 41% of Western Europe and 26% of Mainland Europe
- Social Security Institutes with a market share of 72% in Belgium
- Several Belgian Government Departments (Pension, Finance, Social Security, Defense)
- Public Transport (100% of trains, 65% of buses) operating in Belgium and with cooperation contracts with several other European countries
- Energy Providers: 84% in Belgium, 58% in Western Europe
For each of those companies I needed to follow European Union guidelines and European business guidelines, and (to a very small extent) help create policies and guidelines for usage within the EU and mainland Europe.

As I have always stated, it is my opinion based on my personal experience.

BTW: Seinfeld was a horrible show, but since this statement is solely based on my own experience it is still a (valid) opinion.
 

StudioMacs

macrumors 65816
Apr 7, 2022
1,133
2,270
- Major Banks that have a total market share of 78% in Brussels/Belgium and 34% of Western Europe
- Insurance Companies with a market share of 67% in Brussels/Belgium and 41% of Western Europe and 26% of Mainland Europe
- Social Security Institutes with a market share of 72% in Belgium
- Several Belgian Government Departments (Pension, Finance, Social Security, Defense)
- Public Transport (100% of trains, 65% of buses) operating in Belgium and with cooperation contracts with several other European countries
- Energy Providers: 84% in Belgium, 58% in Western Europe
That's very impressive experience, and I realize there are more considerations in deploying IT solutions in these environments than I can imagine or want to deal with.

All of those deployments require long-term solutions to provide stability and compatibility into the future (i.e., future proofing).

Buying a personal computer affords an individual the luxury of being much more flexible and agile in response to new technology than a public transportation IT department.

While your experience is notable, it requires a completely different mindset than that of an individual like myself (and most on this forum) who has the luxury of turning on a dime to take advantage of the newest systems rather than maintaining compatibility for the next 10 years.

That's why I'd rather spend half as much, twice as often – to take advantage of the little features that are added along the way.
 

Zendokan

macrumors 6502
Feb 17, 2011
324
172
Belgium
That's very impressive experience, and I realize there are more considerations in deploying IT solutions in these environments than I can imagine or want to deal with.

All of those deployments require long-term solutions to provide stability and compatibility into the future (i.e., future proofing).

Buying a personal computer affords an individual the luxury of being much more flexible and agile in response to new technology than a public transportation IT department.

While your experience is notable, it requires a completely different mindset than that of an individual like myself (and most on this forum) who has the luxury of turning on a dime to take advantage of the newest systems rather than maintaining compatibility for the next 10 years.

That's why I'd rather spend half as much, twice as often – to take advantage of the little features that are added along the way.
Hello StudioMacs,

I fully understand your point of view and like we both have said, it really depends on your line of work.
In my line of work stability is the dominant factor, so I can afford to stay longer on a certain architecture using heavier systems. In your line of work you need to follow the technology faster, so it makes no sense to invest in too heavy a system, because you have a higher replacement rate.

Lamborghini makes sports cars and tractors. Both are good at what they do as long as you use them in the correct way, although it would be funny to see an Aventador pull a plough in a field.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
Yet a $499 Xbox Series X is 20% more powerful than an M1 Max.
Simply not true. References? No, of course not. Would be nice if people actually did some homework before they posted.

So:

1) CPU: M1 Max much faster. According to https://www.pcworld.com/article/393717/microsoft-xbox-series-x-vs-gaming-pcs-comparison.html , "The Xbox’s CPU is comparable to a lower-clocked, last-gen Ryzen 7 3700X"

Geekbench 5 single-thread and multi-thread scores for the 3700X are 1253 and 8404, respectively. For the M1 Max, they're 1755 and 12344 (using the scores for the Studio).

That puts the M1 Max at ~40% faster ST and ~47% faster MT. GB5 isn't a perfect measure, but those differences are large enough to invalidate the claim that it's 20% less powerful in that category.
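For anyone who wants to check that arithmetic, here's a quick sketch using the scores quoted above (Geekbench numbers vary from run to run, so treat the percentages as rough):

```python
# Relative speedup computed from the Geekbench 5 scores quoted above:
# Ryzen 7 3700X vs. M1 Max (Mac Studio).
scores = {
    "single-thread": (1253, 1755),   # (3700X, M1 Max)
    "multi-thread": (8404, 12344),
}

for kind, (ryzen, m1max) in scores.items():
    speedup = (m1max / ryzen - 1) * 100
    print(f"{kind}: M1 Max ~{speedup:.0f}% faster")
# → single-thread: M1 Max ~40% faster
# → multi-thread: M1 Max ~47% faster
```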

2) GPU: M1 Max somewhat faster. According to https://www.pcworld.com/article/393717/microsoft-xbox-series-x-vs-gaming-pcs-comparison.html , "Microsoft’s console performs somewhere in the ballpark of the last-gen RTX 2080, perhaps a little less."

Relative GPU performance is harder to gauge, since it can vary significantly based on the application. But the rough consensus seems to be that the M1 Max's GPU is comparable to that of an RTX 3070 laptop GPU, and that GPU is faster than the RTX 2080 desktop. From this we can surmise the M1 Max's GPU is likely to be somewhat more performant, on average, than that of the XBox.

It's not going to game as well as the XBox's GPU, of course, but that's because the XBox's GPU is specialized for gaming, and because the gaming software written for the XBox is much more highly optimized to run on it than the gaming software written thus far for AS. The M1 Max's GPU, by contrast, is designed for video processing and general purpose GPU computing.

3) SSD: M1 Max much faster. According to https://www.pcgamer.com/even-sata-ssds-can-compete-with-next-gen-consoles-on-basic-load-times/ , a Samsung 980 Pro PCIe 4.0 SSD is much faster than the SSD in the Xbox. And the M1's SSD is faster than that Samsung model.

4) RAM capacity: M1 Max's much greater. The Xbox has 16 GB; the M1 Max offers up to 64 GB.

And, yes, many could benefit from more performance than what the M1 Max offers. Even though the M1 has one of the highest single-threaded speeds in the business, there are still a lot of programs that are single-threaded and don't complete their operations quickly enough, even on the M1 and other processors with the highest ST speeds, to avoid some user wait time.

And many of those doing scientific calculations need far more than the XBox's 16 GB RAM and, indeed, even more than the 64 GB available on the M1 Max.
 

Appletoni

Suspended
Mar 26, 2021
443
177
Yet a $499 Xbox Series X is 20% more powerful than an M1 Max.

+ M1 Max is overkill, nobody needs this much power!!​

= The Xbox's power is needed, and if the Xbox is 20% more powerful than the M1 Max, then the M1 Max can't be overkill. Obviously a lot of people need this much power, and probably much more.
 

Appletoni

Suspended
Mar 26, 2021
443
177
Xbox Series X = 12 TFLOPS
M1 Max = 10 TFLOPS
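That 20% figure is just the ratio of those theoretical peak numbers; a quick check (note these are paper specs, not measured game performance):

```python
# Theoretical peak FP32 throughput, as quoted above (TFLOPS).
xbox_series_x_tflops = 12.0
m1_max_tflops = 10.0

# Relative advantage of the Xbox on raw TFLOPS alone.
advantage = (xbox_series_x_tflops / m1_max_tflops - 1) * 100
print(f"Xbox Series X: ~{advantage:.0f}% more peak TFLOPS")  # → ~20% more
```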

And we don't even need to get into real-world benchmarks, as the Xbox Series X runs every AAA game at 4K resolution without any problems, while the M1 Max struggles even with super old games like StarCraft 2.

Looks like many people need more power than an M1 Max offers.
It even struggles with old or even retro chess engines.
 

Appletoni

Suspended
Mar 26, 2021
443
177
Hmm. The M1 Max has a better CPU, has a Neural Engine, supports more RAM (up to 64GB), and runs cooler than the Xbox.

Look at the Mac Studio's size and then look at the Xbox.

The M1 Max is a computer with an OS that is not locked down!

I can't get Xcode, Word, and Zoom working at the same time on an Xbox, can I?

The Xbox has great hardware, but it's got a very locked-down OS, whereas on the Mac Studio I can do everything but game.
You can also see it this way: the M1 Max has only 10 CPU cores and not 20. That means you will need to wait 10 to 15 years until it is as strong as the Mac Studio is now. The M1 Max supports only up to 64 GB RAM. Hopefully the M2 will support up to 128 GB RAM so we don't need to wait 10 to 15 years; 256 GB RAM would be better. It supports up to 8 TB of SSD, but many people need at least 16 TB.
 

Appletoni

Suspended
Mar 26, 2021
443
177
Not many people buy Macs with gaming in mind, so I think we can discount that scenario.
With the M1, times have changed, and many people buy Macs with gaming in mind. A lot of people use it for games only, at 4K and max settings.
OLED and HDMI 2.1 are still missing.
 