Most "pro" workloads will never be "near-instant" because workloads will keep getting more complex (photos get more pixels to edit, programs get more lines of code to compile...)performance is important until you get to “near instant”
Most "pro" workloads will never be "near-instant" because workloads will keep getting more complex (photos get more pixels to edit, programs get more lines of code to compile...)performance is important until you get to “near instant”
'Instant' what though? For compute-intensive tasks, there will never be 'instant'. There will always be performance gains to be made. If you are talking about simply instant opening of apps, then I don't think that's what is being implied here by performance. Sure, if you are doing only non-compute-intensive tasks with your machine (which is completely fine), then yeah, performance does not really matter.
> I believe that performance is important until you get to “near instant”, then even 2x more doesn’t matter anymore because you don’t feel the difference. Then you start thinking about other things like durability and power efficiency.

I agree, when near instant, then performance doesn't matter, but I don't think most users would switch to efficiency, they'd just be happy and not worry about anything. Anyway, we're nowhere near instant, especially on the low end.
> Many tasks that used to be time consuming thirty years ago — booting the machine, opening a large file, applying a stroke in a brush application, updating a spreadsheet, recognising letters in an image — are no longer time consuming today. Compute intensive is a sliding scale, and machines get ever faster.

This is not really true. Yes, more mundane tasks have become faster/instant. But hardware and software are always a catch-22 situation and always will be. As hardware becomes faster and more powerful there will inevitably be more complex tasks to be done that will push said hardware. Fortunately (or unfortunately depending on your viewpoint) we will never get to a stage where performance will be 'enough' (given the correct context).
Some time ago we crossed the threshold where the vast majority of consumer tasks are near instant. It is of course never truly instant, but it takes about 0.2 seconds for the human brain to react, so anything faster is as good as instant. As more server-level and HPC features trickle into the desktop of tomorrow we will see more Pro tasks becoming instant.
And as for pictures getting more pixels, there are practical limits there too. 4K TVs are slowly taking over, but will 8K ever become a necessity? It's like Retina screens: finer details make no sense because the eye cannot take in more.
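For what it's worth, that Retina point can be sanity-checked with a little arithmetic: 20/20 vision resolves roughly one arcminute, or about 60 pixels per degree of viewing angle, so the question is how many pixels per degree a screen actually delivers at a normal viewing distance. A minimal sketch, with the screen size and distance picked purely as example values:

```python
import math

# Rough sanity check of the "the eye cannot take in more" claim: 20/20 vision
# resolves about one arcminute, i.e. roughly 60 pixels per degree of view.
def pixels_per_degree(diagonal_inches, horizontal_pixels, distance_m):
    """Approximate horizontal pixels per degree for a 16:9 screen."""
    width_m = (diagonal_inches * 0.0254) * 16 / math.hypot(16, 9)
    viewing_angle_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return horizontal_pixels / viewing_angle_deg

# Example: a 65-inch TV watched from 2.5 m (assumed values, not from the thread).
for name, h_pixels in [("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(65, h_pixels, 2.5)
    print(f"{name}: ~{ppd:.0f} pixels per degree (20/20 vision resolves ~60)")
```

With those assumed numbers a 4K panel already delivers around twice what the eye can distinguish, which is the practical limit being described; sit much closer and the comparison shifts.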
> Fear not, for every increase in hardware performance, our brilliant software writers are hard at work making it slower to compensate.

The more you can do, the more you want to do.
I’m afraid that's incorrect. The majority of tasks that consumers do today are the same as those being done twenty years ago; see the enduring popularity of MS Office (of which most people use only a fraction of the features) and the increasing popularity of Chromebooks.
You can also see it in the number of people (non-gamers) who hold onto older computer hardware for longer. I see it in my immediate surrounds: a lot of non-corporate users hold onto their computers until they break, and getting a newer, faster model isn't usually a consideration.
> watching youtube, you would obviously not care about performance that much.

Now you say this like YouTube, and the internet as a whole, isn't performance intensive.
> Fear not, for every increase in hardware performance, our brilliant software writers are hard at work making it slower to compensate.

It’s a game we like to play.
> Is it though? I mean for the most part with the correct browser you can watch youtube mostly problem free on a $50 Raspberry Pi 4B.

YouTube seems to struggle sometimes on my 5,1 when tabbing between windows or when loading a new video. And it chewed through battery on my MBP. Who knows.
Regardless, that is the whole point I am trying to make. Hardware gets faster and more powerful -> software running on it gets more complex and compute intensive. Hardware gets fast enough to stream 4K HDR -> 8K starts becoming the norm, and so on. 'Good performance' is a moving target.
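The 4K-to-8K example is easy to put rough numbers on: 8K has four times the pixels of 4K, so before any codec gets involved the raw data rate also grows about fourfold. A small sketch, with the frame rate and bit depth assumed purely for illustration:

```python
# Raw (pre-codec) video data rate: pixels per frame x frames per second x bits per pixel.
def raw_gigabits_per_second(width, height, fps=60, bits_per_pixel=30):
    """Uncompressed data rate in Gbit/s for 10-bit-per-channel video."""
    return width * height * fps * bits_per_pixel / 1e9

# Assumed frame rate and bit depth, purely for illustration.
for name, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f"{name}: ~{raw_gigabits_per_second(w, h):.0f} Gbit/s before compression")
```

Codecs compress that enormously, but decode, scaling and effects work still grow with pixel count, which is the moving target being described.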
> The more you can do, the more you want to do.

The more I can do, the more time I get to slack.
> Clickbait headlines that indicate "In Trouble" or "Mac Killer" are so annoying. Tom's Hardware is infamous for them IMHO. Saying an unreleased product is in trouble due to an unreleased product that will be expensive, power-hungry, and in great demand does this forum no justice.

But the Mac Pro won't be able to compete with the 6080 at 16k gaming!!!!
I get it... there will be faster CPUs, there will be faster GPUs. Is that enough to make the switch? What about the effective speed of the codec? I'm not processing in 8K and don't plan on it, and I can ill afford a 6K monitor that would let me edit above 4K resolution (I wish I could edit on a 5K monitor, but they're hard to get). For my photography work, which is the majority of my work, unless I get to 100 MB image output I doubt I will outgrow an M1 Max for a long time.
My 54 MB images can grow to near 500 MB when doing serious Photoshop work, and running that through an add-in can press the limits of many a system, but I'm not seeing any strain at all.
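That kind of growth is mostly straightforward pixel arithmetic: an uncompressed image occupies width x height x channels x bytes per channel, and every additional full-size layer adds roughly another copy of that. A back-of-the-envelope sketch, with the megapixel count, bit depth and layer count chosen purely as illustrations, not taken from the post above:

```python
# Uncompressed image size: pixels x channels x bytes per channel; each full-size
# layer in an editor adds roughly another copy of that pixel data.
def flattened_megabytes(megapixels, channels=3, bits_per_channel=16):
    """Rough in-memory size of one flattened image, in MB."""
    return megapixels * 1_000_000 * channels * (bits_per_channel // 8) / 1_000_000

# Illustrative numbers only.
base = flattened_megabytes(45)          # a 45 MP photo in 16-bit colour
print(f"Flattened: ~{base:.0f} MB")
for extra_layers in (1, 3, 5):
    print(f"With {extra_layers} extra full-size layer(s): ~{base * (1 + extra_layers):.0f} MB")
```

The on-disk file stays much smaller thanks to compression, but something in this ballpark is what the editor has to hold in memory and push through a plug-in.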
Can we just stop the baiting?
> Has anyone noticed this from the OP, that after posting a thread like this, he never comes back? He posts, then just leaves without any further comment. Is this his MO?

Mostly, yeah. At least in obvious troll threads like this one (of which they've started a few).
> The RTX 4090 will apparently be 3 times faster than the RTX 3090.
>
> And since the Mac Pro will be targeting video editors who will benefit from the GPU, it could be possible that the M1 Max Quadro could be slower than PCs with a single RTX 4090?
>
> Now the big advantage is that Apple has enough supplies to combine 4 M1 Max together into a M1 Max Quadro, while it will be impossible to get your hands on the RTX 4090 due to the chip shortage crisis we are currently in.

God. Clickbait.
> The more I can do, the more time I get to slack.

Yeah, I hurry up (usually automate) to relax sometimes too, but that's well defined tasks.
Most "pro" workloads will never be "near-instant" because workloads will keep getting more complex (photos get more pixels to edit, programs get more lines of code to compile...)
All problems expand to fill the available hardware. It's a truism.