
jblagden

macrumors 65816
Original poster
Aug 16, 2013
It's a little funny. Back when the iMac and Mac Mini were new, Windows guys made fun of them for not being expandable. And yet, look at what's becoming popular - All-in-One and Small Form-Factor PCs. It seems like Apple was about ten years ahead of the curve with the iMac and Mac Mini.

What do you think?
 
I think there are many areas in which Apple have been leaps and bounds ahead of the competition. They removed the floppy drive very early (as already mentioned), went all SSD with the MacBook Air when everyone else was using hard drives, and realized early on that an all-in-one can be very attractive. I also feel like they were way ahead of their time with my current iMac - a 27" late 2009 with a quad core 2.8 GHz i7. It was so much faster than anything else at the time, although to be fair this was due to Intel as well. But still - so much power with such a (for the time) huge display was awesome!

However, fast forward more than eight years, and it's basically the same machine (though upgraded quite a lot) that they still sell. The display is no bigger, the footprint on the desk and the bezels are the same, and there have been no changes whatsoever to the input methods. In other words, I feel like Apple used to be extremely far ahead of the competition, and in some ways (all USB-C/TB3 on the new MBP) they still are. But they are not class-leading in innovation anymore, if you ask me.
 
I think it is a matter of opinion and perspective.

The Fusion Drive was a great innovation, as were Thunderbolt and the 5K Retina display. I would wait for the new iMac in the next month or so. There are a lot of whispers about flash storage being the big thing, with the platter drive facing extinction.
 
I don't think anyone will dispute that Apple has made some great moves and innovations. I agree the addition of a 5K display (and selling the entire computer for $2K) was a masterstroke.
 

Totally this -- the machine is still pretty much the top-of-the-line AIO available today. The GPU could be bumped a bit and Kaby Lake would be nice by now, but overall there's not much to change other than starting to add the USB-C stuff.
 
I still don't think the people who claimed that would change their opinions. They would just expand it to include AIO PCs.

These are usually the opinions of the PC master race with dedicated tower PCs. They build PCs because they enjoy it and may not even have a goal for the build, much like someone who soups up a car or truck but doesn't race it or off-road it. But instead of just saying they do it because it's a fun hobby, they try to justify it by pointing out the weaknesses of an iMac.

I'm generalizing, btw, but it's pretty cool to hate Apple for a lot of people.
 
Though, I’d really appreciate an Nvidia GPU, as would a lot of others. There are actually more people than you’d think who want to play games on a Mac. There are also people using Adobe’s software, which uses CUDA. And let’s not forget scientists who want CUDA for their scientific applications. So there are plenty of people who would appreciate the option of an Nvidia GPU in their Mac.
 

I agree there are certain advantages with Nvidia, and while I'm fine with AMD, I have no problem with Nvidia either (win-win for me).

However, even moderately serious gaming doesn't look like it will be a thing on Macs for quite a while.

The hardware doesn't provide a good experience with today's modern games: OEM peripherals aren't good for gaming, there's no upgradability as more intensive titles come out, and the monitor has a relatively slow response time and is capped at 60 Hz with no G-Sync/FreeSync, plus the cooling is questionable for long gaming sessions.

But the software side isn't there either. Few devs bother with Metal, which Apple chose over Vulkan for whatever reason, which sucks, but whatever. Most games on macOS are using outdated OpenGL. macOS isn't supported by many games in general. And drivers can be questionable with Boot Camp.

We are moving further and further from the mark at this point.

Many people game on Macs, and I did for the longest time as well. But even playing the same game (ESO), each update pushed my system harder and harder, and I had to keep turning the graphics detail down further and further. When it first came out I had everything maxed; by the time I stopped playing on my iMac, my settings were around medium and low to maintain 40-50 FPS. And without upgradability, my only option is to buy a new iMac, which is unreasonable for a game.

I do like city builders, though, but even then games like Cities: Skylines really struggle at moderate graphics settings. 60 FPS is impossible with my 3.4 GHz i5 and Nvidia 775M, and a lot of that has to do with the software side. Games will offer anti-aliasing options of just on and off because there isn't adequate support for better anti-aliasing methods. So while a 1080 would be nice, it would be a brute-force way of hiding the underlying problems.
 
You’re right about the thermals, especially with how thin Macs have gotten. And yeah, Macs haven’t been great on the hardware side, particularly in terms of upgradability. But that’s where the Mac Pro shined, with its upgradable RAM, storage, and GPU.

On the OS side, there’s always Windows and Linux. Though, of those two, I’d probably go with Linux. I could run StarCraft II in Wine, Minecraft is Java (so it’ll work fine) and Steam has a lot of Linux games.
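For anyone curious what that looks like in practice, the Wine route is roughly this (package names vary by distro, and the installer filename below is hypothetical, so treat it as a sketch):

```shell
# Debian/Ubuntu-style install; use the dnf/pacman equivalents elsewhere.
sudo apt install wine                      # Wine runtime for Windows binaries
winecfg                                    # create/configure the default Wine prefix
# Run Blizzard's Windows installer under Wine (path is an example):
wine ~/Downloads/StarCraft-II-Setup.exe
```

Minecraft needs no translation layer at all, since the Java launcher runs natively, and Steam's Linux client installs straight from most distro repositories.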
 
Last edited:
Apple always gets stick for jumping on technology trends before anyone else even considers them.

Same thing with the Mac Pro.
 
There is a segment of small form factor PCs, but it's certainly not all of them. And I haven't seen an AIO on the Windows side in a very long time.

The only difference is that the small form factor Windows machines I see are upgradeable. Too bad Apple had to take it too far.
 
Macs will never have good high-end GPUs because of their size and thermal designs. TB3 brings eGPUs closer to reality, though, and the performance loss from using an eGPU is narrowing. Apple should really just officially support them.
 
Yes and no. The other day I went to Best Buy, and most desktop options are tower + LCD; it seems that configuration is still mainstream.

In fact, I saw HP's Pavilion Wave, which I thought was quite aesthetically pleasing for a PC: a stylish tower that looks like, and doubles as, a speaker.

[Image: HP Pavilion Wave]
 

Personally, I think it's a bit ugly; it looks like an Amazon Echo wrapped in carpet. Better than a beige tower, though!
 
I know SFF and AIO PCs aren't all desktop PCs, but just the fact that they exist is funny, given the ribbing Apple and its fans got for the Mac Mini and iMac not being as expandable as a regular desktop PC. Sure, there was always the Mac Pro, but it's always been expensive, and it was never exactly popular among most Mac users. Most folks who had Mac Pros were either creative professionals or gamers; now it's just creative professionals, and even they are starting to leave the Mac Pro over the upgradability issue.

And yeah, the SFF and AIO PCs are actually upgradable, while Apple has really avoided making anything upgradable, with the exceptions of the Mac Pro and the 27" iMac - though it depends on how you define upgradability, since the GPU can't be upgraded in either the iMac or the Mac Pro.
Yeah! It would make it a lot easier for people to use eGPUs if Apple just supported it. If you could just plug the eGPU in and have it work without having to disable SIP and then either manually edit kext files or run automate-eGPU.sh in Terminal, there would probably be a lot more people using eGPUs instead of going over to Windows. As it is, there are probably a lot of people using eGPUs with their Macs. And if Apple officially supported it, they could probably make hot-plugging possible, which is a big deal when you're using a laptop. Also, if they supported it, they might actually work with Nvidia to get Mac drivers for Pascal GPUs, so MacBook (12", Air, Pro) users aren't left with outdated GPUs in their eGPU enclosures.
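For context, the unofficial route mentioned above looked roughly like this at the time (automate-eGPU.sh is the community script; the exact steps are from memory, so treat this as a sketch rather than a guide):

```shell
# Step 1: disable System Integrity Protection.
# csrutil must be run from Terminal in macOS Recovery (Cmd-R at boot):
csrutil disable

# Step 2: back in normal macOS, make the community script executable
# and run it; it installs/patches the kexts for the attached eGPU:
chmod +x automate-eGPU.sh
sudo ./automate-eGPU.sh
```

Official support would replace all of this with plain plug-and-play (and hot-plug), which is exactly the point.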
Though, the benefit of the "beige tower" is upgradability. When a new interface comes along (e.g., USB 3.1), you can just get a PCIe card for it. When the CPU gets a little too slow, you can upgrade it. When you start to need a faster GPU, you can upgrade that too. And the beige tower can accommodate more hard drives.

But it really comes down to what you want to do with your computer. If you're the sort of person who would buy a Mac Mini or an iMac, an SFF or AIO PC is the machine for you. But I like the upgradability afforded by a full or mid-size tower.
 