Hey all.

Wow, the thread got "slightly" bigger since I last checked it. :)

Thank you all for the replies and your help!

Anyway, I think I will wait for an iMac with TB3. I figure I might as well get the latest tech. I think I will also, as someone suggested, either get a low-to-mid-range iMac to last me 3-4 years, or get a maxed-out model and hope it lasts me 5-6 or so. Either way, I should be fine.
 
Threads like this are why I've been enjoying MacRumors.com for a dozen years now. Lots of great ideas, well-reasoned conjecture, and most importantly respectful disagreement. The rest of my life should be this way. :)
Hahaha I agree!

I tried asking the same question on tomshardware and received a few bitter answers from PC gamers saying that my iMac will be obsolete in a couple of years and that I should get a custom-built PC... I do have a custom-built PC, which I am now selling piece by piece. I am really set on going from Windows to Mac, since I find Macs simple and minimalistic, and man, what great displays!!!

I am wondering if I actually need TB3. I read some threads here, and it looks like it's nowhere near a necessity, but rather a fancy new feature without much hardware available yet to get 100% out of it. Apparently that is why Apple isn't rushing to adopt it; there is really no point in having it right now.
 
Thunderbolt 3 will (supposedly) let you add an external GPU, should that become necessary. Also, the USB-C socket may prove to be a popular standard in the future.
On the other hand, Apple may choose to seal up the 2016 iMac 27" when it's released.

That's what I am afraid of - no more RAM upgrades :(

USB-C will definitely become a standard at some point. But that does not mean there won't be converters from USB 3.0 to USB-C.

USB 2.0 is still used widely. It is just slower than USB 3.0.

On adding an external GPU: again, I don't game, so I don't see any use for an eGPU in the future.
I got a question.

Which i5 CPU is included in a basic 27" iMac? 6500K or 6600K?

If I decide to buy an iMac now it most likely will be one of these depending on the CPU:

  • 3.2GHz quad-core Intel Core i5, Turbo Boost up to 3.6GHz (6500K???)
  • 8GB 1867MHz DDR3 SDRAM - two 4GB
  • 256GB Flash Storage
  • AMD Radeon R9 M380 with 2GB video memory
OR
  • 3.3GHz quad-core Intel Core i5, Turbo Boost up to 3.9GHz (6600K with hyper-threading hopefully?)
  • 8GB 1867MHz DDR3 SDRAM - two 4GB
  • 256GB Flash Storage
  • AMD Radeon R9 M395 with 2GB video memory (worth it vs. M380 for future proofing for 6K, 7K videos etc???)
The price difference between these two setups is $400 Canadian (buying with a student discount).

Your thoughts...


Thank you.
 
Tip from someone who's been buying computers since the 90s:

Don't try to "future-proof" beyond 3 years.

Buy something mid-spec (or rather, "appropriate" spec) today, and upgrade more often.

When you need the additional power, it will be cheaper. Pass your 2-3 year old machine on to someone else, and get something more powerful sooner. Put the money you WOULD have spent on the top-of-the-line system in the bank, and spend it in two years' time on a new box. Sell the old one for reasonable money.

Trying to stretch a machine out for more than 5 years just sets you up for hardware failure (and then you're screwed anyway and have wasted your money), a super expensive purchase price, and, if you ever do need to upgrade it, expensive upgrades, because the standards for everything change (look up the prices of DDR2 RAM vs. DDR3 or DDR4 some time; DDR2 is slower, too).
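To make the two-cycle math concrete, here's a rough sketch; every number in it is a made-up illustration, not a real price:

```python
# Two strategies over the same 6 years (all figures hypothetical).
maxed_out_price = 3500   # top-spec machine, kept the full 6 years
mid_range_price = 2000   # mid-spec machine, replaced after 3 years
resale_after_3y = 800    # assumed resale value of the 3-year-old box

cost_maxed = maxed_out_price
cost_mid = mid_range_price + (mid_range_price - resale_after_3y)

print(f"Maxed out once:   ${cost_maxed}")  # $3500, 6-year-old hardware at the end
print(f"Mid-range, twice: ${cost_mid}")    # $3200, 3-year-old hardware at the end
```

Under those assumptions the upgrade cycle is both cheaper and leaves you with newer hardware at the end; tweak the numbers and the gap moves, but the shape of the argument stays.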

Trying to stretch a machine to 10 years is sheer lunacy. Well, let me rephrase: trying to purchase with the intent of keeping it for 10 years is. If it is still working after 6, fine... just be aware that any spec upgrades you buy today will do very little to improve your system's longevity vs. the current mid-range/base.

The price/performance curve is not a straight line. A machine half the cost of top of the line typically has 80% of the performance unless you're talking about specialist tasks.

Tech does not stand still.

To put things in context, 10 years ago I was using (I think) a Pentium D 930 (dual core) with 1GB of RAM, an Nvidia 7600GT, and a 250GB hard drive. That, at the time, was a reasonably high-end PC. It wouldn't run much of anything today. Certainly not quickly, and it would have been slow garbage for the past 6 years at least.


Don't get me wrong, if you're doing something for work that demands maximum performance today, get it. But then you'd be looking at the Mac Pro... and even if you were to buy one of those, expecting it to last 10 years is wishful thinking. You might get lucky, but banking on it is foolish.

Especially with so much revolutionary tech right on the horizon: Intel XPoint memory, the coming massive increases in core counts, the massive storage-speed improvements we're seeing right now, etc.

edit:
Re: dual-core still on sale... sure. But clock rates have stopped increasing, and all the major operating systems have done major work to improve threading and the ability to use more threads. Intel is working on many-core CPUs; Xeons are already up to 20 cores or more. You're going to see an explosion in core counts over the next 5-10 years. Software has been holding multi-core back, but this is far less of a problem on recent OS platforms.

Great post. It convinced me to just get a base-model 5K iMac when I buy one. Before, I wanted to get the i7 and the M395X, but I really don't need that power. My 2012 MBP has enough power. I just want the big screen and desktop experience for music and photo stuff.

If I buy an iMac and in 3 years find out I need a better CPU or whatever then I can just upgrade. We have a tendency to want stuff that's way above our needs, but we rarely ever actually hit limits on our hardware. If it gets to that point for me I'll just upgrade.
Most users do that.

Most users don't upgrade software in 10 years? Yeah right.

Even if you just use the web, the web will get much more powerful and demanding moving forward.
The biggest problem with keeping a brand new iMac for 10 years is [ironically] the display. I think by 2020, 40" 4K displays will be mainstream. They are already available at a reasonable price (the Crossover 404K is $650). So while your iMac will work fine, the user experience will be noticeably inferior to a modern machine's. To put this in context: you are getting a 27" screen now, and 10 years ago 19" was the standard; no one wants 19" now, and that's kinda how 27" will feel in a few years.

40-inch 4K would look like crap as a monitor. It would have ~100 PPI.

iMacs will never have a 40-inch display. That's ridiculously massive for a desk.
 
10 years is a totally unreasonable expectation for a computer.
Yes, it will probably still work 10 years from now (with a 256GB SSD, I wouldn't bet on it), but really poorly.
6-7, maybe 8 years is the best you can expect even a high-specced iMac to last.
 
USB-C will definitely become a standard at some point. But that does not mean there won't be converters from USB 3.0 to USB-C.

USB 2.0 is still used widely. It is just slower than USB 3.0.

Apple puts the USB sockets on the back of the machine, which means that connecting a USB flash drive is unnecessarily complicated. USB-C sockets, combined with a nifty dual-socketed flash drive, would go a long way towards solving this problem.

USB 2.0 is frustratingly slow for mass-storage devices. Yes, I still use these sorts of things; not every machine I use is connected to a network.
 
Which i5 CPU is included in a basic 27" iMac? 6500K or 6600K?

The 6500 and 6600; no K versions in the iMac, AFAIK.
 
And pretty much the only i5s that support Hyper-Threading are the mobile dual-core ones.
 
40-inch 4K would look like crap as a monitor. It would have ~100 PPI.

iMacs will never have a 40-inch display. That's ridiculously massive for a desk.
40" @ 4k is slightly higher ppi than 27" @ 1440p bro (110 v 109)
 
If we disregard my total lack of understanding of why you would want a 40" monitor (other than to be able to throw around windows that you do not use regularly but want "around" anyway, much as I do with applications and RAM):

the 5k 27" iMac has what? 217PPI? That's the way we are heading, high DPI monitors. And I'd bet we see a reasonably priced (<$1500) monitors with 8k the next 2-3 years.

I have tried running a 40" 1080p LCD as a monitor, and without either a really deep desk or sitting uncomfortably far from the desk, it was basically too big to use as my primary monitor. So I'm having a hard time believing that will ever be the "norm", even though I have no doubt there will be 40" monitors aimed at PC users sooner or later (and not only at creative professionals doing video editing, as might be the case today).
 
1. Haha, no, of course not... But OTHER apps will. I don't see why that's such a big problem to understand. And I wasn't talking about future apps; you were talking about apps allocating RAM that they didn't use (due to OS X memory management), and I said that's how it is supposed to work, as they can request writes to that RAM without needing the allocator to allocate more space for them.

2. Sure, but I don't jack them up; I use the computer as I see fit, and that's the end result... I hit the ceiling constantly when using systems with 8GB of RAM. I leave apps open that I use often because I CAN, and they will switch on faster if not swapped out or closed (faster than even the fastest SSD would allow).

I keep all my most-used applications running at all times, because I don't want (or need) to wait for them to start...
In my mind, people are using RAM the wrong way... As soon as you close an app or a tab or whatever for no other reason than to save RAM, you have too little RAM. RAM is cheap.
And where are these apps today? If someone had bought an i5-2500K 5 years ago, they would be able to run all modern apps fine. (There are a few apps where they would want more power, like virtualization or video rendering, but those are the situations where nothing is ever good enough and people want more power regardless.) In fact, they could still handle high-performance tasks such as gaming perfectly fine, because single GPUs don't even max out the old PCIe 2.0 interface. So for all practical purposes, the only difference between a good machine from 5 years ago and today is SATA 3.0/NVMe for faster SSDs.

The fact is, the power of PCs peaked around 2011 for regular users, i.e. not power users. Power became so excessive that people started downsizing, going from PCs to tablets. That's why the PC market is shrinking. From my POV, your poor management skills and excessive usage are warping your perception of computers.

If we disregard my total lack of understanding of why you would want a 40" monitor (other than to be able to throw around windows that you do not use regularly but want "around" anyway, much as I do with applications and RAM):

The 27" 5K iMac has what, 217 PPI? That's the way we are heading: high-DPI monitors. And I'd bet we'll see reasonably priced (<$1500) 8K monitors within the next 2-3 years.

I have tried running a 40" 1080p LCD as a monitor, and without either a really deep desk or sitting uncomfortably far from the desk, it was basically too big to use as my primary monitor. So I'm having a hard time believing that will ever be the "norm", even though I have no doubt there will be 40" monitors aimed at PC users sooner or later (and not only at creative professionals doing video editing, as might be the case today).
High PPI is meh IRL. I have a Retina MacBook Pro and I prefer my 27" 1440p display. I've used a 5K iMac and I vastly prefer my 40" 4K display.

The problem with a small screen and high PPI is that it's just too hard to notice super-fine details such as hair or fur strands. On the flip side, not only do you notice fine details on a larger screen, it's also a much more immersive experience with movies or videos, because everything is near life-size.

40" @ 1080p is pretty bad, even 27" at 1080p isn't great. After using 4K @ 40" it's really obvious that this will be the future. In 5-10 years everyone will look back and "think wow, did we really use 27" screens?" just like we think back to 15 or 19" displays today. You can already see the PC monitor industry is slowly creeping up to this size with 34" 1440p 21:9 displays which are the same width as a 40" 16:9 display.
 
And where are these apps today? If someone had bought an i5-2500K 5 years ago, they would be able to run all modern apps fine. (There are a few apps where they would want more power, like virtualization or video rendering, but those are the situations where nothing is ever good enough and people want more power regardless.) In fact, they could still handle high-performance tasks such as gaming perfectly fine, because single GPUs don't even max out the old PCIe 2.0 interface. So for all practical purposes, the only difference between a good machine from 5 years ago and today is SATA 3.0/NVMe for faster SSDs.

The fact is, the power of PCs peaked around 2011 for regular users, i.e. not power users. Power became so excessive that people started downsizing, going from PCs to tablets. That's what you are missing, because like you said, 'you use RAM differently from most people', so poor management skills and excessive usage are warping your perception of how powerful computers have gotten.

"poor management skills"? I Have systems with 4GB of RAM i use quite often, I teach students on computers with 8GB of ram and we do all kind of stuff (both virtualization and emulations, simulations and every other "tions" operations you can think of). The thing is I DO NOT WANT TO NEED to manage my damn resources on my private computer, if computer resources are cheap, why should I put myself in a situation were I need to manage them? Why should I need to delete files to free up space, why should I need to close applications to free up RAM?

You seem to lack the understanding that people said the exact same thing in 1990, and in 1995, and in 2000, 2005, 2010: that computers "have gotten so powerful they don't need to be any faster". Have they been right before? No?
Did the performance of ordinary computers peak in 2011? OK, then how come the Skylake 6700K is close to twice as fast in single-threaded operations as the fastest i7 of 2011 (the 2700)? Sort of strange to say they peaked, in that case.
Why did they increase performance that much if it wasn't needed?

As I've said, I'm tired of this discussion; I'm not going to argue with someone making the exact same arguments as people I've argued with numerous times over the last 20 years.

10 years ago I remember someone arguing about the PS3 having a Blu-ray reader; "No game will ever need more than a DVD!", they insisted. Today some top-end games are 50GB+.
I browsed using a computer with 512MB of RAM, with almost the same number of tabs open as now... How could I do that then, when today each and every tab allocates 100MB or so of RAM? Could it be that the complexity of browsers and web pages has increased? *GASP*


I'll put a note in my Calendar.app about this discussion and your username, and we'll see in 2026 who was right... If ordinary mid-level computers ship with 64GB of RAM and more than 4 cores by then (or even just 4 cores, but with dynamic stem-cell-like core growth and adaptation), or why not a single storage solution acting as both RAM and storage (XPoint?), I'll have the last laugh...

I'm quite sure that even by mid-2017 most entry-level tower PCs will be equipped with at least 16GB of RAM...
(Most at ~$600-700 have 8GB today.)

But this will probably be my last response to your tiring posts, because you seem to think that the computer industry's needs stop just because YOUR needs might have done so (or so you believe). Keep your 2011 computer; I'm going to count on you still using it as a daily driver in 2026 (since 2011 was, after all, the year performance "peaked").

And sure, our students use computers from 2011 as well, and they work fine... Why? Because they run the exact same software and do the exact same labs as in 2011... As I've said, if you stick with the same software and the same websites (and hope they never evolve), it will work fine, forever and ever...
 
You seem to lack the understanding that people said the exact same thing in 1990, and in 1995, and in 2000, 2005, 2010: that computers "have gotten so powerful they don't need to be any faster". Have they been right before? No?
That's total BS. You would not want to use a computer from 2000 in 2005, and you could get away with a PC from 2006 in 2011 (if it was a Core 2 Duo and not a Pentium, although you might want to upgrade), but today you are perfectly fine using a PC from 2011. That's a big difference. And it shows the pattern, too: performance gains have become more trivial as the years go on.

Did the performance of ordinary computers peak in 2011? OK, then how come the Skylake 6700K is close to twice as fast in single-threaded operations as the fastest i7 of 2011 (the 2700)? Sort of strange to say they peaked, in that case.
Why did they increase performance that much if it wasn't needed?
And for most applications that doesn't matter. That's the point.
I'll put a note in my Calendar.app about this discussion and your username, and we'll see in 2026 who was right...

I'm quite sure that even by mid-2017 most entry-level tower PCs will be equipped with at least 16GB of RAM...
(Most at ~$600-700 have 8GB today.)

I have an i7-5820K in my gaming rig and I fully expect it to be 100% usable for the next 5 years. No performance problems at all until graphics cards max out the PCIe 3.0 bus. For regular usage it'll easily last 10 years, no problem. That's a lot different from 2000 to 2010.

And yes, PCs will eventually ship with 16GB of RAM standard; however, fewer and fewer people will buy those PCs, proving my point that most people just don't need that power. That's why tablets are even a thing.
 
I sincerely doubt that the specs would be the major concern for long-term use of a new computer, especially based on the OP's use scenario. The primary concern, I would say, is reliability and longevity of the components. My mid-range-spec 2011 iMac has been chugging along without a hitch for 5 years. It is currently supporting 10 peripherals in a small business environment quite easily, in addition to being tasked with most home requirements and media management. This one did not come with USB 3, but it does have Thunderbolt, which allows for higher-speed peripherals through the right adapter.

I have been fortunate enough to have no hard drive issues with my stock unit, but if I do, this iMac is still relatively easy to open up and replace the offending item. I also have easily updated RAM, which is an option still available in the 27" iMacs, but possibly an option that will not exist in the near future, if Apple follows their current path of denying user serviceability / planned obsolescence.

The other big consideration, in my opinion, is the relatively rapid changes (not necessarily improvements) to the OS. If a system could be "restricted" to operating systems that work well with it, those annual revisions probably wouldn't be necessary. The Windows world is a better example, but the point is still valid for Macs. Apple makes it more challenging, I'd say, to keep an older OS even though it is perfectly valid for a user's needs. Personally, I'm using 10.9.5 Mavericks because it ain't broke, so I ain't fixin' it. That's something that may need to be contended with in the near future.

To use the old automotive cliché, I'm driving my 2011 iMac til the wheels fall off. It doesn't matter if next year the fad in displays is some zillion-pixel 16K display requiring a 32GB video card, yadda yadda yadda. The current one works perfectly fine for what it needs to do.
 
To answer just the last paragraph of your post:
Heck, as I said earlier, I browsed the web just fine on Windows 95 in 1997... Why would I ever want to upgrade from my Pentium 1 with 24MB of RAM? It did what I wanted it to, and on today's computers that entire OS could fit on a small flash drive on the motherboard and boot in less than a second. No need for any additional storage for the OS itself, and 64MB of RAM would suffice for anyone (heck, I could store the entire OS in that amount of RAM).

Hmm... How come no one uses Windows 95 to browse with these days? It worked just fine 20 years ago? Surely the needs for browsing haven't changed?
 
I'll put a note in my Calendar.app about this discussion and your username, and we'll see in 2026 who was right...
Are you making the assumption that the Calendar app will still be around in 2026, though? ;)

Some people above have touched on something interesting, though. PC sales are dwindling because apparently people don't need so much power. Fair enough. But tablet sales also seem to have plateaued. So is it that most people now just need a phone/phablet? I've got to admit I have two desktops, one laptop, one tablet and one smartphone, and out of those I mostly use the laptop and the phone. The tablet is kind of around if I need to read comics or a magazine, and the desktop is for when I am fiddling with iTunes or making music.
 
I have a Dell Dimension 9200 I still use every day. I bought it in 2007, so it's coming up on 9 years old, and I can still see myself using it for years to come. The CPU is a Core 2 Duo, if people remember those. As a daily machine, it does everything I need plenty fast (I don't play modern games on PC).

My iMac is a mid-2010 (i5), so it's coming up on 6 years with no signs of slowing down my productivity (Xcode 7 runs perfectly fine, although I did put in an SSD in 2011).

Now, these systems work more than acceptably, but peripheral technology marches on: USB 3, USB-C, Thunderbolt, SATA 3, displays (important for all-in-ones), etc. But they are, by and large, nice-to-haves rather than must-haves for the most part.

There are really no tech changes on the 3-5 year horizon that will make what you buy today obsolete. There will always be faster and shinier offerings, but what you have won't break because of them. Beyond 5 years, it's too far out to say. I would buy at a cost I was comfortable with, understanding the machine will be replaced in 5 years at a similar cost (a 2015-16 iMac costs basically the same as a 2010 model, although they have perfected incremental upgrades to raise the cost somewhat).
 
Are you making the assumption that the Calendar app will still be around in 2026, though? ;)

Some people above have touched on something interesting, though. PC sales are dwindling because apparently people don't need so much power. Fair enough. But tablet sales also seem to have plateaued. So is it that most people now just need a phone/phablet? I've got to admit I have two desktops, one laptop, one tablet and one smartphone, and out of those I mostly use the laptop and the phone. The tablet is kind of around if I need to read comics or a magazine, and the desktop is for when I am fiddling with iTunes or making music.

The Calendar app syncs to my ownCloud instance using CalDAV. It should be fairly exportable to any replacement app I use in the future ;)
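That portability is easy to demonstrate: CalDAV stores events as plain iCalendar data, so any exported .ics file can be read with a few lines of code. A minimal sketch using the third-party icalendar package; the file name here is a made-up example:

```python
from icalendar import Calendar  # pip install icalendar

# Hypothetical export file; CalDAV servers (ownCloud included) and
# Calendar.app both speak plain iCalendar (.ics) under the hood.
with open("calendar_export.ics", "rb") as f:
    cal = Calendar.from_ical(f.read())

# List every event's title and start time.
for event in cal.walk("VEVENT"):
    print(event.get("SUMMARY"), event.decoded("DTSTART"))
```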
 
I got a little less than 8 years out of my '08 MacBook Pro 17-inch C2D. I upgraded it to a max of 6 gigs of RAM and two drives: an SSD in the boot bay and a 750GB platter in the optical bay. It ran like a champ until the graphics card bit the dust recently. Given that, I think 10 years is too long to expect out of high-performance electronics.

I plan on either an i5/256GB flash/base RAM (I'll stuff it myself) or a maxed-out Mini.

Dale
 
Any computer can last 10 years if you stop updating your software at some point, barring hardware failure. It's the need to run the latest and greatest applications that dictates the need to upgrade your computer. If your computer is bought for a specific task, and the software installed on it does that task perfectly, and no general hardware failure occurs, then I don't see why it wouldn't last at least 10 years.

If, on the other hand, you update your software regularly, there will come a day when you won't be able to install it, as the hardware won't be supported anymore.
 
How could I do that then, when today each and every tab allocates 100MB or so of RAM? Could it be that the complexity of browsers and web pages has increased? *GASP*
Do you use more than a hundred tabs?
 
And as I've said, for the same activities, using the same software with the same usage patterns, a computer is good enough for years.

Heck, if you don't mind waiting and missing out on stuff, a 10-year-old computer should be able to browse.
Do you use more than a hundred tabs?


Why would I use more than a hundred tabs? 100 x 5 = 500, and it was 512MB of RAM I was discussing in that sentence.
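Spelled out, that back-of-the-envelope math looks like this (the per-tab figures are the rough numbers from the posts above, not measurements):

```python
tabs = 100
then_mb_per_tab = 5    # rough guess for a ~2006-era browser
now_mb_per_tab = 100   # the "100MB or so" figure quoted earlier

print(f"Then: {tabs * then_mb_per_tab} MB, which fits in 512 MB of RAM")
print(f"Now:  {tabs * now_mb_per_tab / 1024:.1f} GB for the same hundred tabs")
```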
Any computer can last 10 years if you stop updating your software at some point, barring hardware failure. It's the need to run the latest and greatest applications that dictates the need to upgrade your computer. If your computer is bought for a specific task, and the software installed on it does that task perfectly, and no general hardware failure occurs, then I don't see why it wouldn't last at least 10 years.

If, on the other hand, you update your software regularly, there will come a day when you won't be able to install it, as the hardware won't be supported anymore.


Which is basically what I've been trying to say numerous times: if you continue to use the same software and do the exact same tasks, it will pretty much last forever. But as soon as you start using new services (be they offline or online), new OSes with new features, and newer, more advanced web pages and applications, it will quickly go south no matter how powerful your system was X years ago.
 