10 years is a totally unreasonable expectation for a computer.
Yes, it will probably still work 10 years from now (though I wouldn't bet on the 256 GB SSD model), but really poorly.
6-7, maybe 8 years is the best you can expect a high-specced iMac to last.

I agree, but I always see people with old stuff who seem perfectly fine having a computer over 10 years old. A lot of times it comes down to finances and what they really do with it. Slow is something they will complain about, but they seem to be OK dealing with it versus replacing it. In 10 years, while everyone is enjoying USB 6.0 (new connector shape) and FlameWire 1.0 (new smaller connector), they will be happy with their USB 3.0.
 

People are most often OK with unnecessarily slow for the same reason some people are OK with their car's engine sounding like it's filled with gravel, or with their bicycle tires being almost flat: they're either ignorant or cheap :)
 
I was a bit stupid when I bought a Power Mac G4 1.25 GHz MDD when I could have gotten a refurbished Power Mac G5 instead. It was a question of budget. Nevertheless, I ended up upgrading the RAM to 1.75 GB and the hard drives to about three or four times the capacity I had in there before. The GPU went from a Radeon 9000 to a Radeon 9600, and I think I upgraded the USB to 2.0 so that I could use an iPod. What killed it was the single-core nature of the G4-- Time Machine expected that it could be backgrounded and not interfere with the user's enjoyment of the machine.

I then purchased an iMac9,1-- ended up eventually upgrading the memory, first to 4 GB, then to 8 GB, and adding about 4 TB of drives. What killed it was the lack of any SSD support and the utter weakness of the 9400M when working with shaders-- which had, by this time, become so commonplace in video cards as to become the core of OpenGL/DirectX.

I now have an iMac 5K-- the video card isn't that spectacular and the SSD portion is merely 128 GB. But it's a whole lot faster where it counts. How will the machine fare when it gets replaced in 2020 or sooner? I have no idea. But at this point it's likely that the graphics card will age out first-- unless I start doing stuff with multicam UHD video.

The trouble with expecting machines to last, in a useful capacity, longer than four or five years is that eighteen months from now, someone will develop something really special that catches on. Eighteen months after that, the software makers will really start to exploit that something. And eighteen months after that, they'll be relying on that something.
 
People are most often OK with unnecessarily slow for the same reason some people are OK with their car's engine sounding like it's filled with gravel, or with their bicycle tires being almost flat: they're either ignorant or cheap :)
Or they can't afford newer stuff... This isn't the same as being cheap.
Replacing a piece of working equipment because it is slower than the next gen is a first world problem.
 

I think it goes without saying that I mean people who can afford the repair/replacement...
After all, we are discussing this in a thread about purchasing a quite expensive computer; I don't think anyone here has poor people living in poor regions of the world, or people on welfare, in their thoughts while posting. ;)

People who are glad to have been able to afford a computer at all probably don't care much if it takes 3 minutes to load google.com; for them it might be a fantastic experience to be on the internet at all.

And I'm not talking about people who use their computer an hour a month to pay their bills either ;)
 
Wow, the R9 380 IS considered a gaming GPU. Now I know I am good with the M380 for my needs...


And this is for the i5-6500. Also pretty powerful, and very close to the i7-6700K:


I wanna say to whoever wants to get a decent iMac: the base model is already powerful and will last, if not 10 years, then pretty damn long ;)
 

That's a foolish assumption-- AMD's desktop cards are more powerful than "equivalently numbered" mobile cards.
I have an R9 M290X in my iMac. Some Windows games recognize it as an R9 270, which barely meets the Fallout 4 system requirements. If I ever have time to play the game, I suppose I'm all set.

An R9 M380, judging by the number of cores (640? 768?), is down around R7 260/250 levels.


A full desktop R9 380X card has 2048 cores (same as the M395X). A full desktop R9 380 card has 1792 cores (same as the M395). And an R9 390X card has 2816 cores...

It's better than an Iris Pro, but you might have to run a lot of games at 1080p or below. In the end, it will probably be the first obsolete part in your "ten year machine". Really, it's meant for people who adamantly insist that no one will be playing games -- and that playing games is grounds for disciplinary action, if you get my drift.
 

I totally forgot that mobile versions are weaker than desktop ones.

What do you mean by disciplinary actions? I am just buying an iMac for other needs. I have a PS4 for games but I rarely even turn it on; I've played maybe 2 games in the whole 2.5 years I've had it.

The i5-6500 still seems like a decent choice. And even if that M380 is that "weak" in the gaming world, so be it.

I bought an i3 with an NVIDIA GT 610 and a small SSD for my mom's PC, and it runs everything she uses it for; even 4K video at 1440p runs well. Mind you, that GT 610 is roughly equal to a ~10-year-old gaming GPU (minus all the new gaming tech like DX11, multisampling, etc., but a non-gamer wouldn't even need DX10).
 
In 2007 I had 2 GB of RAM and couldn't imagine a need for more. You do know it sounds ridiculous when people state that there will not be a need for more because right now _I_ don't use more? How in the world is "right now" relevant to 5-10 years from now?

16+ GB of RAM lets you change your usage patterns drastically: you no longer need to quit applications to start new ones. If someone offered me 64 GB of RAM for a reasonable price I'd jump on it in a second. I usually hover around 17 GB of RAM during normal use, and normal use for me is both non-gaming and non-creative; that is, browsing, watching videos, and having a bunch of tabs open and a few applications (that I never close, because I don't need to).

Future OS X versions will undoubtedly use more RAM and more CPU themselves, not to mention all the applications.
RAM usage growth has slowed down in the last 10 years.
In 2011 I was using 8 GB, and it wasn't exactly a high-end configuration.
As of today I'm using the same 8 GB (of faster memory) on a 2015 Mac, while most of the Macs and PCs out there are sold in 4 GB configurations.
Unless you are doing virtualization or gaming, memory requirements have settled at 8 GB for most users.

Thunderbolt 3 will (supposedly) let you add an external GPU, should that become necessary. Also, the USB-C socket may prove to be a popular standard in the future.
On the other hand, Apple may choose to seal up the 2016 iMac 27" when it's released.
External GPUs are available, but they are ridiculously priced. A new computer is more convenient...

40" @ 4k is slightly higher ppi than 27" @ 1440p bro (110 v 109)
I still don't see the need to have a 40" display on my desk.
I have my Mini connected to a 32" and it is huge....
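
For what it's worth, the pixel-density figures quoted above check out: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (the `ppi` helper here is mine, purely for illustration):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 40)))  # 40" 4K      -> 110
print(round(ppi(2560, 1440, 27)))  # 27" 1440p   -> 109
print(round(ppi(5120, 2880, 27)))  # 27" 5K iMac -> 218
```

So a 40" 4K panel really does sit at roughly the same density as a 27" 1440p display, while the 5K iMac is about twice as dense.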
 
It will last 10 years. But it will not be useful. Try a 10 year old computer. It was great back then, but just not good today.
 

You might want to get up to speed and read through the thread. Many, many people use ten-year-old (or close to it) computers to browse the Internet, watch video, etc., myself included. I have zero trouble doing anything that would be considered normal use for an average user on these machines.
 
It is inexact, but a good thing to look at would be…

How useful today is a 2006 iMac, or Mac in general?

(Not very... painful to use, mostly)
 
If it's a PowerPC Mac, then yeah, not really useful. But if it's a Core 2 Duo, then it'll do basic things like internet, mail, and Word fine. I have a 2008 iMac and the only issue I've really run into is that the newest version of Office for Mac seems to crash on it.
 

I actually have one of the first Intel iMacs here still....and it is so abhorrent to use I wouldn't even think about it

10 years from a computer is just too much. Not worth worrying about. Who knows what we will be doing in a decade.

10 years ago there was no iPhone!!
 
I ran a machine that was over 25 years old and it ran perfectly fine. It was an Amiga 3000, and I only ran what was still on the hard drive, like Deluxe Paint, AmigaVision, and whatever else I could find on it. It wasn't slow, but it had no web access, the interface was kinda ugly, and the mouse was barely usable, mostly due to its age.

I have an iMac G3 with OS 9 and many classic games of that era, and they run great. I beefed up the memory to 1 GB (the max) when I got it, so it has a lot of zip to it. I can technically call it my gaming rig with all the games it has on it. Classics like Warcraft, Star Trek: Borg, and some arcade games like Pac-Man and Space Invaders. The speaker is breaking up, so I have to use headphones or external speakers.

Maybe the iMac of today will eventually have speaker problems, or no battery for the date/time after a reboot, but it will still run pretty well with the software of today.
 
I actually have one of the first Intel iMacs here still....and it is so abhorrent to use I wouldn't even think about it

10 years from a computer is just too much.
Put an SSD into it. I had a 2008 Core 2 Duo iMac, and with an SSD it was perfectly fine for basic tasks.

I still don't see the need to have a 40" display on my desk.
I have my Mini connected to a 32" and it is huge....
Same reason you have a 32" screen and not a 27" or 24" or 19". Why stop at 32"? What makes that number special?

From personal experience, at 50" it's impossible to use the display at a normal 2-3 ft distance from the screen. You get that sitting-in-the-front-seats-at-a-movie-theater feel. So that's roughly the upper cap.

Hence why ~40" is roughly the magic number. It gives you that big-screen next-gen vibe without being cumbersome. Ideally a curved display would be best at that size, though, because it eliminates any color shifting. A very important aspect of 40" is that it's a standard TV size, and TVs are what are pushing the 4K revolution. Right now it's possible to get 40" 4K 60 Hz 4:4:4 TVs for about $600. When that price drops by about half, they'll flood the PC market and compete directly with regular monitors. You can already see on Slickdeals that people are interested in using 4K TVs as monitors.

I think once people go to 40" that'll be the final monitor standard, because of human physiology, and we'll be using 40" screens for as long as we have monitors. Rough time frame IMO is somewhere around 5 years for this. [YES I IZ VISIONARY, ALL HAIL STEVE JOBS]
 
You might want to get up to speed and read through the thread. Many, many people use ten-year-old (or close to it) computers to browse the Internet, watch video, etc., myself included. I have zero trouble doing anything that would be considered normal use for an average user on these machines.
Exactly. But the question he asked was whether a top-of-the-line iMac would be sufficient. If he has those needs (just browsing), he wouldn't need top of the line. Just buy the base model, and it should run 10 years without problems. But it would make no sense to buy the top of the line if he's going to use it for light stuff like that anyway.

And I've owned a 2006 Power Mac G5. Even when it was 8 years old, it was a pain.

Also, you should think about the security perspective. If you have a 2006 iMac (Core 2 Duo), the newest OS X it will run is Lion (10.7). There are a large number of vulnerabilities, and the OS will not be patched. This is a huge issue. I would say 7 years is the MAXIMUM time to use a Mac. After 7 years, the OS will probably no longer be updated, and it will be a security risk to use that computer. I discourage the use of EOL operating systems.
 
At one time, I was intrigued by the possibilities of iOS Handoff, which promised me the option of starting my work on one computer and resuming it on another. (It's not compatible with my iPad 3, and it wasn't compatible with my old 2009 iMac.)

Had I a pair of more recent devices, I could have tried it out and decided whether it fit my style. Alas no.

Apple periodically decides to exploit newer technologies and take them in a whole new direction. A ten year old mac is likely to have many such missed opportunities.
Hence why ~40" is roughly the magic number. It gives you that big-screen next-gen vibe without being cumbersome. Ideally a curved display would be best at that size, though, because it eliminates any color shifting. A very important aspect of 40" is that it's a standard TV size, and TVs are what are pushing the 4K revolution. Right now it's possible to get 40" 4K 60 Hz 4:4:4 TVs for about $600. When that price drops by about half, they'll flood the PC market and compete directly with regular monitors. You can already see on Slickdeals that people are interested in using 4K TVs as monitors.

Figuring out what's ideal sometimes requires a great deal of money and time.

I can justify my 27" 5K + 20-inch 1080p setup as "works for me." Is it ideal? Far from it. The 1080p screen isn't IPS, and I just cannot get it to work on Windows 10 as well as it does in Mac OS X.

Meanwhile, the 5K aspect of the main screen is so very easy on the eyes for looking at text-- and having lots of text is presumably why you'd want a huge desktop in the first place.

If I had a 40 inch 4K screen for looking at text, it would be harder to read, but I could store more stuff in my peripheral vision. Honestly, I'd have to spend the money and test it out-- and I don't have that kind of money for such experiments. If I had millions of dollars, I could build more interesting prototypes like 8K displays and what not and land on the perfect configuration for what I do. And it probably would not be a pair of 27 inch 5K displays, or a 40 inch 4K display, but something that hasn't quite been invented yet.
 
You don't have to be a millionaire to get a 40" 4K display. If you can afford a 5K iMac, then you can already afford one.

http://www.ebay.com/itm/Perfect-Pix...845900?hash=item5d5cdb798c:g:xAgAAOSw3ydV6VRU

http://www.ebay.com/itm/WASABI-MANG...026733?hash=item5d5a9a666d:g:iwsAAOSwDNdVoVy9

http://www.amazon.com/Philips-Computer-Monitor-3840x2160-Truevision/dp/B00UBCVY02

If you want the best multitasking support, I would go with two 32" 4K displays. For a single display, a 40" 4K is ideal IMO.

Comparing 27" 5K to 40" 4K, you'd notice text is less sharp, but multimedia content such as pictures would appear more detailed, as the larger screen makes it easier to notice fine detail (e.g. strands of hair).
 
I just replaced my 2006 Core Duo 32-bit iMac 20" with a 27" i5 3.3 GHz, 2 TB Fusion, M395 model. My old iMac was still my primary home computer, and it was sufficient for most tasks. I upgraded it to an internal SSD about 3-4 years ago, and that gave it some new life. What finally made me upgrade was the lack of software updates. It was stuck at 10.6.8, and a lot of software doesn't run on 32-bit anymore, even TurboTax! It had started to lack the power and capabilities to run the latest plug-ins, some websites stopped working, and it was too slow to play 1080p video smoothly.

So now I have an up-to-date computer, and so far the only thing I've done above what I used to do is to watch the 2160p demo videos. There's nothing YET out there to watch that's mainstream, Netflix and Amazon are so far not releasing movies to stream on computers in that format. I don't do video editing, haven't uploaded a photo in a few years, do the occasional Excel or PPT. This computer I'm sure is wasted on me, but man, it is blazing fast. If this lasts another 10 years I'll be surprised, but at 27" it'll always be good for video watching.
 
10 years is a long time to expect a computer to keep up as your main system. I suggest going with the highest configuration you can afford.

10 years is a stretch, but it is possible. I have a 2006 iMac 2.33 Core 2 Duo with the VRAM upgrade (256MB) and it still runs. The biggest bottleneck is that it can't go beyond Snow Leopard. Other than that, it runs fine and doesn't even have an SSD. OP should be able to squeak by with a fully loaded iMac for 7-10 years at most.
 
The main change around those years was the switch from 32-bit to 64-bit systems. This is also the (main) reason why there are no longer updates for the older models. I have a late 2009 iMac. This model was introduced on October 20th, 2009, which is now 6 years and 3 months ago. I have absolutely no doubt that this machine will still run perfectly fine in 3 years and 9 months. It supports up to 32 GB of RAM at 1333 MHz, and it is possible to fit two internal SSDs in this machine. The main things it is missing are the Retina display, Thunderbolt and USB 3. The fastest connection on the machine is FireWire 800.

I do not think we can look solely back in time to get an idea of what to expect in the future. As has already been pointed out by someone else, a lot of lower-performing computers are being sold now, and they will most likely still work (and be useful) a number of years from now. Despite the late 2009 iMac being over 6 years old, it is still 50% faster than the fastest MacBook (not Pro) currently being sold. I think we have now reached a point where power is plentiful (for now) and advances are more incremental than ever for the average consumer. However, I fully expect this to change as AI becomes more and more advanced. The question is when this will happen.
 
Really the best way of answering the OP's question would be with another question - how long is a piece of string? In other words it's impossible to answer with any degree of certainty as there are so many variables.

One of the variables for the OP could be, and I know this sounds morbid, that he could be dead tomorrow; so whilst his iMac may stand the test of time, he will be long past caring.

What he should do is buy the machine that suits his requirements now, with a bit added on. If it keeps running, fine; if it doesn't, then he has other options. My iMac is now 5 years old and I've breathed some new life into it by fitting an SSD and upgrading the RAM. However, it now owes me nothing, so if it keeps going, fine; if not, I will look at new possibilities when that happens.
 
10 years is a stretch, but it is possible. I have a 2006 iMac 2.33 Core 2 Duo with the VRAM upgrade (256MB) and it still runs.
10 years is a stretch, and it's really impossible to state how SSDs will hold up in 10 years; there's just no evidence at this point to provide any direction.

I think the other components should be fine, but then what about the dGPU? Apple's track record with dGPUs hasn't been stellar (maybe that's why they went with AMD). All I'm saying is it's impossible to project that far into the future :)
 