But isn't Linux better and more advanced than OS X and Windows?

Linux was never a more advanced OS for home users. It had its advantages ten years ago: it was very light and could run on almost anything. But those advantages have all slipped away, and it is now far behind both OS X and Windows. I used to run my media center on Linux, but current instabilities and a lack of proper resolution support made that much too difficult to justify. I now use Windows 7 for my media center/'do-everything' unit and a newer iMac running OS X for more specific use - as the desktop for household use and creative work. I have forgotten about Linux. Windows 7 is so much better for older machines that need a light OS.
 
To keep prices lower, the 1TB Fusion Drive option includes a 1TB hard drive paired with a 24GB SSD. Earlier iMac models included a 1TB hard drive with a 128GB SSD. 128GB of storage for the Fusion Drive is now limited to the 2TB Fusion Drive options.

Whenever I read this, I really wonder how in the world a multi-billion-dollar company resorts to a 24GB SSD to "keep prices lower".

Come on, Apple: as if putting a larger SSD in the base model would cost you so much money that you would have to raise the price tag.

This, and the 5400 rpm drive: I can't help but think that Apple is really being cheap and cares only about getting richer by selling average hardware specs.
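For context, the Fusion Drive quoted above is just storage tiering: macOS keeps frequently accessed data on the small SSD and everything else on the hard drive. Here is a minimal, hypothetical sketch of that promote/evict idea (this is not Apple's actual algorithm, and all names are made up for illustration):

```python
# Toy sketch of Fusion-Drive-style tiering (NOT Apple's real algorithm):
# the hottest blocks get promoted to the small SSD tier, everything else
# stays on the large HDD tier.

from collections import Counter

class TieredStore:
    def __init__(self, ssd_blocks):
        self.ssd_capacity = ssd_blocks  # how many blocks fit on the SSD tier
        self.hits = Counter()           # access frequency per block
        self.ssd = set()                # blocks currently on the SSD

    def access(self, block):
        self.hits[block] += 1
        # Keep only the most frequently accessed blocks on the SSD;
        # the coldest block is implicitly evicted when capacity is exceeded.
        self.ssd = {b for b, _ in self.hits.most_common(self.ssd_capacity)}
        return "ssd" if block in self.ssd else "hdd"

store = TieredStore(ssd_blocks=2)
for _ in range(5):
    store.access("os")        # hot data: OS and apps end up pinned to the SSD
    store.access("apps")
store.access("old_movie")     # cold data stays on the HDD
print(store.access("os"))         # -> ssd
print(store.access("old_movie"))  # -> hdd
```

With only 24GB in the SSD tier, of course, far less of your working set fits in the fast tier than with the older 128GB configuration, which is the complaint above.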
 
What is the use case that you would classify the MBP as having a severe problem without a dedicated GPU?
What? I said that not being able to get a dedicated GPU until you spend $2,500 is asinine ... and on top of that, it's not even a high-end mobile GPU. I would say it's a problem that none of their MacBook Pros have any sort of dedicated GPU at their price point. If you don't think that's a problem, I don't know what to tell you.
 
I wouldn't make it seem as though all of the major studios use 5K iMacs; as recently as 6-9 months ago, Sony and Universal in CA both had Avid bays running on Dells, as well as a few old-gen Mac Pros...

Granted, I've been to a few major studios that have had a few iMacs for quick editing, but that doesn't say much. As anyone in the professional media industry knows, you can use even a seven-year-old Dell 7100 to edit 1080p video for TV with no problem at all, and you can pick one up for $150-200.

Obviously, no one is using iMacs for intensive tasks like grading and CG; those 5Ks fall flat on their faces due to their graphics limitations. But they are perfectly fine for editing video, as is just about any modern computer. Editing does not require a fancy setup. If you simply edit 4K video and nothing else, the iMac will be great for you. If you want a computer that can handle everything, go the PC or old Mac Pro route.

I did not suggest that all the major studios are doing all their work on iMacs. Grading and CG are indeed not happening on iMacs.

What I am telling you is that we have a building full of editors pushing out a very popular prime-time show on 5K iMacs: 10-camera multicam, thousands of hours of footage. And we are using Avid, as it happens; the 5K iMac is on the Avid approved-systems list. This is not "quick editing", as you call it. It is as intensive as an offline edit gets.

Post production has used workflows moving from machine to machine for decades. There has never been a scenario where everything is created on one maxed out nerd rig.

That is not how professionals work.
 
Wait for the next iMac with the new dGPU; these are supposed to have at minimum a 50% performance boost over the current one.
 
Wait for the next iMac with the new dGPU; these are supposed to have at minimum a 50% performance boost over the current one.

50% more performance??
Really???
I hope Apple won't double the price of the next iMac upgrade :(
 
I think iGPUs are coming into their own. Time to drop the dGPU for an Iris Pro 580?

:ducks:
 
What? I said that not being able to get a dedicated GPU until you spend $2,500 is asinine ... and on top of that, it's not even a high-end mobile GPU. I would say it's a problem that none of their MacBook Pros have any sort of dedicated GPU at their price point. If you don't think that's a problem, I don't know what to tell you.

"Because I said so"

I understand you are saying it's a problem; I'm asking why. What are you doing on an MBP that is limited by not having dedicated graphics, or mid-range dedicated graphics?

Most of the bottlenecks I run into (short of gaming) are due to the CPU encoding video. However, I don't expect too much there.

If I just wanted a high-end graphics card for no good reason, there are plenty of gaming laptops I would look into.
 
"Because I said so"

I understand you are saying it's a problem; I'm asking why. What are you doing on an MBP that is limited by not having dedicated graphics, or mid-range dedicated graphics?
The same thing everyone else would be doing: using applications that are GPU-intensive, such as Final Cut Pro and a host of others.

Most of the bottlenecks I run into (short of gaming) are due to the CPU encoding video. However, I don't expect too much there.

If I just wanted a high-end graphics card for no good reason, there are plenty of gaming laptops I would look into.
What a worthless response. You are actually defending the lack of a dedicated GPU in machines that start at $1,300+. I don't care if you have no use for one at all; they belong in all versions of the MacBook Pro. This isn't about your ridiculous non-use case. It's the fact that every other laptop on the planet has a dedicated GPU in the price range of even the barebones MacBook Pro. MacBook Pros used to come with dedicated graphics in the much cheaper models, with better options available as the price went up. That is no longer the case. I honestly have no idea what you're defending, but it's ridiculous. Apple is perfectly capable of including dedicated GPUs in its laptops; it is not a size constraint. Even the base 15" model, which is 2 grand, still doesn't have one ... you only get dedicated graphics on the highest-end configuration, and it's not even a great mobile GPU. This is a problem, whether you have a use for dedicated graphics or not.
 
The same thing everyone else would be doing: using applications that are GPU-intensive, such as Final Cut Pro and a host of others.


What a worthless response. You are actually defending the lack of a dedicated GPU in machines that start at $1,300+. I don't care if you have no use for one at all; they belong in all versions of the MacBook Pro. This isn't about your ridiculous non-use case. It's the fact that every other laptop on the planet has a dedicated GPU in the price range of even the barebones MacBook Pro. MacBook Pros used to come with dedicated graphics in the much cheaper models, with better options available as the price went up. That is no longer the case. I honestly have no idea what you're defending, but it's ridiculous. Apple is perfectly capable of including dedicated GPUs in its laptops; it is not a size constraint. Even the base 15" model, which is 2 grand, still doesn't have one ... you only get dedicated graphics on the highest-end configuration, and it's not even a great mobile GPU. This is a problem, whether you have a use for dedicated graphics or not.

You are confused; I'm not defending anyone or anything. Nor is the worth of my responses to you of any relevance to me. I was merely asking you a question. Thanks for the mature reply.

I don't suppose you would actually care to answer the question? Or is "FCP and a host of other applications" the best I get? I use FCPX, so you can be as detailed as you'd like...
 
I'm sorry, but you weren't just asking a question. You were being condescending while making random assumptions. Comments like "for no good reason" and "because I said so" aren't conducive to a productive discussion, nor is playing innocent when that's hardly what you were being. Do you honestly think I'm using myself as an authority to justify the stupidity that is the lack of a dedicated GPU in a $2,000 computer? Does one need anything more than common sense to conclude that this is asinine? Maybe we both could have approached the conversation differently, but don't act like I'm an idiot.

FCPX and pretty much every other video editing application performs better and faster with a dedicated GPU; that is a fact that doesn't need debating. The Adobe software I use on a daily basis also performs better with a dedicated GPU. Most things in general perform better with a dedicated GPU to take load off the CPU. I don't know why this has turned into a discussion about which apps benefit from one. The entire point was the lack of one in a $2,000 machine perfectly capable of housing one. It is unacceptable. You have to spend nearly $3,000 after taxes to get a laptop from Apple with a mid-range mobile GPU. Please tell me how that is okay.
 
Most of the bottlenecks I run into (short of gaming) are due to the CPU encoding video... If I just wanted a high-end graphics card for no good reason, there are plenty of gaming laptops I would look into...

What a worthless response....I don't care if you don't even have a use for them at all....This isn't about your ridiculous non-use case...I honestly have no idea what you're defending, but it's ridiculous...FCPX and pretty much every other video editing software performs better and faster with a dedicated GPU..

What cynics said is absolutely correct. I'm a professional documentary film editor with years of experience. Most bottlenecks in video editing and encoding are CPU-bound, not GPU-bound. Anybody can see this for themselves by monitoring these tasks with iStat Menus or Activity Monitor.

The GPU does help with certain effects, but the most common types of video encoding (H.264, MP4, AVCHD) cannot be meaningfully accelerated by a GPU, since the core algorithm is inherently sequential.

An obvious example: a MacBook can sometimes encode H.264 video faster than a 12-core Mac Pro with dual D700 GPUs. This is because the MacBook's CPU has Intel's Quick Sync, which the Xeon CPUs in the Mac Pro lack; the powerful dual D700 GPUs are of no benefit in that common editing task. This is using FCPX, since Premiere CC does not use Quick Sync.
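The serial dependency mentioned above can be sketched with a toy delta-encoding model (real H.264 is vastly more complex; this only illustrates the principle): each predicted frame is stored as a difference against the previous reconstructed frame, so neither encoding nor decoding of frame N can begin before frame N-1 is finished, which limits how much a massively parallel GPU can help.

```python
# Toy model of inter-frame (delta) compression. Illustrative only:
# each "P" frame depends on the previous frame, creating a serial chain
# that cannot be split across frames for parallel processing.

def encode(frames):
    """Return an I-frame followed by per-frame deltas."""
    stream = [("I", frames[0])]          # key frame: stored in full
    prev = frames[0]
    for frame in frames[1:]:
        delta = [cur - ref for cur, ref in zip(frame, prev)]
        stream.append(("P", delta))      # depends on prev -> inherently serial
        prev = frame
    return stream

def decode(stream):
    frames = [stream[0][1]]
    for _, delta in stream[1:]:
        prev = frames[-1]                # same dependency chain on decode
        frames.append([ref + d for ref, d in zip(prev, delta)])
    return frames

video = [[10, 10, 10], [10, 11, 10], [10, 12, 11]]  # three tiny "frames"
assert decode(encode(video)) == video                # round-trips losslessly
```

Hardware blocks like Quick Sync speed this up not by parallelizing the chain but by running it on fixed-function silicon, which is why a laptop CPU with such a block can beat a workstation GPU at this one task.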
 
What cynics said is absolutely correct. I'm a professional documentary film editor with years of experience. Most bottlenecks in video editing and encoding are CPU-bound, not GPU-bound. Anybody can see this for themselves by monitoring these tasks with iStat Menus or Activity Monitor.

The GPU does help with certain effects, but the most common types of video encoding (H.264, MP4, AVCHD) cannot be meaningfully accelerated by a GPU, since the core algorithm is inherently sequential.

An obvious example: a MacBook can sometimes encode H.264 video faster than a 12-core Mac Pro with dual D700 GPUs. This is because the MacBook's CPU has Intel's Quick Sync, which the Xeon CPUs in the Mac Pro lack; the powerful dual D700 GPUs are of no benefit in that common editing task. This is using FCPX, since Premiere CC does not use Quick Sync.
The topic was the lack of a dedicated GPU in a $2,000 laptop that is fully capable of housing one. And the high-end $2,500 laptop Apple offers has only a mid-range GPU, with no option to go any higher. You guys have strayed so far off topic I don't even see the point in discussing this anymore. Do you think a $2,000 laptop without a dedicated GPU is okay, regardless of your own need for one? Good grief, people.
 
I don't know if the comparison is fair or if it makes sense, but it's like a 2016 Ferrari that still comes with only a CD player, has no USB connection for iTunes music, or lags significantly in horsepower compared to other sports cars of our age.
 
The late 2015 iMac is probably the most compelling all-in-one computer with a 4K/5K screen, mostly because it's the only all-in-one solution with a 5K resolution. There is now also the Asus Zen with a 4K screen, which is much more affordable; maybe check that one out.

The CPU is powerful, the RAM is upgradable, and with a 500GB SSD plus whatever external (RAID) storage you are using, it is a great choice for any visual artist who does not require a hardcore rendering workstation. You get a fantastic 10-bit screen in a beautiful form factor.

If you go for the 395X GPU you can play lots of games at 1440p without any issues, and since there are plenty of AAA games that aren't just graphics demos with guns, you can play those at 4K/5K if you wish. You won't be playing the newest, best-looking shooter at 5K and 60fps; if you have done your research, you already know that even high-end dedicated graphics cards the size of a shoebox struggle to deliver that kind of performance.

Is the GPU underwhelming? Sure, for heavy gaming and heavy rendering. For 4K video editing it's fine, and photography mostly taxes the CPU, for which this machine is fantastic. Working on 60MP raw photos is a joy; 200MP is fine. If you stack dozens of layers on your images you can expect some lag, which is to be expected.

If you are a working professional you have probably done the math already, but even if this computer, compared to a Windows machine, saves you only a few hours of discomfort and annoyance, hours you can use to create something nice and then bill for your work, you will have spent more money but also already made it back.

If you expect a MacBook Pro to do the same work as a fully fledged workstation, or complain that a build with a "bad" GPU like the iMac 5K cannot play the newest and greatest AAA games at 5K, then you should spend more time being actually productive so you can afford a Windows machine for your gaming hobby/profession.
 
Don't know what these guys are griping about. I run The Witcher 3 at 1440p on ultra settings at around 40fps, and at 60fps at 1080p. I don't play hunched up next to the monitor like Smeagol; I'm usually sitting straight or leaned back in my chair, and honestly the difference between 1080p and 1440p is negligible when there's motion on the screen (many double-blind studies have confirmed this). Why do you think Best Buy almost always shows off its 4K screens with a static image of an ocean or something? Because when there's motion, anything beyond 720p gives you a very diminished return. Talk to any professional videographer: the main reason for filming in 4K is to give you room later on to crop and adjust. Beyond 1080p it makes very little real-world difference.

Bottom line: the 5K screen's main advantage is for static images, such as Photoshop work and text. It makes everything crisper, and it's a great option to have. Gaming is more than fine at a normal resolution like 1080p or 1440p. Whoever is talking about needing to go down to 720p clearly has a low-end model or something, because I can barely get The Witcher 3 to drop below 60fps at 1080p.
 
I assure you there are many people doing color grading on top-spec iMacs using things like Color Finale or DaVinci Resolve. You can check on Reduser.net and see this. DaVinci's own Resolve 12 configuration guide says a top-spec iMac 27 is OK for this: http://documents.blackmagicdesign.com/DaVinciResolve/DaVinci_Resolve_12_Beta_Configuration_Guide.pdf

You can see that professional editors and colorists have a balanced view. They do not say "no one is using iMacs for grading". Rather they say an upper-end iMac can work for this but a well-equipped Mac Pro is more optimal: http://noamkroll.com/is-apples-new-...-why-i-might-consider-it-over-the-new-macpro/

I color grade 4K documentary material on a top-spec 2015 iMac 27 using Color Finale, and it works fine.

There definitely is a difference between the performance of different editing and color grading applications. If you are only experienced with a limited range of products, this can skew your perspective into thinking it's a hardware performance issue when, in fact, the software is simply inefficient; render performance can vary widely between applications running on the same hardware.


This is a non-argument. The iMac 5K was sRGB up until the latest batch with P3 panels, and no matter how much you can grade and mess around with color, sRGB is simply not up to par; it falls flat on its face. The newer ones are a bit better, but they still don't match high-end systems (even at the same price). I don't know a single person who would use one for accurate color grading. Small houses and "close enough" grades, sure, but no professional on a high-end production would use the iMac or advise others to. This is not a software-related argument, as there are numerous variables; these are hardware discussions that software has no impact on.

And again, once you get into intensive graphics in AE, it literally comes to a complete stop with its M390 GPU (benchmarking at a tenth of what a decent card can handle). It's just a fact: you can't use it for graphics-intensive work or truly accurate color grading; it falls short in every way. For "close enough" grades and plain editing, sure. Even for basic 1080p AE comps it'll do fine, but nothing truly intensive.
 
Wow! This thread went south fast. Very interesting however. Some good points on both sides I would say, but there definitely seems to be some Apple hate here.
 
I've been using computers since the Apple IIc, the Commodore 64 (a great little computer at the time), and the Atari 400 (a friend's computer we used to play games and visit BBSes (bulletin board services) on, before the Internet). I am currently using the iMac 5K for web development (PHP, Ajax, JSON, and jQuery), but I have Windows 10 to play a few games on. Granted, there are computers out there with better CPUs and GPUs, but I have no problem playing games on Windows 10 at high resolution, and I don't even have top-of-the-line specs. I have run Photoshop with no problems and have done a little video editing. If I were strictly doing video editing I would have gone with a Mac Pro, but this suits my needs.

People who talk about color matching (that's what I call it) are correct, but most high-end users aren't going to be using iMacs or general-purpose computers for those purposes. I was a color matcher for an automotive paint manufacturer for almost 20 years, where we used a $20,000 X-Rite computer to "help" us match paint to the customer's OEM standard(s). I still remember my boss talking to a visiting customer, explaining how we match color. What I remember most is the customer arguing that a certain batch didn't match the OEM standard because of a computer printout. My boss took him over to a color light booth with the sample paint plaque and the OEM standard plaque and showed him how it looked in daylight, horizon light (think sunset or sunrise), and cool white (ultraviolet light, basically), on both the face and the flop: it matched. My point is that people put too much trust in what the computer tells them instead of their eyes. Everybody sees color a little differently; it's basically how people distinguish the different shades that determines the color (unless you're color blind, but that's a different story).

I passed the color test with a perfect score, which most people don't (I don't even think my boss did). Like I said, everyone sees color a little differently, and I find it funny how much emphasis some people put on computer standards when it comes to calibration. Don't get me wrong, X-Rites are fantastic calibration devices, and it would be tricky for most people to calibrate by eye. I don't think I'd want to; it would probably take me a whole month, and by then it would need recalibrating. :D The point, after all my rambling, is that someone from Pixar isn't going to be using an iMac for their work, but your everyday web developer (like me) will. ;)
 
And again, once you get into intensive graphics in AE, it literally comes to a complete stop with its M390 GPU (benchmarking at a tenth of what a decent card can handle).

A decent card being a card that's licensed to use CUDA, perhaps?
 