As much as I appreciate being called a "newbie" because someone disagrees with me, I've been intimately involved in PC hardware as a profession and hobby since my first 386. I'd venture to say I know more about PC hardware than most and these days that's all a Mac is - PC hardware with a different OS (or not even that if you run Windows like I do).

You can be unhappy with Apple's hardware choices all you want, but trying to say hardware is or is not this or that when the facts don't agree with you is simply absurd. I realize the quad-core i7s have more cache, but you also have two more cores sharing that cache. I also know that the dual-core i7 has 4MB vs the i5's 3MB. Cache helps, to be sure, but it's not enough to make the 2012 faster than the 2014 on an IPC and single-threaded basis.

You don't have to take my word for it - go to PCPer, Tom's Hardware, AnandTech, Tech Report, Notebookcheck, bit-tech, or whatever your favorite PC hardware review site is and look at the comparisons yourself. There have been a metric f-ton of reviews comparing the IPC of Sandy Bridge, Ivy Bridge, and Haswell. You can see for yourself there's an 8-10% difference in sheer IPC alone per generation. So, clock for clock in a non-thread-limited workload, Haswell is faster than Ivy Bridge. It cannot be denied.
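If you want to sanity-check that math yourself, here's a quick back-of-the-envelope sketch (assuming the 8-10% per-generation figure above; actual uplift varies by workload, so treat the numbers as illustrative only):

Code:
# Rough compounding of per-generation IPC gains, using the ~8-10%
# per-generation figure cited above (real uplift varies by workload).
for gain in (0.08, 0.10):
    sandy_to_ivy = 1 + gain      # Sandy Bridge -> Ivy Bridge
    ivy_to_haswell = 1 + gain    # Ivy Bridge -> Haswell
    total = sandy_to_ivy * ivy_to_haswell
    print(f"{gain:.0%}/gen: Haswell ~{total - 1:.0%} faster than "
          f"Sandy Bridge, clock for clock")

# Output:
# 8%/gen: Haswell ~17% faster than Sandy Bridge, clock for clock
# 10%/gen: Haswell ~21% faster than Sandy Bridge, clock for clock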

I don't get you guys. You can make a perfectly fine case for the 2012 quad, which is a great machine on its own and especially in multithreaded apps. You don't need to go denying reality or making crap up to justify your feelings. Just come out and say "hey, I need/want a quad core and it sucks that I can't have one in the new model." I certainly wouldn't hold it against you - that's a totally fair and accurate statement.

What's crap is when you start making these weird extra claims that suggest you're either paranoid about Apple trying to screw you, or have such a tenuous grasp of hardware performance that you start claiming the new one is clock-for-clock slower, PCIe SSDs are somehow worse, and the iGPU isn't much better.

Stick with the facts - there are use cases for each model where it makes sense. Neither is "all bad for everyone."

The "Newbie" comment was less an observation based on your deliberate focus on dual-core vs dual-core while side stepping the lack of a quad option in favour of dubiously beneficial single-thread performance and more to do with the fact you literally are a newbie. See attached.

As for your continual failure to understand: an external GPU that will never run anywhere near its full speed and offers no external "solution", plus faster I/O and storage (as a paid-for extra option), changes nothing when even the ludicrously over-priced 3GHz model is still spanked on multi-threaded CPU performance and still offers poor value.

Your only argument in favour of the 2014 model is the mid-range one, because at least there it's £70 more than the previously £499 dual-core i5 with 4GB and a 500GB drive to get a faster dual-core with 8GB and a 1TB drive. Anything higher up the range is an utter waste of money.
 

Attachments: image.jpg (47.3 KB)
I'm not really sure what your point is, but I have an upgraded 2012 i7 Mac Mini, this PC, and a KVM. My setup is versatile and convenient. The KVM even takes care of my audio switching. I'm not sacrificing much of anything and since both sleep nicely, I never have to reboot a machine to use it.

The 2014 Mac Mini is an easy refresh to avoid and plenty of us are skipping it.

Good for you, smart setup that suits your needs.

Others may wanna have a powerful GPU in OS X too.
The 2014 has TB2 to make that happen and is an interesting upgrade for dual-core 2009-2011 users. It's as powerful as the 13" rMBP, which (to make things worse) has to drive a high-res display, but I don't see that machine bashed nearly as much.

Quad core was not possible on BGA1168. Using two different motherboards for duals and quads was out of the question.

This Mini's only fault is coming out in fall 2014 instead of fall 2013 or spring 2014 (in time for the Haswell Refresh CPUs).
 
The "Newbie" comment was less an observation based on your deliberate focus on dual-core vs dual-core while side stepping the lack of a quad option in favour of dubiously beneficial single-thread performance and more to do with the fact you literally are a newbie. See attached.

As for your continual failure to understand why an external GPU that will never run anywhere near it's full speed and offers no external "solution", faster I/O and storage (as an paid for extra option) changes nothing when even the ludicrously over-priced 3Ghz model is still spanked on multi-threaded CPU performance and still offers poor value.

Your only argument in favour of the 2014 model is the mid-range one because at least it's £70 more than a previously £499 dual-core i5 with 4Gb and a 500Gb to get a faster dual-core with 8Gb and a 1Tb drive. Anything higher up the range is an utter waste of money.

Oh please. You obviously meant it in a derogatory way and then called me a fanboy. Not only that, you're projecting what you THINK I'm saying instead of reading what I'm actually saying and understanding it.

Let me be super clear here - obviously a quad core is preferable to a dual core. If the 2014 had a quad-core option (and I wish it did), I would have gotten it. Obviously non-soldered RAM is preferable. But neither of those two things happened. No one is going to change that; it is what it is.

That being said, being mad that the reality is no quad core and no upgradeable RAM does NOT make the new Mac Mini a piece of garbage. That is my solitary point. The unfortunate position of many on this forum is that just because there isn't a quad-core option, the new Mac Mini is obviously a vagina troll from the womb of Hades herself. That's just not true. It is a perfectly capable and fine machine provided it fits your use case.

Those who NEED more cores either already have a 2012 or can buy one. If that's not an option, they will HAVE to do something else.

I'll further explain via an analogy. Let's say you're a Mustang fan and all of a sudden, one model year, they no longer make a GT. The non-GT models don't suddenly suck more as a result. They're still fine for the same things they were always fine for. You CAN be mad there's no more GT, but you don't have to dump on the non-GT as well to overdramatize and overemphasize your disappointment.

Is that clear enough, or do you need more?

At no point did I say the dual core was better than or equal to a quad core. Simply that the entire line is not rendered totally useless as a result.
 

I understand you perfectly but I also know when I'm responding to someone who believes they're literally incapable of ever being wrong about anything because their ego simply won't allow it. Hence your lengthy, stubborn and defensive word-vomit responses.
 

Typical of an internet forum post. Nothing I've said is at all factually incorrect, but because it doesn't fit with your reality you resort to insults. Feel free to direct your nerd rage elsewhere, I won't waste any more time on it.
 
barkmonster, could you please go explain to all these people here:

http://bit.ly/10sVFdI

how their eGPUs are choking on bandwidth?

Of course they will not run at 100%; that doesn't mean it's pointless.

Up until a while ago (before PCIe 3.0), PCIe 2.0 x16 was the norm.

TB2 currently allows for a PCIe 2.0 x4 eGPU, just 1/4 of what was the norm not so long ago.

TB3 (fall 2015) will be backed by PCIe 3.0 x4 (roughly equal to PCIe 2.0 x8), or just 1/2 of what was the norm not so long ago.

Looking at the graphs here (and we're talking about a GTX Titan), PCIe 2.0 x8 doesn't exactly look like "choking":

http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/

at least for gaming.
Maybe it takes a bigger hit in other applications, but it's still not pointless.
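For anyone who wants the raw numbers behind those 1/4 and 1/2 figures, here's a small sketch using the usual nominal per-lane rates (PCIe 2.0 is about 500 MB/s per lane after 8b/10b encoding, PCIe 3.0 about 985 MB/s per lane; real-world throughput is lower still after protocol overhead):

Code:
# Nominal PCIe link bandwidth vs. the old norm of PCIe 2.0 x16.
# Per-lane rates after encoding overhead (illustrative figures):
#   PCIe 2.0: ~500 MB/s/lane (8b/10b), PCIe 3.0: ~985 MB/s/lane (128b/130b)
PCIE2_LANE = 500
PCIE3_LANE = 985

norm = 16 * PCIE2_LANE  # PCIe 2.0 x16 = 8000 MB/s
links = {
    "TB2 eGPU (PCIe 2.0 x4 tunnel)": 4 * PCIE2_LANE,  # 2000 MB/s
    "PCIe 2.0 x8":                   8 * PCIE2_LANE,  # 4000 MB/s
    "TB3 eGPU (PCIe 3.0 x4 tunnel)": 4 * PCIE3_LANE,  # ~3940 MB/s
}
for name, mbs in links.items():
    print(f"{name}: {mbs} MB/s = {mbs / norm:.0%} of PCIe 2.0 x16")

# TB2 eGPU (PCIe 2.0 x4 tunnel): 2000 MB/s = 25% of PCIe 2.0 x16
# PCIe 2.0 x8: 4000 MB/s = 50% of PCIe 2.0 x16
# TB3 eGPU (PCIe 3.0 x4 tunnel): 3940 MB/s = 49% of PCIe 2.0 x16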
 

Nobody in their right mind would go to the expense of a Thunderbolt enclosure to run a GPU from a desktop based on a dual-core mobile i5. It's a false economy and, even by your own figures, a waste of money for anyone wanting to run a GPU to its full potential. Claiming something that's operating at 1/4 of its designated bandwidth isn't "that bad" because a few years ago it would have been fine defeats the argument. It's designed to perform at its peak with a set amount of bandwidth that it's simply not being provided with. I used to have a SATA card in a G4. The theoretical bandwidth should have been much higher than the 60MB/s at most it provided, because it was choking on the PCI bus.
 
In no particular order

1) Cooler, quieter, less power consumption
2) Wi-Fi AC
3) PCIe x4 1TB flash option with Apple firmware optimization and Apple TRIM
4) new UEFI+GPT Boot Camp on PCIe-SSD Macs (useful for some stuff like plug-and-play external GPUs in Windows)
5) Two Thunderbolt ports for a grand total of 12 TB devices (think of the possibilities - one port could be completely dedicated to an eGPU)
6) Thunderbolt 2 instead of Thunderbolt 1 (20Gbps vs 10Gbps), again useful for eGPU

(subject to changes and additions)

Now many of you will be hearing about "eGPUs" for the first time, but basically it's now extremely easy to hook up a badass Nvidia Maxwell GPU (like a GTX 970) to your Yosemite Mini (using products like the ViDock 4, Akitio Thunder2, Sonnet III-D and the like); there's a lot to read about it on the Tech Inferno forum.

http://bit.ly/1FMdAvD

http://www.journaldulapin.com/?p=17538


In that regard, the 2014 Mini, having TB2 and two ports, is better equipped.

Think about what kind of "modular" Mini the 2014 can end up being.
A 4×1TB SSD RAID 0 for booting and 5×6TB for storage on one TB2 port.
A GTX 970 on the other port.

Down the line, 3-4 years from now, you only change "the brain" (the Mini) but keep all the Thunderbolt equipment. The miracles of having an "external PCIe" interface.

Apple won't give us a "MacX" or "MacCube", but with two TB2 ports you can basically build one on the outside.
With one TB1 port, not so much.

But go ahead hoarding the two-year-old model. (Of course, people actually benefiting from a quad core in their workload are right, but everybody else... it's a "2012 Mini hysteria"....)

This is a fetish-type proposition. IMO you are working backward. Chances are, if you are that serious about video cards and gaming, you are probably running Windows, so why not buy a PC tower with all the PCIe and hard-drive slots you would need? Why on earth would you want spaghetti all over the place and have to pay through the nose just to bring you to a point where you can justify the Mini? The new Mini is a devolved piece of gear and there isn't anything anyone can say that would change my mind about this. Had there not been a quad core before, there wouldn't be so many disappointed Mini users.
 
It's designed to perform at its peak with a set amount of bandwidth

I don't care what you think they're designed to do.

Whoever lived through the AGP 4x era will remember that even back then people debated the usefulness of AGP 8x.

90% of the commonly available GPUs, in 90% of home-user cases, were and are far from saturating the fastest available interface.

As I showed you (numbers, not "it's designed to" philosophy), the current "good enough" interface for home users (i.e. no rocket-science calculations offloaded to the GPU) is PCIe 2.0 x8 (or PCIe 3.0 x4). Those graphs speak for themselves.

TB3, by fall 2015, will allow the equivalent of external PCIe 2.0 x8.

As for the whole system being limited by the CPU, you probably missed how Apple itself led the way in skewing systems towards GPU-centric designs.

Give a man a 3.0GHz dual-core i7 and a GTX 970, and wait until he complains about feeling CPU-limited. You'll wait a long time. Here's a recent SFF gaming box from Gigabyte:

http://www.gigabyte.com/products/product-page.aspx?pid=5096#ov

A 2.8GHz dual-core i5 + a GTX 760... madness! :rolleyes:
 
This is a fetish-type proposition. IMO you are working backward. Chances are, if you are that serious about video cards and gaming, you are probably running Windows,

Not necessarily.
Cue 3D games (e.g. BioShock Infinite) on the Mac App Store.
Cue Steam for OS X.
Cue Blizzard games for OS X.
Cue GPU acceleration in video players like Movist, QuickTime, XBMC and Plex using the system-wide VDADecoder API.
Cue CUDA/OpenCL-powered applications.

so why not buy a PC tower with all the PCIe and hard-drive slots you would need?

You can't install both OS X (officially, with no update hassles) and Windows on those.

Why on earth would you want spaghetti all over the place and

You're the second poster mentioning this.
Proper cable management is a disappearing art, apparently.
 
Apple could just make a non pro gaming machine and solve all this. They don't because Apple hates you.
 
Give a man a 3.0GHz dual-core i7 and a GTX 970, and wait until he complains about feeling CPU-limited. You'll wait a long time.

Advances in GPUs blindside people like you into ignoring the mediocre CPU power.

You think large numbers of people buying Macs care more about gaming than overall performance? Seriously? That's hilarious :)

If Apple cared about gaming they wouldn't have integrated GPUs on most of their range.

Your "man" is the typical dope who puts pictures of his pets and dinner on Instagram, browses the web, does nothing creative and wouldn't know why he needed a non-integrated GPU in your case or a quad CPU in countless more even if it was explained to him. :)
 
1) Cooler, quieter, less power consumption

Not relevant: basically both use 22nm chips, which is the relevant factor in energy use. As the Haswell is underclocked by default, idle power is a little lower. The most important reason it's cooler: it lacks four threads! The old Minis idle at 10 watts and cannot be heard. This one is maybe 8 watts and not heard. Well, that's a big step....

2) Wi-Fi AC

Not relevant. Chicks want radiation-free homes. There's the same old Gigabit Ethernet for that.

3) PCIe x4 1TB flash option with Apple firmware optimization and Apple TRIM

You'd rather spend the price of a new Mac Pro on a PCIe SSD than have the ability to use two Sammy 450 Pros in RAID 0 for 400 bucks... Yeah right!

4) new UEFI+GPT Boot Camp on PCIe-SSD Macs (useful for some stuff like plug-and-play external GPUs in Windows)

You buy a Mac or a PC. Any old PCIe 2.0 Core 2 Duo PC with an Nvidia Titan will blow away any eGPU solution.

5) Two Thunderbolt ports for a grand total of 12 TB devices (think of the possibilities - one port could be completely dedicated to an eGPU)

Do you drive your BMW Mini with six trailers?

6) Thunderbolt 2 instead of Thunderbolt 1 (20Gbps vs 10Gbps), again useful for eGPU

Again, why spend a boatload on a GPU that is hardly utilized in OS X?
 
I'm not really sure what your point is, but I have an upgraded 2012 i7 Mac Mini, this PC, and a KVM. My setup is versatile and convenient. The KVM even takes care of my audio switching. I'm not sacrificing much of anything and since both sleep nicely, I never have to reboot a machine to use it.

The 2014 Mac Mini is an easy refresh to avoid and plenty of us are skipping it.

Not trying to sabotage the thread, but what kind of KVM switch and display do you use? I'm currently wiring my house for Ethernet and will be running a 2012 Mac mini as a server, but I've been reading a lot of contradictory things about using KVM switches and consoles with Macs, so I was thinking about running headless.
 

I don't recommend using KVMs for the V part. So, make sure your monitor has at least two HDMI/DVI/DP inputs. Here's the monitor I'm using:

http://www.newegg.com/Product/Product.aspx?Item=N82E16824014375

The KVM I use is no longer available, but here it is:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817107054

So here's what I would buy if I needed a replacement:

http://www.amazon.com/TRENDnet-4-Po...&srs=2530657011&ie=UTF8&qid=1413044443&sr=1-7
 
But that "brain" is a mobile CPU. No comparison whatsoever to a nMP's Xeons. Sure a mobile CPU can turbo for awhile but will eventually throttle down. Those Mac Pro Xeons can crunch 24 hrs a day for years in a row. Modularizing a Mini is like putting a Porsche body on a VW.

My 2012 Mini turbos indefinitely. I just keep it in a cool, well-ventilated environment and keep the fan cranked up. It's never throttled down to its base clock.
 

Amen to that. How can I check to see if mine is throttled down or not? I keep the fan at 2600 RPM all the time with smcFanControl. Also cool and well ventilated.

Except when my cat has draped herself over it for warmth.
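For what it's worth, one way to check on OS X is the stock pmset utility: "pmset -g therm" reports a CPU_Speed_Limit value, where 100 means no thermal throttling. The exact fields vary by model and OS version, so treat this as a sketch:

Code:
# Check for thermal throttling on OS X via the stock pmset utility.
# "pmset -g therm" reports CPU_Speed_Limit as a percentage of maximum;
# 100 means no throttling. Fields vary by model/OS, so this is a sketch.
import re
import subprocess

out = subprocess.run(["pmset", "-g", "therm"],
                     capture_output=True, text=True).stdout
m = re.search(r"CPU_Speed_Limit\s*=\s*(\d+)", out)
if m:
    limit = int(m.group(1))
    note = "" if limit == 100 else "  <- thermally throttled"
    print(f"CPU speed limit: {limit}%{note}")
else:
    print("CPU_Speed_Limit not reported on this machine:")
    print(out)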
 
My 2012 Mini turbos indefinitely. I just keep it in a cool, well-ventilated environment and keep the fan cranked up. It's never throttled down to its base clock.

How do you know this? It's unlikely to be turbo-clocked indefinitely; otherwise, whatever clock speed you claim it indefinitely runs at would simply be its base clock.
 
I was under the impression that the only noisy i7 Mac Mini was the 2011 server model, because the CPU generation it used was far less power-efficient than the Ivy Bridge CPUs in the 2012 models.

According to Apple's own supplied noise-level data, this year's and last year's models have the same noise levels at idle and in operation, once you subtract the quad-core i7 model.

There may be external tests that are a bit more detailed, but according to what Apple is providing, the implication is that the i5 versions and the dual-core i7s are at the same noise level this year and last.
 