I really haven't followed this thread in its entirety, so please forgive me if this is a retread, but who is to say that they will hold it back? There are legitimate performance gains that can be given to the iMac and Mac Pro but not to the MM, and those don't require artificial restraints.
Apple isn't worried about the MM vs the Mac Pro. Mac Pro buyers are in an entirely different league. They are concerned with the iMac and MBP. Obviously not everyone who looks to purchase a MBP would opt for a MM instead, but offer a compelling graphics option in a MM and there would be a non-trivial % who would reconsider and go with some combo of MM and iPad/Air/older MBP instead of a new MBP.

Apple has always held the mini back from eating into iMac and MBP sales. They will continue to do so. So look for the bare minimum integrated GPU upgrade in the next MM update.
 
The best graphics that could possibly be used are apparently in the i7-4900HQ and i7-4950HQ. If they are not used, then that will be your "gimp."
 
I thought the 5100 was going into Ultrabooks. Something else would go into the MacBook Pro and mini.

Who knows. Intel does not actually link the numbers to the chips. I went by their TDPs, which stated the 5100 was paired with the 28W parts, and since the current MBPs are dealing with 35W, I just ASSumed...
 
Haswell will be a nice upgrade in terms of the GPU. However, as a few here have said, I can see Apple making the Mini even smaller, with soldered RAM and mSATA storage.

If that happens, it'll be the last new Mini I buy. Luckily, there are always plenty of second-hand Minis up for auction on sites like eBay.

Though most of my serious work is still done in OS X, frankly, I've already switched stuff over to Windows 7 & 8 in anticipation of more cosmetic nonsense like this from Apple.

Smaller, thinner laptops? Excellent! A genuinely beneficial evolution for most users requiring portable computing. We all appreciate that. Increasingly smaller, thinner desktops at the risk of compromising power, graphics, etc. with soldered RAM? :rolleyes: Just embarrassing.
 
If they go with a dual core processor without discrete graphics on the $799 model mini, I will be disappointed. I have faith in Apple though that they will not be that foolish.
 
I don't know what the "Haswell Mini" will look like, but I'm almost sure that the Late 2012 version will be the last with onboard FW800.

And FW800 remains a very reliable technology in some domains, despite how much people like to mock it: many audio interfaces still use it, and you can edit or grade a few 2K video streams with a storage system based on it.

A TB-to-FW800 bridge works well for data, but audio hardware is very sensitive to latency and glitches on a chain, and it could be complicated to mix TB storage/chassis, a "normal" DisplayPort monitor and such hardware.
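As a rough sanity check on the bandwidth side (the figures below are my own assumptions for illustration, not measured numbers from anyone's setup): FW800's nominal 800 Mbit/s works out to roughly 80-90 MB/s of usable throughput, which is indeed enough for a few compressed 2K streams.

Code:
# Back-of-the-envelope check: how many 2K streams fit over FW800?
# Assumed figures (illustrative only): ~85 MB/s usable FW800 throughput,
# ~20 MB/s per compressed 2K stream (varies a lot with codec and bit depth).

FW800_USABLE_MB_S = 85.0   # nominal 800 Mbit/s minus protocol overhead
STREAM_MB_S = 20.0         # assumed data rate of one compressed 2K stream

streams = int(FW800_USABLE_MB_S // STREAM_MB_S)
headroom = FW800_USABLE_MB_S - streams * STREAM_MB_S
print(f"~{streams} simultaneous streams, ~{headroom:.0f} MB/s headroom")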

It's funny how Apple launched the Mini as an entry-level Mac, but now it's (in its own unique segment) one of the best workstations on the market.
 
It's funny how Apple launched the Mini as an entry-level Mac, but now it's (in its own unique segment) one of the best workstations on the market.

It is the future of workstations. If I put my Mini in a small protective bag after work and take it home, I can continue over the weekend. Thanks to the SSD/HD dual-drive setup it is ridiculously fast, yet super portable. It is even easier to carry around than a MacBook Air, as there is no power brick. I think we had the biggest and hottest chips back in 2007 or so, and from that year on, we will shrink back to 1980 computer power levels.
http://www.sociamedia.nl/ict/energy/
 
You are arguing outside the norm for the mini. The vast majority of minis had no discrete GPU.

Original 2005 yes. ( 32MB VRAM G4 based so not all that surprising. )
2006 no ( Intel GMA 950 )
late 2006 no ( Intel GMA 950 )
2007 no ( Intel GMA 950 )
2009 no ( Nvidia 9400M )
late 2009 no ( Nvidia 9400M )
2010 no ( Nvidia 320M )
2011 partially ( either HD3000 or discrete 6630M 256MB VRAM )
2012 no ( HD4000 )

Guess you mean no discrete Video RAM.
Every Mini had a discrete GPU; they could just as well have had the CPU take care of the image and feed a shift register for digital HDMI out ;). The only choice is whether to bake the CPU and GPU onto one piece of silicon (being together is only a benefit, as the bandwidth is much better, not a drawback!)
GMA 950 was a dedicated GPU, no matter how bad it was!
All were SEPARATE chips up to the 320M/6630M, and only the 6630M and the Radeon 9200 (mine had 64MB btw) had dedicated VRAM. Since the HD3000, the GPU has been on the same silicon as the CPU.

Sharing RAM is also becoming less of a problem, as main RAM has sped up nicely over the last few years. 1600MHz DDR3 clocks higher than the 1100MHz GDDR3 on a 2008 9800GTX, yet the GTX is twice as fast as the HD4000. So shared memory should not be a real problem, and combining the GPU and CPU on one chip is the future (look at all the iPhone and iPad chips; they lead the way). Combining the two on one chip does not make the GPU less dedicated.
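To put rough numbers on the shared-memory point (the bus widths and transfer rates below are assumptions for illustration, not figures anyone posted in this thread): peak bandwidth depends on both the clock and the bus width, which is why a dedicated card with a wide bus can still have more raw bandwidth even when its memory clock looks lower.

Code:
# Rough peak-bandwidth sketch: GB/s = bus_width_bits / 8 * effective_MT_per_s / 1000.
# Bus widths and transfer rates below are assumptions for illustration.

def bandwidth_gb_s(bus_width_bits: int, effective_mt_per_s: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_mt_per_s / 1000

# Dual-channel DDR3-1600 shared between CPU and integrated GPU (2 x 64-bit channels)
shared = bandwidth_gb_s(128, 1600)
# A 9800GTX-class card: 256-bit GDDR3 at 1100 MHz clock -> ~2200 MT/s effective
dedicated = bandwidth_gb_s(256, 2200)

print(f"shared DDR3-1600: ~{shared:.1f} GB/s vs 256-bit GDDR3: ~{dedicated:.1f} GB/s")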
 
It is the future of workstations. If I put my Mini in a small protective bag after work and take it home, I can continue over the weekend. Thanks to the SSD/HD dual-drive setup it is ridiculously fast, yet super portable. It is even easier to carry around than a MacBook Air, as there is no power brick. I think we had the biggest and hottest chips back in 2007 or so, and from that year on, we will shrink back to 1980 computer power levels.
http://www.sociamedia.nl/ict/energy/

I get the feeling Apple doesn't quite know what to do with the Mini. The fact they are being used as workstations and professional servers is amazing.

It's so odd that Apple has the Mini and the Mac Pro yet shows lukewarm support for both, even though there are devotees of each. Yet they push the iMac, which I'll never buy because it has too many drawbacks to list here.
 
If they go with a dual core processor without discrete graphics on the $799 model mini, I will be disappointed. I have faith in Apple though that they will not be that foolish.

IMHO, since they killed off the discrete HD 6630M GPU in the higher-end 2011 Mini in exchange for Intel's slightly weaker, integrated HD 4000 in both 2012 Minis, I'm distinctly pessimistic that we'll see discrete graphics inside the Mini ever again.

FWIW, I hope I'm wrong, but as reflective AIOs are useless to me, it's one reason why I bought a PC for graphically demanding tasks, whilst keeping a Mini for other serious work.

Guess you mean no discrete Video RAM.
Every Mini had a discrete GPU; they could just as well have had the CPU take care of the image and feed a shift register for digital HDMI out ;). The only choice is whether to bake the CPU and GPU onto one piece of silicon (being together is only a benefit, as the bandwidth is much better, not a drawback!)
GMA 950 was a dedicated GPU, no matter how bad it was!
All were SEPARATE chips up to the 320M/6630M, and only the 6630M and the Radeon 9200 (mine had 64MB btw) had dedicated VRAM. Since the HD3000, the GPU has been on the same silicon as the CPU.

You're right about the VRAM points, but I'm not sure what interpretation you're using regarding what exactly constitutes "a discrete GPU". Suffice to say, I'm fairly sure it's not an understanding that's commonly shared by most users or the industry at large. :rolleyes:

As you indicate, the early PPC Mac Minis had a low-end discrete GPU. Since then, only the higher-end 2011 Mini had proper discrete graphics, ie. the HD 6630M. All the other Mac Mini GPUs are chipsets integrated onto the motherboard, not discrete.
 
You're right about the VRAM points, but I'm not sure what interpretation you're using regarding what exactly constitutes "a discrete GPU". Suffice to say, I'm fairly sure it's not an understanding that's commonly shared by most users or the industry at large. :rolleyes:

I would concur with your assessment. Even though the HD4000 may have its own graphics circuitry, the fact that it gets its VRAM by sharing the system RAM means it's NOT a discrete GPU. I adhere to the definition of integrated graphics as offered by PC Mag:

"Locating a computer's display circuitry in the chipset on the motherboard rather than on a separate plug-in card. Integrated graphics shares memory with the CPU (see shared video memory) and provides a more economical alternative to the stand-alone card, which is known as a "discrete graphics" or "dedicated graphics" card. The integrated graphics may be non-programmable circuits or a programmable GPU."
 
Guess you mean no discrete Video RAM.
Every Mini had a discrete GPU, t..... The only choice is whether to bake the CPU and GPU onto one piece of silicon (being together is only a benefit, as the bandwidth is much better, not a drawback!)
GMA 950 was a dedicated GPU, no matter how bad it was!

No. As long as the function of the GPU is packaged (it doesn't have to be on the same die; it could be two dies in a single package) with another major component, it is integrated. Typically that means integration with the main RAM memory controller. As the memory controller merged onto the CPU's die (and/or package), that has meant the GPU has been integrated into the package also.

A discrete GPU is one that does not reuse the main memory controller and is not packaged into the same component.

It is doubtful someone would create a multi-chip module (MCM) with a GPU die that wanted its own path to memory. It would still meet the litmus test of integration because it is in the same module. Connection into one physical package is integration. I don't see how it is not, in any semantically aware sense of the word.






All were SEPARATE chips up to the 320M/6630M, and only the 6630M and the Radeon 9200 (mine had 64MB btw) had dedicated VRAM. Since the HD3000, the GPU has been on the same silicon as the CPU.

Being integrated specifically with the CPU is not the litmus test of integrated status.
 
IMHO, since they killed off the discrete HD 6630M GPU in the higher-end 2011 Mini in exchange for Intel's slightly weaker, integrated HD 4000 in both 2012 Minis, I'm distinctly pessimistic that we'll see discrete graphics inside the Mini ever again.

FWIW, I hope I'm wrong, but as reflective AIOs are useless to me, it's one reason why I bought a PC for graphically demanding tasks, whilst keeping a Mini for other serious work.

Well, after last year, when the $799 version had a quad-core processor, I hope they keep it. Putting in a dual-core processor wouldn't be a good idea.
 
Well, after last year, when the $799 version had a quad-core processor, I hope they keep it. Putting in a dual-core processor wouldn't be a good idea.

Agreed. I'd be very surprised if they dropped quad-core from the pricier Mini. If they did, IMO, it'd also be catastrophic for sales. However, regarding the GPU, I think Apple's foreseeable roadmap for the Mini is staying with Intel's integrated chipsets. Haswell certainly offers an improvement on the HD 4000, but it won't have dedicated, faster GDDR5 VRAM as the higher-end 2011 Mini did.

For some of us owning the higher-end 2011 Mini who don't need a faster processor, it's probably debatable (certainly from a financial aspect) whether upgrading to the next higher-end Mini will even be worthwhile. Waiting for Haswell's successor, Skylake, probably due circa late 2014 to early 2015, may make more sense. :rolleyes:
 
Being integrated specifically with the CPU is not the litmus test of integrated status.
Anyway, the discussion on naming is irrelevant, as integrated is the future. In 2016, separate GPUs will be extinct, as will the PCI slot system. The only way for Nvidia to survive is either to have MS port Windows to 100% GPU code, to start making its own x86 component on the GPU die, or to go the 100% ARM SoC route. AMD will have less trouble, as it already does both, and combining is easy for them.
 
Anyway, the discussion on naming is irrelevant, as integrated is the future. In 2016, separate GPUs will be extinct, as will the PCI slot system. The only way for Nvidia to survive is either to have MS port Windows to 100% GPU code, to start making its own x86 component on the GPU die, or to go the 100% ARM SoC route. AMD will have less trouble, as it already does both, and combining is easy for them.

Having recently bought an upgradable desktop PC to join millions of other users, I strongly doubt that discrete GPUs will be extinct anytime soon. Certainly not by 2016. At least not on the PC platform, where gaming is still a major interest.

http://www.xbitlabs.com/news/graphi..._Market_Set_to_Prosper_for_Years_to_Come.html

Integrated chipsets are making good progress. That'll no doubt continue. But there's no chance of integrated chipsets matching the graphical power of the best discrete GPUs for a long time yet.
 
Having recently bought an upgradable desktop PC to join millions of other users, I strongly doubt that discrete GPUs will be extinct anytime soon. Certainly not by 2016. At least not on the PC platform, where gaming is still a major interest.

http://www.xbitlabs.com/news/graphi..._Market_Set_to_Prosper_for_Years_to_Come.html

Integrated chipsets are making good progress. That'll no doubt continue. But there's no chance of integrated chipsets matching the graphical power of the best discrete GPUs for a long time yet.

Agreed. Integrated might be getting a little better nowadays, but it's not up to discrete yet. And I don't see that happening in 3 years. Meanwhile, discrete will continue to get faster.

Even if they close the gap to something really small, discrete will still be more powerful. And some people want (or even NEED) more GPU power. Gamers want it, but some professionals need it (CAD, video, etc.).

Prior to Retina, integrated graphics were probably fine for most laptops (that were NOT gaming laptops). Now though, it might be pushing it a little.
 
Having recently bought an upgradable desktop PC to join millions of other users
Yet the number of newly bought tower desktops is in free fall. The dedicated GPU is dead. The dedicated CPU is dead. For some it is dead today (iPad/iPhone-only people), for others maybe in 2020, but SoCs are the only thing for the future. Moore's law is moving faster nowadays than we can think up new applications. Even a step from 1080p to 4K will only be a blip in the strong downward trend. Turnover for the industry, size, heat, number of components: all are going down very fast at the moment.
 
I thought Broadwell was after Haswell.

You're correct, it is. Though Broadwell has the same micro-architecture as Haswell, it should still offer a significant step up. Ditto with Sandy Bridge & its superior successor, Ivy Bridge, which also have a common micro-architecture.

I omitted mentioning it only from the perspective that if I pass on Haswell as not being a worthwhile enough upgrade for me, then I'd probably wait for the far more powerful Skylake chipset, though Broadwell will come before that. Apologies for any confusion.

It also suits my preferred timeline of basically buying a new Mac after every 3-4 years.
 
Yet the number of newly bought tower desktops is in free fall. The dedicated GPU is dead. The dedicated CPU is dead. For some it is dead today (iPad/iPhone-only people), for others maybe in 2020, but SoCs are the only thing for the future. Moore's law is moving faster nowadays than we can think up new applications. Even a step from 1080p to 4K will only be a blip in the strong downward trend. Turnover for the industry, size, heat, number of components: all are going down very fast at the moment.

True, they've declined sharply. As have all desktop sales. In the case of PC desktops, it's not that surprising considering the huge numbers sold in bygone years. Ultimately, every market eventually peaks. Then it's usually a matter of course that sales decline.

As for this idea that dedicated GPUs & CPUs are "dead": IMO, there's no basis for assuming this. I agree that it may happen fairly soon with consumer Macs. Then again, the Mac userbase has always been relatively small. It still is. The choice of hardware has always been limited as it's wholly dependent on what Apple decides to offer. However, the limited range of available Macs doesn't reflect the computer industry at large.

Thankfully, things work very differently on the PC platform. There's always been so much more choice. :) If Dell & HP stop making upgradable towers, it's really no big deal. Seriously. There are many other PC hardware makers that will be keen to step up to meet any demand for that segment of the market for years to come.

Keep in mind, there are hundreds of millions of PC users out there. Many still with upgradable towers that they've upgraded over the years. The idea that all of them, or even most of them, will suddenly have no need for upgradability because other computing devices are available, just lacks credibility. :rolleyes:

For example, I have a Mac Mini, a Mac laptop & an upgradable PC. All serve different needs. I may also buy a tablet soon. That doesn't mean I'll get rid of my PC. In fact, I'm very sure that, whether I buy a tablet or not, my current upgradable PC won't be my last one.
 
Yet the number of newly bought tower desktops is in free fall. The dedicated GPU is dead. The dedicated CPU is dead. For some it is dead today (iPad/iPhone-only people), for others maybe in 2020, but SoCs are the only thing for the future. Moore's law is moving faster nowadays than we can think up new applications. Even a step from 1080p to 4K will only be a blip in the strong downward trend. Turnover for the industry, size, heat, number of components: all are going down very fast at the moment.

How is this different from any other point in computing history? As things get smaller, more discrete components are absorbed into the CPU package. There isn't anything meaningful about declaring a time of death for any particular computing component.
 