Being realistic about the direction Apple has chosen, other than getting better CPUs from Intel, the one thing really missing is a choice of video cards. But then, we have yet to see real-world performance from these cards.

But replaceable GPUs (and more choices) would be the big thing for me. The CPU and memory should last me quite a while. Video cards are what fall behind.
 
I do not want much
Apple, if you're reading this, please add 2" to the height and give us 3 SSD bays and an extra CPU slot
 
Just added native support for USB 3.0 on Haswell chipset to wishlist.
 
an updated ACD.
Not necessarily 4K, but at least not 3 times as thick as the iMac, and with a design language closer to the iPad mini and nMP.

an extended, wireless, black and grey apple keyboard.
black and grey mouse and/or trackpad.

put the power switch, USB and headphone jack on the front.

since we are apparently going to need a lot of cables on our desks from now on, how about an elegant apple-designed cable management solution?

TB3
 
Just need GPU card upgrades, and a new CPU card when the new processors are out...
 
Should this "turd" flop, there will be no more computers other than iMacs and MacBooks.

Professional workstations are seldom profitable.




----------

An iMac would be the only alternative.

Having used the turd briefly, I don't think it will flop. I reckon they will sell 3-4 times more of them than the old towers and take a bite out of the lower-end workstation market, forcing HP and Dell to look at this thermal core design.

I'd like a big box to sit underneath the 7,1, with four Thunderbolt leads going into it. Inside the box, 4 SATA sleds and some PCIe card slots. Silver would be nice, but I'll have it in shiny black as long as it has a cheesegrater face!
 
Going all-out futuristic, I would like a cube design, like modern Lego but with a seamless design.

Each major component would be within a cube section, so you could remove or add cubes (components) with a click, and that's it. Obviously that's a radical design. Want a 3rd or 4th processor? Just add another CPU cube to your fMP.

I would also expect perfect voice communication (think HAL) and a link that manages my entire house, from heating and lights to person identification based on a human auraprint. If I walked into my office showing stress, the lights would dim, gentle music would play and my fMP would gently coo to me :)

If that's not doable (come on, Apple!) then everything just kept up to date with current tech but backwards compatible.
 
4 PCI slots,
4 drive bay slots (I'm OK with 2.5"),
8 or 16 DIMM sockets -- this is a must,
Something that doesn't look like a garbage can :)
 
yeah.. the concept, story, acting, filming are great..

pretty lame that spike jonze uses fcpx though.. i'm sure the movie would've been better if he used real pro software.
;)

Where have you heard it was edited on FCPX? I can't find that info anywhere.

That being said, FCPX is fine at this point (for the most part), but the damage is likely done. I doubt they'll ever see the same market share with it again (at least among the "pro" shops).
 
Where have you heard it was edited on FCPX? I can't find that info anywhere.

i haven't heard (or read etc) that it was used on "Her" specifically.. i know he's used final cut on recent projects of a less blockbustery nature..

i'll see if i can dig up some links tomorrow (or hopefully someone else finds out first.. those are down-the-rabbit-hole searches :) )


EDIT- yeah.. it's hard searching for things like jonze and apple or software together because the subject of his latest film keeps popping up (because of the link to siri)..

here's one find which alludes to final cut being used on her..



http://variety.com/2011/digital/news/hollywood-s-editing-divide-1118030479/
Final Cut has made its share of converts. Eric Zumbrunnen cut Spike Jonze’s “Being John Malkovich” on Avid but, for the last five years (including Jonze’s “Where the Wild Things Are”), has worked exclusively on Final Cut. “Spike knows (I switched), but I don’t think he cares,” said Zumbrunnen, “as long as it allows him to see what he wants to see sooner.”

Zumbrunnen saw an edge in using Final Cut for the “Wild Things.” “We needed to carry many tracks of 24-bit audio to preserve the actors’ dialogue, preserve it in the rough cut and export the media without the sound designers having to re-conform everything.” Though Zumbrunnen cites the “lack of rigidity with Final Cut” as appealing, he reports that “the re-rendering of things can be frustrating.”
 
EDIT- yeah.. it's hard searching for things like jonze and apple or software together because the subject of his latest film keeps popping up (because of the link to siri)..

Yeah, I was running into a lot of the same results.


here's one find which alludes to final cut being used on her..

http://variety.com/2011/digital/news/hollywood-s-editing-divide-1118030479/

That article definitely conveyed the Avid vs. FCP mindset some editors had (and still have), but funny enough, it was published in January of 2011 (mere months before any of us found out about the drastic changes Apple was making). So they were talking about FCP7 there. Would be interesting to hear some of their perspective on the landscape now, especially with Adobe making inroads in the editing community.

I doubt "Her" was edited with FCPX. Although if it was, Apple should jump on that for marketing purposes. The article you linked to made specific mention of when Walter Murch switched to FCP to edit "Cold Mountain," and I can remember Apple making a big deal about it back then, basically saying "hey, look. Big-time Hollywood editors are using this too!" So really, if any high-profile film/show/commercial/etc. is cut on FCPX, Apple should tout it to try and shake the misconception that it isn't ideal for "pro" applications.
 
I doubt "Her" was edited with FCPX. Although if it was, Apple should jump on that for marketing purposes. The article you linked to made specific mention of when Walter Murch switched to FCP to edit "Cold Mountain," and I can remember Apple making a big deal about it back then, basically saying "hey, look. Big-time Hollywood editors are using this too!" So really, if any high-profile film/show/commercial/etc. is cut on FCPX, Apple should tout it to try and shake the misconception that it isn't ideal for "pro" applications.

right. i would think so too.

i'm retracting my earlier statement (the one about his recent projects) because even though those were made last year, i found out they were on fcp7.. (spike jonze is in the skateboard industry.. the videos he makes are the recent projects i'm speaking of)

actually, i'll retract the original statement of her being edited on fcpx until further notice (i.e. proof) :) i don't want to be spreading bad info like that.

on a side note, it's interesting that siri has maybe 10 or so responses to asking "are you her?" (siri will respond things like "no, but some of my best friends are fictional" or "joaquin? is that you?" etc.)

that could be viewed as fun & games with a client, taking a stab at an ex-client, or reading too much into the fcpx side of apple when there's no actual correlation.
 
I do not want much
Apple, if you're reading this, please add 2" to the height and give us 3 SSD bays and an extra CPU slot

That won't physically work.

A 2nd CPU socket would pragmatically need another 4 DIMM slots. You're not going to get 4 more DIMM slots in a 2"-taller cylinder. No way, no how.

A 2nd CPU socket would also throw the thermal core out of whack, and a 2"-taller core isn't going to cut it anyway. The clocks on the GPUs are already chopped down; another major heat source would chop them down more.

The Mac Pro is currently PCIe-lane oversubscribed. It's questionable whether they can round up another x4 lanes for a 2nd SSD, and highly unlikely they'll find 8 more. They could crank up the oversubscription, but all that does is dilute the bandwidth to those SSDs. It's highly doubtful Apple (or a large percentage of customers) is going to bite on slower-throughput SSDs.
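The lane arithmetic above can be sketched as a back-of-the-envelope tally. The per-device counts here are assumptions for illustration (Ivy Bridge-EP Xeons expose 40 PCIe 3.0 lanes from the CPU), not an official Apple breakdown:

```python
# Rough PCIe lane budget for the 2013 Mac Pro, as a sanity check.
# Device lane counts below are assumptions, not Apple-published figures.

CPU_LANES = 40  # PCIe 3.0 lanes from a single Xeon E5 v2 (Ivy Bridge-EP)

# Hypothetical per-device lane demands:
demands = {
    "GPU 1": 16,
    "GPU 2": 16,
    "PCIe SSD": 4,
    "Thunderbolt controllers (3 x x4)": 12,
}

total_demand = sum(demands.values())
print(f"demand: {total_demand} lanes vs {CPU_LANES} available")
print("oversubscribed" if total_demand > CPU_LANES else "fits")
# With these assumed numbers, demand (48) already exceeds supply (40),
# so another x4 (let alone x8) for more SSDs has nowhere to come from
# without a PCIe switch that dilutes everyone's bandwidth.
```

This is only meant to show the shape of the problem; the real board may split lanes differently.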


The reality is that the next-gen E5 will have more cores (which, over the longer term, decreases the need for dual CPU packages). Higher core counts will simply be packed into a single package. Similarly, memory densities will increase: 128GB, 256GB, etc. will, over time, fit in 4 DIMM slots. There is a long and very established track record showing exactly that over the last 20 years.

----------

Just added native support for USB 3.0 on Haswell chipset to wishlist.

That has already basically been announced by Intel. The next-generation C610 (Wellsburg) chipset that matches the E5 v3 (Haswell) has USB 3.0.


http://www.xbitlabs.com/news/cpu/di...icroprocessors_for_Launch_in_2015_Report.html

----------

Just need GPU card upgrades,

I wouldn't hold my breath on these GPU connectors working with new, next-gen GPU boards. Being saddled with CrossFire more likely makes them one-offs.

The next design would hopefully enable the option of getting into the GPU upgrade service, but Apple needs to plan and develop on multiple fronts to enable that.

and a new CPU card when the new processors are out...

New CPUs need new chipsets, and those are on different boards. By the time you have swapped out 4-5 boards (CPU, chipset/backplane, GPU, GPU, and possibly the I/O port board), you have pragmatically bought a new computer. There is hardly anything left.

For example, in the next-generation chipset, USB 3.0 is in the core chipset. Right now it is implemented on the I/O port board. Next generation, the board placement changes.
 
That won't physically work.

A 2nd CPU socket would pragmatically need another 4 DIMM slots. You're not going to get 4 more DIMM slots in a 2"-taller cylinder. No way, no how.

A 2nd CPU socket would also throw the thermal core out of whack, and a 2"-taller core isn't going to cut it anyway. The clocks on the GPUs are already chopped down; another major heat source would chop them down more.

The Mac Pro is currently PCIe-lane oversubscribed. It's questionable whether they can round up another x4 lanes for a 2nd SSD, and highly unlikely they'll find 8 more. They could crank up the oversubscription, but all that does is dilute the bandwidth to those SSDs. It's highly doubtful Apple (or a large percentage of customers) is going to bite on slower-throughput SSDs.

If they added a second processor they could get up to 80+ lanes--more than enough. As for the thermal core, they could swap it out and have individual fans and heat sinks for the processors and GPUs. Maybe they could add a few more drive bays and room for the extra DIMM slots you mentioned. Wait, I've got it!:



Anyways, you make it sound like adding a second CPU to the tube design is so tough. Flat five did it in Photoshop; see? Easy! /s

dunno.. sounds kind of tough but at this point.. i wouldn't say it's impossible.
 
Not sure of the point in this? For those that like the new Mac Pro design, the answer is simply "faster/better everything," while for those that don't, it's probably snarky comments about internal hard drives, PCIe slots, etc., which simply aren't going to happen at this point.

I agree. I used to think that if the iTube flopped in terms of sales they would consider returning to the tower form factor (by this I don't mean if it doesn't do "okay"; I mean if it doesn't have a good ROI for Apple compared to the opportunity cost when counting the new factories, added complexity of the supply chain, etc. [remember Apple has insanely profitable product lines they could be using their productive capacity on]). However, now I think that regardless of what happens with the "iTube" line, PCIe slots are dead in the Mac. As for internal upgradability, if Apple can convince its users to roll with the added expense of externalized everything and disposable appliances even in their top tier, that'd be a windfall of epic proportions. We have thousands of users on here using 4-7 year old Mac Pros upgraded to the hilt--forcing them all to throw their boxes away and buy anew would be huge.
 
I would also like Apple to release a 32" 4K thunderbolt display by 2015.
I actually hope they don't.
I've got a 32" TV at home. A display that large will be awful at 4K unless Apple makes adjustments in display prefs so that we can get text and icons to a comfortable size.
I'd actually prefer a 27" or smaller form factor for 4K or other ultra-Xtreeem-hires monitors.
 
I agree. I used to think that if the iTube flopped in terms of sales they would consider returning to the tower form factor (by this I don't mean if it doesn't do "okay"; I mean if it doesn't have a good ROI for Apple compared to the opportunity cost when counting the new factories, added complexity of the supply chain, etc. [remember Apple has insanely profitable product lines they could be using their productive capacity on]). However, now I think that regardless of what happens with the "iTube" line, PCIe slots are dead in the Mac. As for internal upgradability, if Apple can convince its users to roll with the added expense of externalized everything and disposable appliances even in their top tier, that'd be a windfall of epic proportions. We have thousands of users on here using 4-7 year old Mac Pros upgraded to the hilt--forcing them all to throw their boxes away and buy anew would be huge.

Apple's not doing this for the money. This market is barely on their radar in terms of revenue. They're doing this because they believe it's a better approach. This stuff goes all the way back to the beginning. This machine, which Apple has already said is its vision for what a workstation should look like over the next decade, is basically designed around three expected outcomes:

1) Intel will eventually ramp Thunderbolt to 100 Gbps as per current roadmaps.

2) CPU core counts will continue climbing such that dual socket configurations become irrelevant to more and more customers (18 cores with Broadwell-EP, etc.)

3) GPU advances will continue to outpace CPU advances and many of the important new capabilities that computers gain over the next decade (with advances in machine learning, image processing, etc.) will be driven by such GPU advances.

This machine is, in some sense, at its worst right now. As the above trends play out, the approach Apple has taken will only make more sense, not less. Unless one of these expectations turns out to be wrong, Apple is not going to significantly change course here.

----------

I actually hope they don't.
I've got a 32" TV at home. A display that large will be awful at 4K unless Apple makes adjustments in display prefs so that we can get text and icons to a comfortable size.
I'd actually prefer a 27" or smaller form factor for 4K or other ultra-Xtreeem-hires monitors.

Personally, I'm rather interested in the prospect of running a 46-50" 4K TV in non-retina mode. At 4K that TV would have a pixel density similar to a non-retina 24" monitor, so you could sit at about the same distance, on-screen interface would be a reasonable size... but you'd have a screen with 4x as much physical and virtual space. Might have to rethink the working environment a bit to make this plausible. If you had the base of such a screen sitting on your desk at typical monitor viewing distance you'd have to crane your neck uncomfortably to see the top of it, for instance. But I suspect this sort of thing could be accommodated, and the gains could be substantial.
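The density comparison above is easy to check with a little pixels-per-inch arithmetic. A quick sketch, using the rough sizes from the post (a 24" 1920x1200 non-retina monitor vs. a 46" 3840x2160 TV):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    diag_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diag_px / diagonal_in

# Non-retina 24" monitor at 1920x1200 -> roughly 94 PPI
print(round(ppi(1920, 1200, 24), 1))

# 46" TV at 4K (3840x2160) -> roughly 96 PPI, i.e. about the same density
print(round(ppi(3840, 2160, 46), 1))
```

With these figures the two densities come out nearly identical, which is the basis for the "same viewing distance, 4x the space" argument.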
 
Apple's not doing this for the money. This market is barely on their radar in terms of revenue.

Annnnd scene. You heard it here first, Apple did something without money in mind!

You may be right about the future leaning towards some features in the nMP (Tbolt/GPGPU). That still does not justify the form-factor, lack of PCIe slots, lack of second CPU, lack of RAM, etc.

With a less proprietary form-factor, they could've still made Tbolt work. They had options, and they chose to go the way of CUBE [* Saruman voice*].
 
You may be right about the future leaning towards some features in the nMP (Tbolt/GPGPU). That still does not justify the form-factor, lack of PCIe slots, lack of second CPU, lack of RAM, etc.

'More!' is a choice you can make when designing virtually any device. Why stop at two CPUs when you can have four? Or eight? Why stop at 128 GB of RAM when you can have 512 GB? Simply pointing out that a particular device does not have as much of x as it could have is not really a useful criticism. At some point 'more' takes you past the point of diminishing returns. This machine is designed to ride that edge with respect to most of the Mac Pro's customer base, and assuming the above trends play out as expected I think it will do so successfully.

With a less proprietary form-factor, they could've still made Tbolt work. They had options, and they chose to go the way of CUBE [* Saruman voice*].

Thunderbolt and dual GPUs distinguish this machine from what Cube 2.0 would have been.
 
'More!' is a choice you can make when designing virtually any device. Why stop at two CPUs when you can have four? Or eight? Why stop at 128 GB of RAM when you can have 512 GB? Simply pointing out that a particular device does not have as much of x as it could have is not really a useful criticism. At some point 'more' takes you past the point of diminishing returns. This machine is designed to ride that edge with respect to most of the Mac Pro's customer base, and assuming the above trends play out as expected I think it will do so successfully.



Thunderbolt and dual GPUs distinguish this machine from what Cube 2.0 would have been.

And an almost uncanny, eerie silence when you are expecting to hear hover mode!
 