One could expect a VR production tool for it at some point.

That would be awesome. Most of the people I know who work on VR and 360 video actually use nMPs.

The specs sound realistic and conservative, so you're probably telling the truth. Not sure if it's worth it for me to upgrade. I think I'll stick with my 6c/D500/16GB/1TB and drop a 10c CPU and 64 GB of RAM into it. The machine serves me well, and my 4K/5K workflow won't overpower it anytime soon.

The only thing I want from Apple at this point is a god damn UHD or 5K standalone display.
 
Aw, it won't be that bad. Up-to-date and faster everything for the same basic price structure. Besides, what would be left to gripe about here if we got every (magical) thing we dreamed of?

By the way, the "no HBM-2" spec is due to delays in its availability. But the message from the masses has been received, and one could hope for a subsequent update with a more significant (and likely expensive) improvement in the top-end graphics cards in about a year's time.

Ok, now I'm skeptical. If Apple updates the Tube in 2016 and then updates the GPUs again in 2017, that would be seriously out of character for the company. More likely, they would wait until 2017 for HBM-2.

The VR theory is interesting, but wouldn't Apple need to offer both AMD and Nvidia options in the Tube for such development? It's hard to imagine VR developers being content to test their products only on AMD GPUs, especially given AMD's lagging tech and marketshare.

And why did Apple wait three years to update the Tube? Come on, convince us you really have inside info! Go beyond "the tube will be the same, but with new parts that are analogous to the old parts". Tell us something we cannot already predict based on Sierra kexts and Xeon parts lists.

Throw us a bone, man!
 
Ok, now I'm skeptical.
This whole detour is nonsense - I am especially suspicious of anyone (and everyone) who says something like "that's all that I can say without losing my job".

NDAs (Non-Disclosure Agreements) are pretty clear that you can't say or even hint about anything that you've learned to people who aren't under the same NDA.

Complete and utter nonsense for a troll to say half a dozen things, then say "I can't say more". Only the naïve would fail to understand that she could lose her job over the first half dozen things.
 
The only thing I want from Apple at this point is a god damn UHD or 5K standalone display.

Why don't you just buy third party? It's not like Apple will include a sprinkle of the ashes of Steve Jobs within every 5K standalone display. They will just be overpriced versions of panels available from every other manufacturer, except with thermal issues resulting from being too thin.
 
Why don't you just buy third party? It's not like Apple will include a sprinkle of the ashes of Steve Jobs within every 5K standalone display. They will just be overpriced and probably suffer thermal issues as a result of being too thin.

The current crop of 3rd party (Dell, HP, LG, et al.) 4K and 5K displays have relatively poor build quality and sleep/wake issues with OS X, and they don't have extra features like the TB hub, webcam, and MacBook charger that the recently discontinued Apple 27" display had. They have great prices, though. I love the fact that I can get a 24" UHD monitor with 10-bit color and near-100% conventional gamut coverage for just $379. That was simply unheard of five years ago. Basically the only thing the PC world has going for it is cost-effectiveness, mostly because that industry is a race to the bottom.

Your comment about "thermal issues" and "thinness" likely stems from a general ignorance of and pessimism towards Apple that seems to be the trend on this website as of late. Mods must be asleep. You've probably never used one, but Apple's displays in the past have had 1) excellent build quality and fairly competitive color reproduction and image quality, and 2) flawless operation, compatibility, and simple connections to corresponding Mac computers. No muss or fuss, no sleep/wake issues; you just plug it in and it works. Oh yeah, and it just looks beautiful sitting on a desk and impresses clients. But I refuse to pay 500-900 dollars for discontinued 2011 technology, so I'm just waiting until September for the inevitable standalone 5K display. I'm making do with my HP DreamColor until then.
 
So, again, all that software exists on OSX but nobody is using it? Are companies buying it, but just letting it sit on a shelf? How do the economics of all that porting and testing work? Could you be wrong, and perhaps some people are doing CGI with Macs?
I hope what you say is true and finally Macs are being used for CGI work. I just never see a Mac when I watch the making-of footage of CGI-heavy films.
 
Your comment about "thermal issues" and "thinness" likely stems from a general ignorance of and pessimism towards Apple that seems to be the trend on this website as of late. Mods must be asleep. You've probably never used one, but Apple's displays in the past have had 1) excellent build quality and fairly competitive color reproduction and image quality, and 2) flawless operation, compatibility, and simple connections to corresponding Mac computers. No muss or fuss, no sleep/wake issues; you just plug it in and it works. Oh yeah, and it just looks beautiful sitting on a desk and impresses clients. But I refuse to pay 500-900 dollars for discontinued 2011 technology, so I'm just waiting until September for the inevitable standalone 5K display. I'm making do with my HP DreamColor until then.

With statements like the bolded one above it's hard for me to take anything you say seriously. Apple long ago made the decision to sacrifice functionality and reliability at Ive's altar of thinness. Since then their hardware has excelled at solving imaginary problems. "My Mac Pro has too many PCIe slots!" "What's with all these ports on the sides of my MacBook?" "Help! My iMac is too thick! I can't reach any of the ports because they're hidden behind all the thickness!"

Maybe Apple really will offer a 5K display, and maybe it will be thick enough for adequate thermal control. Maybe this 5K display will be introduced alongside the Mac Pro predicted by No One. Such a future is possible. Hope springs eternal. Pass that thin aluminium jug o' Kool-Aid, will ya?
 
Since then their hardware has excelled at solving imaginary problems.

Phil: You asked for a computer with better support for multiple power-hungry graphics cards...
Audience: *audible drawing of breath*
Phil: Here's your new computer, with two graphics cards, that are built-in, which you can't change, that will never be updated, and you'll only be able to make use of one for most tasks including driving displays.
Audience: thef*ck?
Phil: GIVE IT UP FOR ME!
 
I hope what you say is true and finally Macs are being used for CGI work. I just never see a Mac when I watch the making-of footage of CGI-heavy films.

You are conflating CGI with CGI-heavy films. Those blockbuster movies are only a part of the CGI industry, and by some metrics (e.g. total screen time, number of vendors) they are a very small part of it. They just happen to be the most visible, due to the size and quality of their output.

CGI-heavy films employ VFX companies for whom 1,000 Linux workstations plus an IT dept works out cheapest, and at that level, money is everything. Feature film VFX is brutally competitive because there are only seven or so major film clients and dozens of hungry VFX vendors.

All the other CGI in the world, from game cinematics, TV commercials, web content, arch-viz, simulator rides, product visualisation, documentary graphics, TV VFX and so on, employs a broad range of companies, some of whom find it better or cheaper to work with Macs for a number of reasons.

It's not a matter of companies switching to Macs. Macs have always been in creative companies, and as those companies have moved into CGI the demand was there for the software. And as the software arrived smaller companies have been able to start up using Macs. We got the chicken, then the egg, now more chickens.

Of course, IMHO, the decline is happening now. The egg may go back up the chicken. Macs will never invade the huge VFX companies, but the proliferating smaller ones will be finding Macs less attractive in the current climate. The design of the Mac Pro, the weak graphics of the iMac, the years of silence, the constant threat of abandonment, the soldering, the thermal issues, the annual OS disruption, and the costs are all causing grumbles of discontent, and I know from personal communications that software vendors do not enjoy supporting OSX. If demand drops enough for a key player to pull out, there could be an avalanche.
 
Announcement along with a new MacBook Pro in September. Immediate availability is planned.
I do hope they finally decide to produce a 17" rMBP. I just replaced my trusty late-2008 17" MBP 4,1 with a 15.4" late-2013 rMBP, which is a lovely machine, but I really miss that extra 1.6". I have it scaled to the maximum space, but then the fonts and icons are just that bit too small. A 17" or 18" Retina screen would be perfect.
 
Throw us a bone, man!

He told us the nnMP's "RX490" will not be a dual-P10 GPU card, as the WC-Tech blog speculates, but a custom GPU (Vega, or a custom P10 with more FP64 cores?).

The current crop of 3rd party (Dell, HP, LG, et al.) 4K and 5K displays

ASUS just announced a 5K USB-C/DP1.3 display; I suspect it will be available in some Apple Stores in September.
 
ASUS just announced a 5K USB-C/DP1.3 display; I suspect it will be available in some Apple Stores in September.

With a $500 Apple logo on it?
 
Is that because of technical issues, or more related to the smaller market share?

Generally it's technical issues: poor drivers, frequent and disruptive OS updates, and Apple being Apple. More than one SW vendor has repeatedly said the OS X version is the most troublesome of the three.

I can only imagine the market share is healthy enough at the moment to put up with it. New apps are still arriving on the Mac platform, so I guess times are still good.

One red flag comes in the form of the GPU renderer Redshift. I could have sworn they used to say 'OSX version coming soon', but no more. It's CUDA-only for now, and so, thanks to Apple, they suddenly have virtually no market on OSX. Hope they hadn't spent any money on it. Another CUDA-only renderer, Octane, still lives afaik, but Octane for Nuke has been put on hold. The developer's own words...

'Whilst the plugin is very stable on Windows and Linux, I have had a lot of problems getting it to compile and run in a stable fashion on OSX. The Nuke OSX plugin development environment is challenging to work with - so I have stopped work on the OSX version. If there is sufficient interest from users, I can take another look in the future.'
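
(A footnote on what "CUDA-only" means in practice: the renderer's startup check literally cannot succeed without an NVIDIA device. Here's a minimal sketch in C against the CUDA runtime API; this is purely illustrative, not any vendor's actual code:)

#include <stdio.h>
#include <cuda_runtime.h>  /* CUDA runtime API; requires NVIDIA's toolkit and driver */

int main(void) {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        /* On a Mac whose only GPUs are AMD (no NVIDIA driver present),
           this is the branch you land in; the renderer has nothing to run on. */
        fprintf(stderr, "No CUDA devices: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("Found %d CUDA device(s); GPU rendering can proceed.\n", count);
    return 0;
}

On a D300/D500/D700 Mac Pro there is no NVIDIA hardware at all, so the error branch is the only possible outcome.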

Edit: I'm forgetting another GPU renderer, V-Ray RT. This is from 2013, no less... (Ars)...

"Apple may be trying to not-so-delicately nudge everyone to move their code from CUDA to OpenCL, but I’ve seen a first-hand failure of AMD’s OpenCL support with V-Ray RT for Maya. Chaos Group built V-Ray RT on OpenCL, but after extensive work trying to get the GPU variant of its RT render engine running on AMD hardware and an effort by yours truly to light a fire under Apple and AMD, Chaos Group gave up and ported it to CUDA instead. So V-Ray RT’s GPU mode only works with OpenCL and CUDA—on Nvidia hardware.

If Apple wants OpenCL and AMD to be the answer to CUDA, the support needs to be there so this kind of thing doesn’t happen again. As it is, I now have zero options for V-Ray RT GPU on the 2013 Mac Pro since the software doesn’t work on AMD cards, and no Quadro/Geforce card is available for the machine. I’m sure some company will eventually build a Thunderbolt 2 Nvidia GPU-in-a-box for people who need CUDA, but to quote our own Peter Bright, “Thunderbolt 2 is equivalent to 2.5 lanes of PCIe 3 or 5 lanes of PCIe 2,” so it's hard to say what kind of a performance hit that will incur for compute tasks with an external GPU."
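
(Sanity-checking the lane arithmetic in that quote, using nominal link rates:
Thunderbolt 2 bonds its links into one channel of about 20 Gbit/s.
A PCIe 2.0 lane runs 5 GT/s with 8b/10b encoding, so 4 Gbit/s usable, and 20 / 4 = 5 lanes.
A PCIe 3.0 lane runs 8 GT/s with 128b/130b encoding, roughly 7.9 Gbit/s usable, and 20 / 7.9 ≈ 2.5 lanes.
The quote's numbers hold up.)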

Nothing this author wrote about has improved. Afaik, V-Ray RT still doesn't work on Mac/AMD combos. eGPUs didn't show up. And even worse, it looks like Apple's attempt to destroy CUDA in favour of OpenCL was just the prelude to destroying OpenCL. As for why? I've no idea. Vendor lock-in is a good conspiracy theory, but which vendors? Games? VR? GPU renderers? There isn't going to be anyone left to lock in.
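
(The OpenCL side is easy to check from code, for what it's worth. A minimal C sketch that enumerates GPU devices through Apple's OpenCL framework; the array sizes are arbitrary and this is illustrative only:)

#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>   /* Apple ships OpenCL as a system framework */
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platforms[4];
    cl_uint nplat = 0;
    if (clGetPlatformIDs(4, platforms, &nplat) != CL_SUCCESS) return 1;
    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devs[8];
        cl_uint ndev = 0;
        /* Ask each platform for its GPU devices only */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &ndev) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < ndev; d++) {
            char name[256];
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, NULL);
            printf("OpenCL GPU: %s\n", name);
        }
    }
    return 0;
}

Compile on OS X with: clang check_cl.c -framework OpenCL. The bitter joke is that the D300/D500/D700 enumerate just fine; per the quote above, it was getting real production kernels to run reliably on AMD's stack that defeated Chaos Group.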
 
Back when Apple supported OpenCL there was at least a hint of potential multi-platform synergy for devs doing things like GPU renderers. Apple's been telling devs of high-end software to go away for a while now.
 
He told us the nnMP's "RX490" will not be a dual-P10 GPU card, as the WC-Tech blog speculates, but a custom GPU (Vega, or a custom P10 with more FP64 cores?).


I suspect "custom" simply means it's on a proprietary PCB and the GPU is downclocked to adapt it to the Tube's weak thermal management. But really, all he did was predict Apple would use new AMD silicon. That's not convincing insider info. May as well call me an insider because I know a new Tube will feature Xeon Broadwell-EP CPUs.
 
At least they've got their priorities right...

New watchbands!!!

I really do think Apple under Capt Cook has given up on pros; he wants you guys to use an iPad!

When I was selling Macs back in the late '90s, the profit margin on a Mac was around 10%.
The PM on the FireWire cable I sold to the customer was around 200%.
Apple's PMs were prolly even higher.

Which is why, ladies and germs, Tim wants to sell a zillion watch bands
and sez the iPad Pro is a supercomputer if you add a dozen adapters and accessories.
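
(For anyone puzzling over how a margin can be 200%: that's markup, not margin. With made-up numbers: a $2,000 Mac at a 10% margin nets the shop $200, while a $30 cable marked up 200% over a $10 cost nets $20, i.e. a 67% margin. Ten cables equal one Mac.)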

 