I'm listening to his latest podcast; what he said is old news: "He thinks no new Macs at the September event..."

He's usually right about things. But one thing to keep in mind is what he said was very specific and does not necessarily mean what people are taking it to mean.
 
Tube, tower, hypercube... just give back the ability to upgrade the machine's GPUs. I'm willing to negotiate on other design choices, but GPU upgradability is pretty much a deal-breaker. CPUs may be stagnating but GPUs still have considerable room to improve year to year. A workstation's CPU/memory subsystem will be perfectly performant long after its GPUs have become poky. And yeah, eGPUs are becoming more practical, but that's a crappy alternative to upgradable GPUs in a desktop workstation. I don't want a couple of obsolete, disused GPUs sucking up power and limiting the thermal headroom of my box.

Dual sockets would be nice; high core counts can be achieved far more economically on a dual socket system... but upgradable GPUs are essential on a graphics workstation IMO.

In 2013 I crossed my fingers, hoping that the AMD GPUs and OpenCL would get much love in my 6,1; the potential was there, the hype was there. Three years later, OpenCL is still way behind CUDA support for my apps. As a dual-system user, I wish I had Nvidia's upgrade path to make the Mac more versatile for my business needs. Instead I eventually got fed up and learned Blender/Cycles to get my 3D work done quickly, rather than Boot Camp switching to 3D Studio Max/Vray (I'm a 20-year veteran user). As an indie developer, it was not a great business decision on my part. So, yes, I agree: we need GPU upgrade paths. I now have VR development opportunities, so I have no option other than to invest in Windows/Nvidia to meet VR Ready status.

I love Apple products, but at the end of the day the tool has to fit the job; forcing the reverse of that is not good for productivity.
 

This is more a "Think Different" kind of ad. There is a huge number of people who equate 'computer' with a specific form factor. The commercial's core message is that form is not function: a computer can be delivered in multiple forms. That fixation on form over function is largely why Apple dropped 'Computer' from their name. It isn't that Apple has the issue, but rather the audience they are talking to, who can't (or at least are highly unwilling to) move past a fossilized definition.

Like I said, Tim said the desktop is dead. Guess he really meant it, at least for Apple anyway.

Desktop isn't dead so much as no longer over-inflated with users who don't actually need a desktop. Prior to the early Apple era (late '70s to early '80s), a computer used by a very small group (or a single user) was a "minicomputer", about the size of a refrigerator. It was "mini" relative to the size of a "mainframe" class of computer. The classic desktop PC expanded the market past what the mini did. Mobile/laptops expanded past desktops over a decade ago. Handhelds have expanded past laptops and desktops. Computers have gotten smaller (without losing relative capability) over time. There is an extensive track record on this.

The iPad Pro is Apple's play into the market space where Windows tablets (Surface), 2-in-1s, etc. are doing decently well. It is also a not-so-well-aligned response to Chromebooks (not hitting the same price-point zone). Chromebooks are another high-growth area. Apple's decision to keep the Mac product line out of that blended/transition area has some merits, given the negative blowback MS got with Windows 8.

Desktop isn't dead as much as not top priority. That is driven as much (if not more) by customers' true needs/requirements and actions as by unilateral action by Apple. Apple's objective never was to be "a Desktop Computer" company. Technology constraints kept them in that box in the early years, but there is no indication it was a voluntary, self-imposed constraint.
Fixed.
I sent an email to Tim Cook a while ago and made several points to him:

  • No one will buy a product they don't know about. Look at the Apple website; the Mac Pro is hardly present anywhere.

Chuckle. http://www.apple.com/mac/ ( link for "Mac" on the top level page )

Right there in the middle of the row of Mac products listed is the "Mac Pro". On the far right is a "Compare" link: http://www.apple.com/mac/compare/ (the Mac Pro is at the bottom, largely due to having the highest price).


Are there disco lights shining on the Mac Pro on Apple's top page? No. It is a stale product. The high price (even if it wasn't stale) isn't going to be offset by shining a disco ball on it on the top-level pages.

What is actually hard to find through the top-level navigation is the iPod (although apple.com/ipod works).
 
[Image: tim cook.jpeg]

"What's a real computer? This. Your future Mac pro tablet."

Jony Ive: "As you can see... I made the iPad Pro to be so user friendly... that my little buddy is officially considered smarter than humans."

[Image: satified-ipad-user.jpg]

It wouldn't surprise me if nothing came in September. It just leaves the same questions: when? what? why?
"why?" HAHAH
 
Because a Mellanox dual 100 Gbps fabric costs ~$1,000 per node (including the fiber optic cable), while 40 Gbps TB3 is almost free (it actually adds about $40 per node, but will be included on most mainstream workstations by default).

I believe Thunderbolt 3 point-to-point networking is still limited to 10 Gbps, like TB 2. Plus, if you need a cable longer than 9 feet, you're looking at active, fiber TB cables and not the "cheap" copper TB cables.
 
I believe Thunderbolt 3 point-to-point networking is still limited to 10 Gbps, like TB 2. Plus, if you need a cable longer than 9 feet, you're looking at active, fiber TB cables and not the "cheap" copper TB cables.

Since you brought up Thunderbolt (TB), I'm going to lay a significant part of the blame for the delay of a MP update at the feet of the TB fuster cluck that Intel and Apple have created. Because Apple is all in on TB, it is now required (because everyone expects it) that Mac desktops route their video output via TB. Thus the GPU has to be built specially to route its pixel output back through the system bus, and in doing so saturates the system bus with a constant stream of pixel data that, without TB, would have been isolated in a direct cable connection from GPU to monitor. Because Apple insists on TB, and Intel requires video output to be included (or it's not TB), Apple has to custom design its own GPU cards. They can't just take an AMD/NVIDIA spec design and jam it into an Apple-branded box, because those spec designs don't include TB. So yeah, AMD and NVIDIA have new chips, but Apple has to create new GPU cards that use those chips with TB video output.

TB makes sense for small form factor devices where physical space for ports is limited and pixel bandwidth requirements are relatively low. But for a desktop, it makes more sense to let the GPU connect directly to the monitor so you can drive three 4k displays, or two 5K displays, without killing your system bus.
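For a rough sense of the bandwidth at stake, here is a back-of-the-envelope sketch in Python (my own figures for raw, uncompressed pixel rates; blanking and encoding overhead are ignored, and nothing here comes from an Apple or Intel spec):

```python
def display_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel bandwidth in Gbit/s (ignores blanking/encoding overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

three_4k = 3 * display_gbps(3840, 2160, 60)  # three 4K panels at 60 Hz
two_5k = 2 * display_gbps(5120, 2880, 60)    # two 5K panels at 60 Hz

print(f"three 4K @ 60 Hz: {three_4k:.1f} Gbps")  # ~35.8 Gbps
print(f"two 5K @ 60 Hz:   {two_5k:.1f} Gbps")    # ~42.5 Gbps
```

Either setup lands near or above a single TB3 link's nominal 40 Gbps, which is the contention concern being described.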

I won't even get into the issue of having TB and USB 3.1 ports and cables all look physically identical except for some tiny pictographs to differentiate them. I can't wait for all the problem calls coming to tech support because someone is trying to use a USB-C cable as a TB connector, and vice versa. Or worse, the wrong cheap-o cable that ends up causing physical damage because some poor soul put the wrong cable in the wrong port.
 
But I doubt that will change anybody's mind about the workstation around here... :D

They're not gonna get with it until every single Xeon has officially been EOL'd. :rolleyes:
Lol. I guess.
It's just weird that they are heading in that direction. IMO, the desktop (especially the workstation) is the basic technological need.
 
According to Dalrymple, there will be no new Macs at the September event.

The annual iPhone/iPod (iOS device) event?

https://www.macrumors.com/2016/07/28/iphone-7-7-plus-pre-orders-maybe-sept-9/

Really? Was there any expectation that Apple would come out and pitch Macs in the middle of the iPhone "show"?

The last couple of years, Macs have tended to come in a separate October event (or press-release event).

Fall dates pulled from the Buyer's Guide ( https://buyersguide.macrumors.com//#Mac ):

October 2015 iMac
October 2014 Mac Mini
October 2013 MBP Retina
Sept 24 2013 iMac (after the iPhone event earlier in the month)

Nov 2012 iMac ( I think substantive design change announced late Oct )

October 2010 MBA


Apple has done some summer Mac drops (late June, July, early-to-mid August), but has relatively consistently carved out September for iOS/iPod launches. The Macs are typically pushed back from that major logistics hiccup until later in the fall. An iPad + Mac event in October would not be surprising.
 
Lol. I guess.
It's just weird that they are heading in that direction. IMO, the desktop (especially the workstation) is the basic technological need.

Things are likely going to decline in the opposite direction. Tablets will cannibalize from the bottom of the market, not the top. Basic consumer desktops will be in decline, but workstations and higher-end laptops won't be. It makes Apple's decision making with the Mac Pro all the more silly. MacBook Air users are the ones looking at switching to an iPad Pro, not Mac Pro users.

The segments Intel is probably looking at shedding are things like netbooks, low-end laptops, and maybe small form factor PCs as well. Basically, Intel is trying to get out of the $500 laptop business. They're not going to start by cutting Xeons.
 
In 2013 I crossed my fingers, hoping that the AMD GPUs and OpenCL would get much love in my 6,1; the potential was there, the hype was there. Three years later, OpenCL is still way behind CUDA support for my apps. As a dual-system user, I wish I had Nvidia's upgrade path to make the Mac more versatile for my business needs. Instead I eventually got fed up and learned Blender/Cycles to get my 3D work done quickly, rather than Boot Camp switching to 3D Studio Max/Vray (I'm a 20-year veteran user). As an indie developer, it was not a great business decision on my part. So, yes, I agree: we need GPU upgrade paths. I now have VR development opportunities, so I have no option other than to invest in Windows/Nvidia to meet VR Ready status.

I love Apple products, but at the end of the day the tool has to fit the job; forcing the reverse of that is not good for productivity.


You seem to be in a very similar spot as I am, except for the VR. I use 3ds Max, and I have big issues converting to Blender just to stick with OS X... Though Blender is free, learning a new 3D program is a long road, a lot longer than getting set up and comfortable in a different OS.

Well, yeah, it must be a 20-year road. Even companies like Intel and Apple need commercials made, CAD renders, etc. So they will always need powerful machines, and someone needs to make them. I assume Jony Ive today sits doing CAD on a Windows machine with Nvidia cards and multi-core Intel CPUs (probably that's the machine that's always covered with black drapes when you see clips from inside Apple ;)
But: Intel wants to get away from the PC business, Apple wants to switch to ARM, Apple cancelled QuickTime for PC, Microsoft said Windows 10 is the last... It's all just loose plans big companies share, and an evolution. Stuff will always need to be made (CAD drawings, special effects for movies, marketing material, music, games, 3D, etc.), so someone has to deliver the hardware for it, unless we are going back to paper and pen.
 
You seem to be in a very similar spot as I am, except for the VR. I use 3ds Max, and I have big issues converting to Blender just to stick with OS X... Though Blender is free, learning a new 3D program is a long road, a lot longer than getting set up and comfortable in a different OS.

Well, yeah, it must be a 20-year road. Even companies like Intel and Apple need commercials made, CAD renders, etc. So they will always need powerful machines, and someone needs to make them. I assume even Jony Ive today sits doing CAD on a Windows machine with Nvidia cards (probably that's the machine that's always covered with black drapes when you see clips from inside Apple ;)
But: Intel wants to get away from the PC business, Apple wants to switch to ARM, Apple cancelled QuickTime for PC, Microsoft said Windows 10 is the last... It's all just loose plans big companies share, and an evolution. Stuff will always need to be made (CAD drawings, special effects for movies, marketing material, music, games, 3D, etc.), so someone has to deliver the hardware for it, unless we are going back to paper and pen.

The circle of technological life......
Once...there was a machine...born under apple company....
 
I assume Jony Ive today sits doing CAD on a Windows machine with Nvidia cards and multi-core Intel CPUs

No dude, Jony sits and doodles on an iPad Pro (before that, paper and pencil); actually turning that into a CAD-based product is underling grunt work.

"Real work" ie having the ideas, sketching them out, and then editorial on the more realised interpretations of that, is very viable on an iPad, you just have to be at the top of the creative chain.
 
I believe Thunderbolt 3 point-to-point networking is still limited to 10 Gbps, like TB 2. Plus, if you need a cable longer than 9 feet, you're looking at active, fiber TB cables and not the "cheap" copper TB cables.
According to Intel, TB3 networking is 40 Gbps, and while TB fiber optic cables are expensive, Mellanox's fabric fiber optic cables are far more expensive (the only way to stay in the big leagues is spending money).

BTW, for a compute appliance as I predict (a small Linux node loaded with Tesla P100 or Xeon Phi cards), a short 3 ft TB3 cable is fine.
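Taking the ballpark figures from this thread at face value (they are the poster's estimates, not vendor list prices), the cost-per-gigabit gap looks like:

```python
# Poster's ballpark figures from this thread, not vendor list prices.
mellanox_cost = 1000.0  # dual 100 Gbps fabric per node, incl. fiber cable
mellanox_gbps = 2 * 100

tb3_cost = 40.0         # estimated added cost per node; often built in
tb3_gbps = 40

print(f"Mellanox: ${mellanox_cost / mellanox_gbps:.2f}/Gbps")  # $5.00/Gbps
print(f"TB3:      ${tb3_cost / tb3_gbps:.2f}/Gbps")            # $1.00/Gbps
```

On these numbers TB3 is about 5x cheaper per gigabit, though the Mellanox fabric offers 5x the aggregate bandwidth per node.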
 
... Thus the GPU has to be built specially to route its pixel output back through the system bus, and in doing so saturates the system bus with a constant stream of pixel data that, without TB, would have been isolated in a direct cable connection from GPU to monitor.

That's inaccurate: Thunderbolt gets video from a dedicated DisplayPort feed on the motherboard, so the video signal streamed from the GPU doesn't mix with PCIe data (I assume that's what you mean by system bus).

Because Apple insists on TB, and Intel requires video output to be included (or it's not TB), Apple has to custom design its own GPU cards. They can't just take an AMD/NVIDIA spec design and jam it into an Apple-branded box, because those spec designs don't include TB. So yeah, AMD and NVIDIA have new chips, but Apple has to create new GPU cards that use those chips with TB video output.

False too: any video card with a DisplayPort-compliant output can feed a Thunderbolt 3 controller (either Alpine Ridge for TB3 or Falcon Ridge for TB2). Proof of this: Gigabyte motherboards with integrated Thunderbolt, which have a DisplayPort input port to connect to the GPU's (video card's, in your words) output on the back of the computer.


TB makes sense for small form factor devices where physical space for ports is limited and pixel bandwidth requirements are relatively low. But for a desktop, it makes more sense to let the GPU connect directly to the monitor so you can drive three 4k displays, or two 5K displays, without killing your system bus.
Moot, since Thunderbolt video doesn't count against PCIe data bandwidth. What Thunderbolt does is take up USB's long-missed promise to replace all PC peripheral ports with one unique universal cable.

I won't even get into the issue of having TB and USB 3.1 ports and cables all look physically identical except for some tiny pictographs to differentiate them. I can't wait for all the problem calls coming to tech support because someone is trying to use a USB-C cable as a TB connector, and vice versa. Or worse, the wrong cheap-o cable that ends up causing physical damage because some poor soul put the wrong cable in the wrong port.

FYI, TB3 cables light up the connector when plugged into a TB3 port, not just a basic USB-C port (USB 3.1 and DisplayPort only).

Further FYI: all new standard USB-C ports will be dual purpose, providing at least USB 3.1 or DisplayPort video output, along with power delivery.
 
I assume Jony Ive today sits doing CAD on a Windows machine with Nvidia cards and multi-core Intel CPUs
Ive works with pencil & sketchbook... his CAD monkeys use Alias and Rhino (Mac versions, on Macs). I assume their in-house prototyping/CNC goes through Windows with Windows-based software at some point.

Not sure about their renders, but there's no reason they aren't being done on Macs, unless they're outsourcing them (not probable, due to secrecy etc).
---
edit- or maybe Apple has an in-house render farm (with non-Mac components)? Seems likely, but I don't actually know.
But all of the renders I see from Apple could most certainly be done strictly on Macs with OS X-based software.
 
MacBook Air users are the ones looking at switching to an iPad Pro, not Mac Pro users.

Depends. I traded my iPad for a MacBook Air. I've always had a 15 inch MacBook Pro and wanted something smaller to carry around. So I gave an iPad a try for over a year. In the end the iPad was far too limiting and at times incredibly frustrating to use to get any actual work done.

By the time I added a keyboard to the iPad it was as big or bigger than a MacBook Air and still far less productive.

Copying and pasting between multiple documents or emails is a massive PIA on the iPad. Heck, adding more than one attachment to an email on the iPad is a joke, and opening multiple documents at the same time in the same app, like Pages, is impossible. The virtual keyboard doesn't even have arrow keys, which makes navigating a fiddly, inaccurate exercise. And the list goes on.

The iPad is good at consuming content, and slightly more productive than doing work on an iPhone 6+, but it's not a replacement for a real computer. Not by a long shot, so I have no idea what they are smoking over at Apple these days.

Regardless, I ended up buying the new small iPad Pro model, but not as a 'computer' or Mac replacement. I bought it to draw on, because it's cheaper than the Wacom equivalent. It rides in my bag as a supplement to a laptop Mac. Now, if Apple gave me an OS X Surface I could carry around one device instead of two, but that would mean one less sale for Tim, and they would have to eat crow and admit that, conceptually, Microsoft was right.
 
Yes, I find it hard to believe that nobody at Intel saw that tying T-Bolt (a PCIe expansion bus) to video outputs was incredibly stupid.
You know what's insanely stupid? After at least three decades of digital video displays (LCDs), we still need to segregate video data from non-video data instead of merging everything into a wide enough universal data bus (more or less what USB promised at launch).

Neither Thunderbolt nor USB-C is a universal data bus; they are just means to share the physical network layer.

The ideal is to have a unique universal peripheral data bus where all the digital data flows concurrently to its respective peripheral, but there are a bunch of industry interests delaying this as long as possible (the video/entertainment industry seems to be the one that hates this idea most).
 
That's inaccurate: Thunderbolt gets video from a dedicated DisplayPort feed on the motherboard, so the video signal streamed from the GPU doesn't mix with PCIe data (I assume that's what you mean by system bus).

False too: any video card with a DisplayPort-compliant output can feed a Thunderbolt 3 controller (either Alpine Ridge for TB3 or Falcon Ridge for TB2). Proof of this: Gigabyte motherboards with integrated Thunderbolt, which have a DisplayPort input port to connect to the GPU's (video card's, in your words) output on the back of the computer.

Moot, since Thunderbolt video doesn't count against PCIe data bandwidth. What Thunderbolt does is take up USB's long-missed promise to replace all PC peripheral ports with one unique universal cable.

FYI, TB3 cables light up the connector when plugged into a TB3 port, not just a basic USB-C port (USB 3.1 and DisplayPort only).

Further FYI: all new standard USB-C ports will be dual purpose, providing at least USB 3.1 or DisplayPort video output, along with power delivery.

I'll stipulate you may be correct here, but some of what you are claiming does not jibe with the observed behavior of the MP 6,1; in particular, I/O throughput dropping significantly when display bandwidth increases with high-res displays.

Also, if you look at HP workstation TB support, it requires a PCIe TB card in tandem with a GPU that supports it. In that case, how does the pixel data from the GPU get to the TB PCIe card if not through the system bus? Not trying to be snarky here, just asking.

You may be correct that motherboards may be designed with separate video lanes to TB3 connectors, and the appropriate switches to allow direct video access and general PCIe data I/O depending on what is connected, but I haven't seen such designs on workstation-class motherboards. Do you know of an example of one?
 
I'll stipulate you may be correct here, but some of what you are claiming does not jibe with the observed behavior of the MP 6,1; in particular, I/O throughput dropping significantly when display bandwidth increases with high-res displays.

Also, if you look at HP workstation TB support, it requires a PCIe TB card in tandem with a GPU that supports it. In that case, how does the pixel data from the GPU get to the TB PCIe card if not through the system bus? Not trying to be snarky here, just asking.

You may be correct that motherboards may be designed with separate video lanes to TB3 connectors, and the appropriate switches to allow direct video access and general PCIe data I/O depending on what is connected, but I haven't seen such designs on workstation-class motherboards. Do you know of an example of one?
The PCIe card is a PCIe x4 one, and it has a cable to the chipset for the TB info. The video is a DisplayPort loopback cable, like the old Voodoo video cards.
 
You know what's insanely stupid? After at least three decades of digital video displays (LCDs), we still need to segregate video data from non-video data instead of merging everything into a wide enough universal data bus (more or less what USB promised at launch).

Neither Thunderbolt nor USB-C is a universal data bus; they are just means to share the physical network layer.

The ideal is to have a unique universal peripheral data bus where all the digital data flows concurrently to its respective peripheral, but there are a bunch of industry interests delaying this as long as possible (the video/entertainment industry seems to be the one that hates this idea most).

Why would I want that? What you describe would be great as soon as someone invents the infinite-bandwidth bus. But here and now all buses have a maximum bandwidth, and with a single 5K display constantly requiring more than half of TB3's bandwidth cap, that seems like a pointless amount of contention for a limited resource. And 8K monitors (yeah, they are coming) will need TB4. If I can separate pixel output from disk, network, and peripheral I/O, why would I not? I'd rather not have my RAID array's throughput suffer because I have three 4K/5K displays connected.
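The "more than half" claim checks out on paper. A quick sketch, assuming uncompressed 24-bit color at 60 Hz and ignoring blanking/encoding overhead:

```python
TB3_GBPS = 40  # nominal Thunderbolt 3 link rate

def panel_gbps(width, height, refresh_hz=60, bits_per_pixel=24):
    """Raw pixel bandwidth in Gbit/s, ignoring blanking/encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

five_k = panel_gbps(5120, 2880)
print(f"5K @ 60 Hz: {five_k:.1f} Gbps = {five_k / TB3_GBPS:.0%} of TB3")
# 5K @ 60 Hz: 21.2 Gbps = 53% of TB3
```

So a single 5K panel eats roughly half the link before any disk or network traffic is counted.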
 