I am in the 3D art world, and we are all about cores and RAM - there is NEVER enough.
Some 3-D libraries are still single-threaded, including those required for boolean operations among solids (while some people have tried to rebuild them multi-threaded, the results are still alpha and very buggy). Single-thread speed will still be relevant for CAD for a while.
Reading comprehension. Where did I knock them about it? Only YOU read this in my post.

Is this your opinion, or educated opinion? ;)

Samsung started 7 nm volume manufacturing. TSMC is in 7 nm volume production TODAY. Nvidia MIGHT release next-generation GPUs on this very process. Nothing here is confirmed. The current rumor is that next-gen Nvidia GPUs are coming out in July. 7 nm Vega is coming in Q4 2018. GloFo should start 7 nm HP VP in July-August. And 7 nm CPUs should come in Q1-Q2 2019 from AMD on this node.

Where does this: "nobody will have 7 nm in quantity by at least 4 years" come from?
Let's see these products hit the street... I mean, 7 nm will only be available for a while for power (watt) critical applications, such as mobile CPU/GPU/RAM, but even flash may wait a while until it matures.
 
Those folks complaining have a very, very limited definition of what "workstation" software is. I am in the 3D art world, and we are all about cores and RAM - there is NEVER enough.
Thanks for bringing up an application that does scale to many-core systems.

What tools do you use, and how do you know that they'll actually scale to high core-count processors? And by "scaling" I mean that the output bandwidth (or fps or whatever) scales with CPU count. Just having all of the cores busy doesn't mean anything if they're spinning on thread synchronization.

And note that Apple OSX only supports 64 threads, so you'd have to turn off "hyper-threading" on that 64-core server processor. With Apple focused on iOS, what chance do you think that Apple will do the hard work to move beyond 64 threads? (Note that Windows supports 64 *sockets* and an unlimited number of threads.)

Amdahl's Law is never your friend.
amdahl.png
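For reference, Amdahl's Law caps the speedup of a workload whose parallel fraction is p at 1 / ((1 - p) + p/n) on n cores. A minimal sketch (the fractions and core counts are illustrative, not from any benchmark):

```python
# Amdahl's Law: with parallel fraction p and n cores,
# speedup = 1 / ((1 - p) + p / n)

def amdahl_speedup(p: float, n: int) -> float:
    """Upper bound on speedup for parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 95%-parallel workload tops out at 20x, no matter the core count:
for n in (8, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 1))
```

This is why "all cores busy" is not the same as "scaling": the serial 5% dominates long before you reach high core counts.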
 
Apple is building systems around Thunderbolt.
How would AMD be able to fulfill this requirement?
Could this help?

Intel will open-release Thunderbolt 3 spec in 2018

"Intel plans to make the Thunderbolt protocol specification available to the industry under a nonexclusive, royalty-free license. Releasing the Thunderbolt protocol specification in this manner is expected to greatly increase Thunderbolt adoption by encouraging third-party chip makers to build Thunderbolt-compatible chips." -- eenewsembedded.com
 
Probably not, unless the open license drops the idiotic requirement to put DisplayPort signals on a peripheral bus. T-Bolt would have taken off by now if the DP signals weren't required.
 
Thanks for bringing up an application that does scale to many-core systems.

What tools do you use, and how do you know that they'll actually scale to high core-count processors? And by "scaling" I mean that the output bandwidth (or fps or whatever) scales with CPU count. Just having all of the cores busy doesn't mean anything if they're spinning on thread synchronization.

And note that Apple OSX only supports 64 threads, so you'd have to turn off "hyper-threading" on that 64-core server processor. With Apple focused on iOS, what chance do you think that Apple will do the hard work to move beyond 64 threads? (Note that Windows supports 64 *sockets* and an unlimited number of threads.)

Amdahl's Law is never your friend.

As far as Apple, I am preparing to transition to Windows. I don't have time for Sir Idiot Boy and the other two stooges. Once I move the computer, the phone and tablets will follow. Mr. Bean Counter has made it much easier to leave the ecosystem, as he has killed off anything that doesn't provide 40% margins.

Workflow as follows.....

Modeling - ZBrush Or Shade 8 - One day, I'll get the hang of Blender.
Scene setup - Poser 11.1
Daz Studio - for exporting DS content out of DS and into a real program.
Outdoor Scenes - Vue
Photoshop 5 - Postwork
Acrobat - for building the PDFs.
Render Engines - Firefly, Superfly (Still learning this one)

As far as Apple, I am preparing to move back to Windows - I hear it has changed a bit since the last time I used it at home. (Windows 3.1) All I have left to do is decide on an iTunes replacement.
 
I've been using Apple since System 6, depending on it these days for music production, video, and graphic design. Like so many here, I'm now actively preparing to switch to Windows.

Here's all I need from a pro machine; I will purchase whatever ticks the boxes:

(1) Stable OS, so that I'm spending my time working, not troubleshooting
(2) RAM that can grow with my needs, and the ever-increasing demands of pro apps and workflows
(3) GPU that doesn't feel like a compromise, and that can be swapped out as more powerful (and time saving) options emerge, with full support for NVIDIA.
(4) Plenty of USB and Thunderbolt ports so I can use my peripherals without dongle hell.
(5) Plenty of internal storage bays, so my desk is not covered with external drives.

Apple still has an edge on point 1. Windows is currently the only way I get 2,3,5, and (on a laptop) 4.

Keep posting to this thread; help it turn torches-and-pitchforks enough that the 2019 Pro design team might notice.
 
This. It’s also about multitasking while waiting for a sim or render to finish.

Truth is, it's mainly about the sweet spot... which is where the likes of Threadripper, Xeons, and i9s shine: fast enough low-core-count speed plus multi-core speed that leaves the 4/6-core parts in the dust.

Also, Threadripper x 2 sockets... c'mon AMD.

Fair enough, but what about come on Adobe and come on Apple?

There is a huge gap between some of the major applications that utilize multithreading and the many that don't.
As for OSX, multitasking still seems to be restricted to slowing down everything randomly, while keeping iTunes and Spotlight running at all costs. Hyperbole, perhaps. ;)

In that respect, Aiden makes a good point - who needs high core counts?

Or rather, who is finally going to provide software optimized for multithreading across the board, integrate it with affordable high-performance GPUs, lower the power requirements, and make multitasking a thing that works without throwing lots of money at it to be prepared for any use case?

Who will provide solutions that don't commit to one way of doing things to the detriment of another?

It's all well and good to discuss future technology or even the most advanced current tech, but it has no impact whatsoever on the present unless affordable mainstream products are on the shelves.

It's not progress when you pay more to get more; that's not even stagnation.

Considering inflation and the increased cost of living beyond that, it could be argued that today we get the worst value in computing since the pre-CD-R era.
At least as far as Macs are concerned, and current OSX versions.
 
It's funny that people on this forum are starting to call for more optimized software, when I was calling for this, what... 2 years ago? 3 years ago?

What happens when you start optimizing software? You can save TONS of money on hardware. Like this:

cifar10_average.png


Comparison of performance in TensorFlow between GPUs.
A Volta V100 is $9000. A Vega is $1200, and delivers 85-90% of the V100's performance.

You can buy 6 Vega GPUs for the price of a single V100, just by optimizing your software. Which will have a better impact on your job?

But hey. Nobody cares about saving money on this forum.
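Taking the quoted numbers at face value (the prices and the 85-90% figure come from the post above; throughput is in arbitrary units), a quick performance-per-dollar sanity check:

```python
# Prices and relative performance as quoted in the post above;
# throughput is normalized so the V100 = 1.0.
v100_price, vega_price = 9000, 1200
vega_rel_perf = 0.85  # lower end of the quoted 85-90% range

v100_perf_per_dollar = 1.0 / v100_price
vega_perf_per_dollar = vega_rel_perf / vega_price

print(f"Vega perf/$ advantage: {vega_perf_per_dollar / v100_perf_per_dollar:.1f}x")  # ~6.4x
print(f"Vegas per V100 budget: {v100_price // vega_price}")  # 7
```

Even at the low end of the quoted range, the Vega comes out more than six times ahead on performance per dollar, which is the point being made about optimization paying for hardware.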
 

I'm not an expert by any standard, as one can tell from my posts, but even I have been aware of those issues for ages, and they have been thoroughly discussed ever since.
Certainly since the Intel switch, when comparisons were made easier.

Where the rubber hits the road, there is so much traction lost... never mind the car analogies... ;)
 
I switched to Windows some time ago, and thanks to the hardware I can use, I extract all of the performance out of it, which is why my workflow finishes 30-40% faster than when I was on Apple hardware.

Yes, I do have an 8-core/16-thread CPU, for which I paid $240, a mobo, for which I paid $80, and 32 GB of RAM, for which I paid... $400. But all that saved me TONS of time and money I can spend on... making more money, without complaining about how rubbish the performance of Apple's software ecosystem is.

When I showed my friend macOS lately, he said: "It's nice, and cute, but why is it so slow doing anything on it?" Aaaand we were doing video editing using Adobe Premiere.
 
Let's see these products hit the street... I mean, 7 nm will only be available for a while for power (watt) critical applications, such as mobile CPU/GPU/RAM, but even flash may wait a while until it matures.
Aaaaand, Lisa Su in a conference call has just said two things:

Zen 2 is a LOT faster than the previous generation. She sounded almost SMUG when she said that. I wonder why...
Second thing: server CPUs are coming BEFORE consumer CPUs. What this means: server Ryzen CPUs are most likely going to be made on the 7 nm node from... TSMC.

AMD also has TSMC volume for GPUs, both 16 nm and 7 nm. Wouldn't it be funny if Vega 12 was actually 16 nm TSMC?

I guess we will all be laughing very hard at Intel next year ;). Losing a 10-year lead in just two years requires massive imbeciles as your managers.
 
And for Apple that is: to lose 20 years lead in just four years.

Ok, a little exaggeration there... :p
Yay! Single-source hardware lock-in! Let's all cheer for this!

To show you all the reality that Intel is facing, and what we, consumers, will get because of the 10 nm fiasco:

This year we will get Coffee Lake-R(efresh): Core i3 - 4C/8T, Core i5 - 6C/12T, Core i7 - 8C/16T CPUs.

Next year: we will get another round of "Lake" CPUs based on the Skylake architecture and a 14 nm+++ process (Coffee Lake is 14 nm++), because Intel cannot get 10 nm to work for 2019.
https://www.semiaccurate.com/2018/05/14/another-body-of-water-is-forming-in-front-of-our-eyes/
Charlie has been sounding the alarm about the 10 nm fiasco for the past 2-3 years. Credit goes to him for that.

And then, Icelake-S CPUs are at this moment staged for a 2020 release. That is IF Intel is able to push the yield high enough that the process is profitable in any way. Believe me, guys: if they were able to get any money back on each 10 nm wafer, everything would be fine and dandy.

But they cannot. They lose money on each and every 10 nm wafer they produce, and they are not able to make this process work.

Apple is the most affected by this, and will be.
 
Wow, that's a lot of drama.

In just the last few years, Apple has skipped the E5-x6xx family, the E5-x6xx v3 family, and the E5-x6xx v4 family.

Yet the money from iOS devices keeps on flowing.
Because of Apple's vertical integration, the only way you can make content for iOS (apps) is by using Apple computers.

Devs are not using Macs solely for iOS/Mac programming. They earn money on other platforms. If other platforms allow them to make the same amount of money in half the time, who will win this "unstoppable force, immovable object" fight?

Imagine a few years in the future, when Intel cannot manufacture the hardware it promised, and AMD can. What would you buy?

A MacBook Pro with a 6C/12T CPU that clocks up to 4.3 GHz and 16 GB of RAM, or a Lenovo/Dell/HP laptop with a 16C/32T CPU and 32 GB of RAM, upgradeable RAM and SSD, that stays on battery for 9 hours?

This is the reality we face, guys. Unless Apple switches to AMD, this is what will happen to Apple's ecosystem. Not only the Mac Pro will be outdated.
 

ssgbryan said:
"Once I move the computer, the phone and tablets will follow."

Yes, Tim Cook has a tin ear and doesn't realize that losing the desktop market could sap their cash-cow portable computing market too.

I don't think the Apple engineering team reads the posts here. No one there cares what the consumer thinks. King Cook has already pronounced, in public events, that the iPad Pro will never be wedded to a docking laptop like the Microsoft Surface series. Microsoft is now officially more innovative than Apple! As a former Apple fanboy, it pains me to state that!
 
To show you the level of stupidity inside Intel's management:

They are betting that AMD will not get 7 nm in Volume till 2020.

I am laughing my a** off right now. How ignorant, stupid, and full of themselves can you be before you see that your f****** ship is sinking?

Lisa has just told JPMorgan that 7 nm is coming in volume from Q4 2018 (supposedly she was talking about 7 nm Vega). THIS YEAR! I cannot believe this situation.

AMD has 2 sources of 7 nm, GloFo and TSMC, and at both of those fabs they are going to manufacture both CPUs and GPUs. They will decide where to buy capacity at the last moment.
 
Links to *all* of those claims?

You have no credibility if you don't cite your sources.
If there were anything remotely interesting apart from the screenshot, I would post it.
It would be more than remotely interesting to know the application being benchmarked, the OS/software/driver versions, ...
 
I don't care that you are not registered for AMD investor relations calls. Everything is here.
Then simply provide the links with your posts. Very simple way to establish credibility.

And investor relations calls are public info - obviously, since you just (belatedly) provided the links.

And please don't think that we're stupid enough to believe that everything that AMD's investor relations team says on a call is the whole truth. Any investor relations team will be masters of half-truths and spin.
 