Yes, I know, but this isn't the same as storing electricity that is already being produced. And any excess electricity produced here is better used by selling it to our southern neighbours in Vermont, Maine, and New York, since we in the more northern regions aren't affected by any lack of water in our reservoirs...
You must be talking about real-time storage. Yes, very difficult. We wrestle with these problems here in Hawai'i, looking for the best answer. We're trying hard to cut down our consumption of fossil fuels, all of which have to be shipped here.

We have a lot of residential solar and that means that at unpredictable intervals, a lot of energy is going into the system. I appreciate the technical difficulties, for sure.
 
Well, I'm still waiting...............
 

Well, obviously, let's use up all of the rest of the oil we have on this planet today and not think about what will happen tomorrow.

Let's not care about the fact that oil is running out on this planet, and that in just 100 years, by the mid-'80s of the last century, we had used 60% of the remaining available oil supplies this planet could give us. What will we power our jet engines with? How will we travel?

How will the energy networks look when 95% of the world's power comes from electricity? How will we power all of the devices that will be connected? Have any of you who try to downplay efficiency thought a little about the consequences?

Let's not go completely off the reservation. Energy is essential, and we currently use energy sources that are finite, but as technology advances we'll get better at renewables and essentially infinite sources. It's not out of this world to think that within most of our lifetimes we'll see some form of energy captured in space and beamed back to Earth. This isn't the end of the world, man. We're smart; don't freak out and shove this BS down our throats.
 
http://semiengineering.com/the-zen-of-processor-design/
First line of the interview: That’s one thing that is actually fundamental. It’s about performance per watt, performance at a given energy level. It affects everything from PCs and datacenters to IoT devices and phones. The faster you get a task done, the more performance you have. As soon as that task is done, you can return down to a zero state of energy dissipation. The more efficient processing you can implement into your design, the more you are improving your energy efficiency.

I genuinely suggest reading the interview. It points in the direction the whole industry is going.
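The quoted point about returning "to a zero state of energy dissipation" is the race-to-idle idea: a faster part can win on total energy even at higher peak power, because energy is power times time. A minimal sketch with purely invented numbers:

```python
# Hypothetical illustration of "race to idle": energy = power x time,
# so a core that burns more watts but finishes sooner and then sleeps
# can use less total energy over a fixed window. All numbers invented.

def task_energy_joules(active_power_w: float, task_seconds: float,
                       idle_power_w: float, window_seconds: float) -> float:
    """Total energy over a fixed window: run the task, then idle."""
    assert task_seconds <= window_seconds
    active = active_power_w * task_seconds
    idle = idle_power_w * (window_seconds - task_seconds)
    return active + idle

# Slow core: 2 W for 10 s. Fast core: 5 W, but done in 3 s, then ~0.1 W idle.
slow = task_energy_joules(2.0, 10.0, 0.1, 10.0)  # 20.0 J
fast = task_energy_joules(5.0, 3.0, 0.1, 10.0)   # 15.0 + 0.7 = 15.7 J
print(slow, fast)  # the higher-peak-power core uses less energy here
```

This is why "performance per watt" and raw performance aren't opposites in the interview's framing: finishing faster buys idle time.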
 
Mago, you got that info on the possible config from the new GM?
Somewhat good news, but I wish they wouldn't go back on the x16 for each GPU. That's not future-proof; maybe it's not needed now, but it will be.
I'd still like to see SKL-W with an updated PCH (DMI3, PCIe3, the works) instead.
Because when macOS makes full use of both GPUs (and I believe it will, possibly later rather than sooner), we'll be stuck.
With SKL this would be almost a non-issue: more PCIe lanes on the CPU.
 
Gonna go with what software?
Super Mario and Pokemon, please.:)
Another version of Rosetta.

I don't see the software houses rewriting their software to accommodate the ARM architecture, especially given how Apple ignores computers.
So this will run slower, or more powerful CPUs will be needed, which is not ARM's strong point, at least for the moment.
The latest information from the macOS Sierra GM drivers suggests it could be configured as 1 x16 dual-GPU "Polaris XT" + 3 x4 Alpine Ridge TB3 + 2 x4 NVMe.

That's some serious info, and on topic.
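As a quick sanity check on that rumored layout (speculative driver-derived config, not a confirmed spec), the lane arithmetic is simple to total up:

```python
# Lane budget for the rumored config from the post above. The widths
# are taken from the rumor itself; whether any real CPU/PCH split
# could supply them this way is an open question, not a claim here.

links = {
    "dual-GPU (Polaris XT)": [16],     # 1 x16 link
    "Alpine Ridge TB3":      [4, 4, 4],  # 3 x4 links
    "NVMe":                  [4, 4],     # 2 x4 links
}

total = sum(width for widths in links.values() for width in widths)
print(total)  # 36 lanes total
```

At 36 lanes, the config would need most of a high-lane-count CPU's PCIe budget, which is presumably why the earlier posts keep coming back to SKL-W and its extra CPU lanes.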
Well, I'm still waiting...............
Patience is a virtue...
Can you still move your left hand? :)
 
http://semiengineering.com/the-zen-of-processor-design/

Dude, we know. This isn't news. It's just silly to try to legislate it, to put it kindly.
 
I keep telling you guys that ARM is the way Apple is gonna go. But hey, don't listen to a dweeb.

:p

Well, let's pour a dump truck load of salt on those benchmarks... The iPhone 7 was only fast in one of the benchmark tests. Everywhere else it didn't do so well.

It reminds me of the PowerPC people when Microsoft released the triple-core G5-based Xbox. It's triple core! In a small form factor! Why did Apple leave PowerPC when they could have used that chip? It turns out the PowerPC chip the Xbox used was a significantly cut-down version, to get the energy consumption and heat low enough for a console. Much like Apple's ARM chips are significantly cut down compared to Intel's chips in order to fit into a phone.

You could re-add all those features to an ARM chip and bring it up to six cores, but you'd get back to where Intel is in power consumption and heat. As others have pointed out, Intel's chips are a pretty modern architecture under the hood. Intel isn't stupid. There isn't some huge inefficiency in x86 keeping it from competing with ARM on the desktop.
 
Where did I say the next Mac was gonna be ARM based ? Oh. I didn't.
 
...
I have not suggested that the current A-series are a good fit for MacOS devices.

So you want theoretical numbers for a processor that nobody has any physical baseline implementations of? You want a projection on something that Apple is not doing. Hasn't planned to do. And actually has hinted is not a priority at all.

'... And since Apple is doing a fine job with mobile processors, it could conceivably decide to get into conventional chips and bump Intel out of its Mac laptops and desktops. Srouji, of course, won't go there, though he does allow that his team's mission is finite. "If we attempt to do everything on the planet," he says, "I don't think that would be very smart." ...'
http://www.bloomberg.com/features/2016-johny-srouji-apple-chief-chipmaker/

If you want numbers pulled out of your butt, then just make them up. Take the benchmark score, divide by GHz, and then multiply by whatever higher GHz you want to make up. Then take that and multiply by 4. Linear scaling is a theory you can use. It has very little to do with actually modeling computer architecture implementations, but it works fine as a "pull numbers out of my butt" mechanism.
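The recipe above is easy to make explicit, which also makes its flaws visible. A sketch with made-up inputs (the score and clocks below are placeholders, not real A-series figures):

```python
# The "pull numbers out of my butt" estimator from the post, written
# out so its assumptions are in plain sight: it assumes score scales
# linearly with clock speed AND perfectly with core count, neither of
# which holds for real designs (memory bandwidth, thermals, shared
# caches, and parallelization overhead all get in the way).

def naive_projection(score: float, ghz: float,
                     target_ghz: float, target_cores: int) -> float:
    per_ghz = score / ghz                        # score per GHz of clock
    return per_ghz * target_ghz * target_cores   # perfect linear scaling

# A made-up 3000 single-core score at 2.34 GHz, projected onto a
# fantasy 3.5 GHz quad-core part:
print(round(naive_projection(3000, 2.34, 3.5, 4)))  # → 17949
```

The output is a big impressive number, which is exactly the poster's point: the method produces whatever you want, not a prediction.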

I read an article one or two years ago, which stated laptops make up about two-thirds of Apple's computer sales. So we are already there.

No, we are not. The MacBook is a product; I was not using "MacBook" as a generalized concept in my quote. It was directed at that specifically named product. It is the lowest-"horsepower" Mac in Apple's lineup. That could be bumped to ARM. However, it would be shocking if that were even 25% of Apple's laptop sales. It is nowhere near 2/3.

Maybe if you throw in the MacBook Air (which plays the role the older MacBooks used to play as the "most affordable" laptop in the Mac lineup), you might climb into the vicinity of 50%. However, ARM does nothing for the MacBook Pro, so there is no way of getting anywhere close to 2/3.

It remains to be seen how much the iPad Pros are going to eat into the lower edge of the MBA market and the very-low-workload, weight-focused subset of the MacBook market. If anything, Apple should be trying to move the MBA/MacBook higher, out of the way of the iPad Pro, if the Pros start significant cannibalization.
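The share argument above can be laid out as back-of-envelope arithmetic. Every number here is an invented placeholder (Apple does not break out per-model laptop sales), purely to show the shape of the reasoning:

```python
# Hypothetical laptop sales mix, chosen only to illustrate the
# argument: even generously assuming the MacBook plus MacBook Air
# could move to ARM, the combined share lands near 50%, not the
# 2/3 figure that applies to laptops-as-a-whole of Mac sales.

mix = {"MacBook": 0.20, "MacBook Air": 0.30, "MacBook Pro": 0.50}
assert abs(sum(mix.values()) - 1.0) < 1e-9  # shares must total 100%

arm_candidates = mix["MacBook"] + mix["MacBook Air"]
print(arm_candidates)  # 0.5 -- "in the vicinity of 50%", well short of 2/3
```

The two-thirds figure in the quoted article describes laptops as a fraction of all Mac sales, which is a different denominator than ARM-candidate models as a fraction of laptop sales; that mismatch is the crux of the disagreement.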

It seems repetition is the best way to get my point across, so again, I'm not saying the A-series are suitable for Macs. I'm merely speculating about Apple's ability to design processors that are.

Designing phone processors doesn't multiply the team size. Chip designers aren't rabbits in a hutch inside Apple; they don't multiply just because you keep them fed. Apple has a team that is very busy with phones, watches, and now, in part, earphones. The number of custom chips inside the phones is more likely to go up over time. If there is team expansion, it is far more likely to get consumed by that than by a distraction like Macs.

In the x86 market ( if AMD's Zen pans out as early numbers indicate), they have two competitors vying to supply Mac processors. As long as they are doing a good job.... what is the point?

If Apple has mega engineering resources just lying around doing nothing, where are the new Macs? There is a whole lineup of nothing new. Apple does appear to have the ability to throw bodies at empty "green field" lines of development (1,000 people on the car, 100 people on watch bands in the initial ramp-up, etc.), but for mature or maturing product lines Apple seems to pick a "this is big enough" size and then attempts to deal with whatever a group of that approximate size can do.

The phone processor is extremely critical to Apple maintaining its level of penetration in the "expensive phone" market. If Apple lets Qualcomm, Samsung, or some newcomer blow past them, they'll be in bad shape trying to create differentiation. It is highly doubtful Apple will take on cellular radios, so that key component will be shared.


Yes, I'll agree Gruber dropped the ball here (to the point where it looks like he purposely cherry-picked his data to make the A10 look good). But even so, I do think the Geekbench numbers illustrate just how far Apple's processors have come in a relatively short time.

They are implementing techniques that were deeply researched and documented 10-15 years ago. They are going to run out of steam at the same walls as everyone else who already implemented those techniques 5-6 years ago.


The pace at which ARM improves is reminiscent of the leaps x86 made in the late 1990s.

There is little in these A10 numbers but megahertz. To a large extent Apple is simply clocking a single core higher than its competitors for single-core "drag racing" benchmarks. As soon as you get to multicore work, the advantage largely disappears. So as long as you have a workload that uses one or two cores, they run much faster. That is not all that amazing.

It will be interesting to see what happens when people throw single-core workloads at the A10 that last 10-30 minutes (not the 1-2 minute Geekbench windows). I highly suspect the A10 isn't going to show some huge gap at that point, or that the Intel/AMD implementations used in most of the Mac laptop lineup will be left far behind.
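The burst-versus-sustained point can be sketched as a toy model: a phone SoC may hold its boost clock for a short benchmark window but throttle under a long workload, while a laptop part barely throttles. All clocks and durations below are invented for illustration, not measured A10 or Intel behavior:

```python
# Toy throttling model: a chip runs at its boost clock for a limited
# time, then drops to a sustained clock. Average effective clock over
# a run shows how short benchmarks can flatter burst-limited designs.

def avg_clock(boost_ghz: float, sustained_ghz: float,
              boost_seconds: float, total_seconds: float) -> float:
    """Average effective GHz over a run of total_seconds."""
    boost = min(boost_seconds, total_seconds)
    work = boost_ghz * boost + sustained_ghz * (total_seconds - boost)
    return work / total_seconds

short, long_run = 120, 1800  # ~2 min benchmark vs. 30 min workload

phone_short = avg_clock(2.34, 1.6, 180, short)    # never leaves boost
phone_long = avg_clock(2.34, 1.6, 180, long_run)  # mostly throttled
laptop_long = avg_clock(3.0, 2.8, 180, long_run)  # barely throttles
print(phone_short, phone_long, laptop_long)
```

In this sketch the phone holds 2.34 GHz for the whole short run but averages only about 1.67 GHz over 30 minutes, while the laptop-class part stays near its peak, which is exactly the gap the poster suspects sustained workloads would expose.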
 
Koyoot, I respectfully suggest that you're getting hung up on this. Yes, there is absolutely a mind shift towards green, efficiency, economy. These types of products will be adopted by (or forced upon) the masses. However, there always has been, and always will be, those who are at the forefront of production, or who can just f&*(ing afford it, and will want the best!!!! Efficiency or cost be damned. You're going to meet a lot of those folks on this board man. You won't change their minds.
 
We are talking about the HPC market here, and that market will not be affected by those regulations. Efficiency will touch every "typical" computer we know today: NUCs, AiOs, desktops, high-end desktops, workstations. Those will be affected. But not HPC, data centers, compute clusters, or whatever the next big thing in HPC turns out to be.
 
Don't you guys have a volcano around there somewhere you can tap for energy?
 
Where did I say the next Mac was gonna be ARM based ? Oh. I didn't.

So this earlier post happened in an alternative universe?
" .... Not sure but they just licensed ARM..

Laugh all you want but I see ARM in the future for Macs. "
https://forums.macrumors.com/threads/waiting-for-mac-pro-7-1.1975126/page-70#post-23421280

If there is the normal follow-on of an A10X Fusion, then yes, iOS is going to take a bigger chunk out of the lower-end spectrum of Mac laptops if Apple doesn't do something substantive on the Mac side.

But splitting the Mac lineup between ARM and x86 is highly dubious. ARM is embedded in lots of chips, not just CPU packages:


http://www.anandtech.com/show/10684/arm-research-summit-research-roadmap-keynote-live-blog

There is already some ARM in Macs now. Big deal. Apple could do a gimmick demo project; it is probably a reasonable "scare Intel for Halloween" project. What is deeply lacking is any motivation to go through the whole effort of moving the entire macOS ecosystem yet again. "Catching up" has never been the motivator for the previous architecture switches; Apple was already ahead. And Apple doesn't have the internal skill set for Rosetta (even the Rosetta-era version was licensed tech, which Apple is probably not going to get from IBM at bargain prices at this point).

More robust A-series solutions are far more likely to let iOS simply consume the macOS user base than to act as a porting mechanism.
 
What's the difference between "there will be no future Macs" and "there will be no next Mac"? ;)

Seriously - doesn't the failure to update any of the "Mac" line (laptops, desktop, semi-pro workstation) make you worry?
I'm definitely concerned. Luckily most of my workflow is platform-agnostic. I'll sure miss the aesthetics, but no doubt I'll switch if necessary. I reckon updates are coming, though...
 