Igor's Lab believes there's something wrong with the scheduler that can't be fixed with drivers. It's been covered by Linus and Moore's Law is Dead. I don't think Arc's hardware is gimped. They for sure have driver problems though.
We also use Optane in our SAP HANA databases. Oracle was a user of Optane and they would have a customer base of Oracle Cloud systems for it. I guess that it wasn't enough though.
I guess we will know if they send out another die revision for A380 that is different than what is in folks' hands now. Igor's Lab believes there's something wrong with the scheduler that can't be fixed with drivers. It's been covered by Linus and Moore's Law is Dead.
I'd very much like Intel to succeed with Arc. Having an extra competitor in the GPU space would be great. I'm just not sure that they have the institutional fortitude for it. Larrabee followed a similar pattern, with expectations of a gaming card, but it ended up being shoved into the data center. The spiritual forefather of Alchemist, the i740, never made it to version two. Hopefully, Arc breaks the curse. I guess we will know if they send out another die revision for A380 that is different than what is in folks' hands now.
Sure they can, somebody doesn't remember Itanium! By knowledge I mean that there's no way for AMD or Intel to know exactly the systems their solutions are expected to go in. Apple's chip makers know the dimensions of the systems (and even have a say in them), they know ALL the compilers that will be used to write software (and have a say in that), and they're even aware of the specific APIs their solution is expected to support (again, with input into that as well). AMD and Intel will never have that level of knowledge. And they'll never have the flexibility to do something like announce "no more 32-bit instructions." They will always have to support the years and years of cruft that are out there.
Which failed because folks wanted backwards compatibility (which Itanium doesn't offer). Sure they can, somebody doesn't remember Itanium!
I mean they are committed to 3 generations at least, let's see what happens. I think the Intel Arc GPUs will be quite helpful for Intel in the laptop space, especially the integrated and cheap gaming laptop space. I'd very much like Intel to succeed with Arc. Having an extra competitor in the GPU space would be great. I'm just not sure that they have the institutional fortitude for it. Larrabee followed a similar pattern, with expectations of a gaming card, but it ended up being shoved into the data center. The spiritual forefather of Alchemist, the i740, never made it to version two. Hopefully, Arc breaks the curse.
If AMD didn't exist at the time, if Intel REALLY had the market control, that would mean customers would have continued to stick with them as they removed the backwards-compatible stuff and made the replacement more performant, and Intel would be in a better place today with a more streamlined, simplified, and efficient instruction set. As it is, they were forced to backpedal and here we are, at a point where the chip is so complex that it needs assistance from the OS scheduler to hit its full performance. Sure they can, somebody doesn't remember Itanium!
Eh, I think some of that is due to Intel not making consumer IA64 chips like AMD made consumer AMD64 ones. I think Intel believed they could get the server space to move, then maybe the consumer side. It didn't work out that way (clearly). If AMD didn't exist at the time, if Intel REALLY had the market control, that would mean customers would have continued to stick with them as they removed the backwards-compatible stuff and made the replacement more performant, and Intel would be in a better place today with a more streamlined, simplified, and efficient instruction set. As it is, they were forced to backpedal and here we are, at a point where the chip is so complex that it needs assistance from the OS scheduler to hit its full performance.
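Side note on the "needs assistance from the OS scheduler" point: on hybrid parts like Alder Lake the OS genuinely has to know which cores are P-cores and which are E-cores before it can place threads sensibly. Here's a minimal, Linux-only sketch of what that topology looks like from userspace, assuming the cpu_core/cpu_atom sysfs nodes that recent kernels expose for hybrid Intel CPUs (plain affinity pinning for illustration only, not what Thread Director actually does):

```python
# Rough illustration: read the hybrid-CPU topology that recent Linux kernels
# expose for Intel hybrid parts and pin this process to the P-cores.
# Assumes /sys/devices/cpu_core/cpus and /sys/devices/cpu_atom/cpus exist;
# on non-hybrid machines they won't, so we fall back gracefully.
import os

def parse_cpulist(text):
    """Expand a kernel cpulist string like '0-15,20' into a set of CPU ids."""
    cpus = set()
    for part in text.strip().split(","):
        if not part:
            continue
        if "-" in part:
            lo, hi = part.split("-")
            cpus.update(range(int(lo), int(hi) + 1))
        else:
            cpus.add(int(part))
    return cpus

def read_cpulist(path):
    try:
        with open(path) as f:
            return parse_cpulist(f.read())
    except OSError:
        return set()

p_cores = read_cpulist("/sys/devices/cpu_core/cpus")  # performance cores
e_cores = read_cpulist("/sys/devices/cpu_atom/cpus")  # efficiency cores

print("P-cores:", sorted(p_cores))
print("E-cores:", sorted(e_cores))

if p_cores:
    # Pin the current process to P-cores only, which is roughly what a
    # latency-sensitive game thread wants the scheduler to do for it.
    os.sched_setaffinity(0, p_cores)
    print("Pinned to P-cores:", sorted(os.sched_getaffinity(0)))
else:
    print("No hybrid topology found; leaving affinity alone.")
```

Thread Director itself is hardware feedback to the kernel scheduler rather than userspace pinning; the point is just that thread placement matters on these chips in a way it didn't when every core was identical.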
It's gimped. If you think a mobile GPU from Apple plays games perfectly fine, then you clearly have a very loose definition of what a good-performing GPU is. Games designed for Apple Silicon, then yes, perfectly fine. However, I suspect you're tongue-in-cheek referring to game availability, not the functions of the actual silicon. If the hardware were gimped, then they'd be in the same situation that Intel is apparently facing with Alchemist.
Until we have a native AAA title on macOS with Metal 3 we cannot judge gaming perf on any M1. Resident Evil Village should give us a GREAT example of how Apple's GPUs perform when they are not held back by old OpenGL or Rosetta translation. It's gimped. If you think a mobile GPU from Apple plays games perfectly fine, then you clearly have a very loose definition of what a good-performing GPU is.
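On the Rosetta point: if you want to confirm whether the process you're in is running natively or under translation, macOS exposes that through the sysctl.proc_translated key (Activity Monitor's Kind column shows the same thing per app). A quick sketch, assuming macOS and the stock sysctl CLI:

```python
# Quick macOS-only check: is this process running under Rosetta 2 translation?
# Reads the sysctl.proc_translated key via the stock `sysctl` command.
import subprocess

def running_under_rosetta() -> bool:
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip() == "1"  # 1 = translated, 0 = native arm64
    except (subprocess.CalledProcessError, FileNotFoundError):
        return False  # key missing (Intel Mac) or not macOS at all

print("Running under Rosetta:", running_under_rosetta())
```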
Right, what I said was, "If AMD didn't exist at the time". And, if AMD didn't exist, they wouldn't have been making consumer AMD64 anything. Without AMD, the industry would have been pissed with Intel, sure, but there literally wouldn't have been any other way forward and they'd just make the best of Intel's offerings. Just like the industry made the best of the Skylake situation. Eh, I think some of that is due to Intel not making consumer IA64 chips like AMD made consumer AMD64 ones. I think Intel believed they could get the server space to move, then maybe the consumer side. It didn't work out that way (clearly).
Apple's GPUs are capable of very good performance for their power envelope. Expecting them to perform on par with GPUs with several times their power draw is clearly unreasonable (particularly on code targeting the PC offerings). It's gimped. If you think a mobile GPU from Apple plays games perfectly fine, then you clearly have a very loose definition of what a good-performing GPU is.
On one hand it is cool that the M2 can outperform the 680M; on the other, since it has more ALUs it seems like it should outperform the AMD part, right? Until we have a native AAA title on macOS with Metal 3 we cannot judge gaming perf on any M1. Resident Evil Village should give us a GREAT example of how Apple's GPUs perform when they are not held back by old OpenGL or Rosetta translation.
Apple's GPUs are good. They're on par with RDNA 2 iGPUs.
It's gimped. If you think a mobile GPU from Apple plays games perfectly fine, then you clearly have a very loose definition of what a good-performing GPU is.
We can't do a proper comparison with both these chips until the M2 has a game that is fully optimised for the API. On one hand it is cool that the M2 can outperform the 680M; on the other, since it has more ALUs it seems like it should outperform the AMD part, right?
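For a rough sense of why the ALU count alone doesn't settle it: a back-of-the-envelope theoretical FP32 comparison using commonly cited figures (1,280 ALUs at roughly 1.4 GHz for the 10-core M2 GPU, 768 shaders at up to roughly 2.4 GHz for the 680M; treat both clocks as approximate):

```python
# Back-of-the-envelope theoretical FP32 throughput (ALUs * 2 FLOPs/clock * clock),
# using approximate, publicly reported specs -- not measured game performance.
def fp32_tflops(alus, clock_ghz):
    return alus * 2 * clock_ghz / 1000  # 2 FLOPs per ALU per clock (FMA)

m2_gpu = fp32_tflops(alus=1280, clock_ghz=1.4)  # 10-core M2 GPU, ~1.4 GHz
rx680m = fp32_tflops(alus=768, clock_ghz=2.4)   # Radeon 680M, ~2.4 GHz boost

print(f"M2 (10-core GPU): ~{m2_gpu:.1f} TFLOPS")  # ~3.6 TFLOPS
print(f"Radeon 680M:      ~{rx680m:.1f} TFLOPS")  # ~3.7 TFLOPS
```

On paper they land in the same ballpark despite the ALU gap, because the 680M clocks much higher. So the real results come down to drivers, the API, and how well the title is optimised, which is exactly the Metal 3 question.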
We can't do a proper comparison with both these chips until the M2 has a game that is fully optimised for the API.
That's why RE: Village is going to be a good benchmark, because it will be made using Metal 3. So we can finally see just how good the GPU in the M2 is compared to the 680M.
It's interesting just how close that was to happening. According to Cliff: Right, what I said was, "If AMD didn't exist at the time".
What else other than opteron would we have worked on? I mean, the *original* K8 was a whole other thing, but we had no choice but to do what became Opteron because almost the entire design team resigned and what you had left were around 15-20 folks between circuit design, logic design, and architecture. And even Opteron (which we called sledgehammer) almost didn’t happen - there was a dinner at La Papillon that ended in a vote. The alternative was we go work elsewhere.
So, most of the design team left AMD, only 15-20 were left, and they had a vote whether to stay. In some alternate timeline, we're all using Itanium, like it or not. Anyway, what would have happened if Opteron wasn't a thing? Well, either Merced/Itanium would have succeeded, for lack of alternative, OR PowerPC/Power would have succeeded, or maybe AMD would have done something else that would have pulled the industry along (there is some possibility it could have been ARM - it would have HAD to have been some form of RISC since we didn't have manpower to do something huge. But whatever it would have been would have required Microsoft's buy-in. That is often overlooked - we would never have done Opteron if Microsoft didn't agree to put Windows on it.)
Maybe we would have been better off if x86 had died back then. It's interesting just how close that was to happening. According to Cliff:
So, most of the design team left AMD, only 15-20 were left, and they had a vote whether to stay. In some alternate timeline, we're all using Itanium, like it or not.
Perhaps, we'll never really know. Cliff says that Intel "let the HP PA-RISC guys make a science project" and the end result was Itanium. Given his time at Intel's competitors, I understand why he's not a fan of team blue, but we'll ultimately never know. The market simply wouldn't accept non-x86 mass-market CPUs at that time because AMD provided an alternative, and it's only now that we're getting some measure of competition with a different ISA. It's reminiscent of the 80s, when there were multiple competing designs. Maybe we would have been better off if x86 had died back then.
You could probably make toast and eggs on the lid of that laptop! And on battery power!
Curious, how does Optane compare with RAM or SSDs power-wise? Both of those get bloody hot under high throughput. Is it actually worse?
Larrabee was actually a supercomputer chip first and foremost, something intended to compete with the then-new phenomenon of GPGPU in supercomputing. The ability to repurpose the chip as a GPU that would rasterize graphics was kind of a side project / goal. Larrabee followed a similar pattern, with expectations of a gaming card, but it ended up being shoved into the data center.
I've never worked for Intel or its competitors and I'm more than willing to just say it: the Itanium ISA was extremely stinky dog poop. It was so bad it just doesn't seem likely it could have taken over from x86 on technical merits, even though x86 itself is not an ideal ISA. Perhaps, we'll never really know. Cliff says that Intel "let the HP PA-RISC guys make a science project" and the end result was Itanium. Given his time at Intel's competitors, I understand why he's not a fan of team blue, but we'll ultimately never know.