
TopToffee

macrumors 65816
Jul 9, 2008
1,070
992
YES! very much so.
APPLE just lost it in the corporate world. BIG BUSINESS is very much rooted in Intel and AMD. NOT ARM.
Not good to trust an Asian chip manufacturer for all your business. Plus, build secrets always get stolen.

AMD and INTEL will surpass and fly by this whole 3nm ARM thing for sure.

Then the whole Mac line is doomed.

Windows 11 and the PC take over. And Linux for servers only.
Wait a minute… aren’t you the same guy who thinks eating only lean protein is a good idea?
 

Ignacio Russo

macrumors newbie
Jan 10, 2021
21
2
The power is there, but AMD and even possibly Intel will catch up in time. I fear the future market fragmentation, with developers having to develop specifically for Apple Silicon ARM and just not having the time to do so.

Not to mention that games for Apple Silicon are just not a thing, and realistically probably never will be, since Apple and gaming just don't work together - and gaming is a huge part of the PC market.

Even the new 10nm Intel CPUs will be much better than before, and AMD is already doing great in raw power.

The idea of Apple controlling both software and hardware is great, something they've been trying to do for decades, but the big question is how good the support from developers will be.

I look forward to the power, but I'm just not so sure about the future.

I am a complete noob and have no idea what I'm talking about in this area, but I'm just wondering what other people here think.
Competition forces the industry to improve.
 
  • Like
Reactions: bobcomer

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
Competition forces the industry to improve.

Apparently not fast enough for Apple. AMD provided strong competition but Intel was happily making chips appropriate for toaster ovens. Apple is in the picture and they're still doing that. And now AMD is joining in on the fun.
 

TrueBlou

macrumors 601
Sep 16, 2014
4,531
3,619
Scotland
Can’t say I’m particularly worried about the future of Apple Silicon. Switching from x86/64 wasn’t a decision made lightly, and certainly not one Apple would have made if they weren’t confident about the future of the platform.

There’s no doubt whatsoever that both Intel and AMD will make great improvements to their product roadmap over the next several years. But while they’re doing that, Apple won’t be resting on their laurels.

Ever since their first in-house design with the A series, Apple have pushed hard and fast every year to improve upon it, to the point where, in just seven or so years, they’ve been able to produce a platform that outperforms the Intel chips it replaces, often by quite a margin. That it does so at such a low TDP as well is quite an impressive feat.

One thing that’s for sure: Apple switching to their own design has quite possibly accelerated the plans of the other chip companies. The winners out of all of this, regardless of the platform you choose, will be consumers, who should get better, more powerful and much more efficient silicon in the years to come.

It’s going to be an interesting journey.
 
  • Like
Reactions: JMacHack and pshufd

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
This is wrong on many levels. If anything, C is about “trust the programmer”, while C++ will prevent the programmer from doing certain quite stupid things.

What will C++ prevent the programmer from doing? Last I checked, you can cast anything to anything and do anything you want with pointers.
 
  • Like
Reactions: lcubed

Slartibart

macrumors 68040
Aug 19, 2020
3,145
2,819
What will C++ prevent the programmer from doing? Last I checked, you can cast anything to anything and do anything you want with pointers.
But not with references (which are aliases for variables and refer to the same memory location as the variable they name). And C++ will prevent other things: it strictly enforces that all functions must be declared before they are used, you cannot implicitly assign from a void* to any other pointer type, and so on.
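
Something like this little sketch (just my own illustration, assuming any reasonably recent C++ compiler; the names are made up):

Code:
#include <cstdlib>

int main() {
    int a = 1, b = 2;

    int &ref = a;        // a reference must be bound when it is declared...
    ref = b;             // ...and assigning through it changes a; it does NOT rebind ref
    // int &unbound;     // error: a reference cannot be left unbound

    void *raw = std::malloc(sizeof(int));
    // int *p = raw;     // error in C++ (fine in C): no implicit void* -> int* conversion
    int *p = static_cast<int *>(raw);   // C++ makes you spell the cast out
    if (p == nullptr)
        return 1;                       // allocation failed
    *p = 42;

    int result = a + *p;                // a is now 2, so result is 44
    std::free(p);
    return result;
}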

We might agree that C is a hands-on language that we can program in whichever way we want, while C++ adds high-level object-oriented constructs that help with writing high-level programs.

Thus, if one can say C is easy, then C++ is even easier to code.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
But not with references (which are aliases for variables and refer to the same memory location as the variable they name). And C++ will prevent other things: it strictly enforces that all functions must be declared before they are used, you cannot implicitly assign from a void* to any other pointer type, and so on.

We might agree that C is a hands-on language that we can program in whichever way we want, while C++ adds high-level object-oriented constructs that help with writing high-level programs.

Thus, if one can say C is easy, then C++ is even easier to code.

In some sense C++ is easier, sure. It’s also harder, in that you need to understand classes, templates, protected variables, etc. It’s definitely easier to maintain. Not sure it’s easier to code. Haven’t really thought about it, though.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
In some sense C++ is easier, sure. It’s also harder, in that you need to understand classes, templates, protected variables, etc. It’s definitely easier to maintain. Not sure it’s easier to code. Haven’t really thought about it, though.
I think the crux of the initial point that "C++ is more complicated" also really just comes down to it being such a big language with so much to learn.
Bjarne Stroustrup once said:
"C will let you shoot yourself in the foot. When we write C++ in a C++ style we try to engage the safety on the gun, but if we misfire we lose the whole leg"
or something to that effect.

C++ gives you a lot of tools for structuring your code better, but if you don't engage the safety, so to speak, you can shoot yourself just the same.
You can use a goto in either language. You shouldn't, but you can.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
They’re really simple if you’re a CPU designer :)
I think they can be simple to anyone once they get their mind around them. That said, they're not always simple to get right. It can be a tad tricky to debug where things go wrong when all you know is that you don't wind up at the correct memory address after following 4 pointers :p (trauma from an edge-case bug when implementing memory paging for an OS, haha). Pro tip: write a bunch of 0xDEADBEEF around the addresses you want to get through - it's easy to spot in a memory overview, so you can easily work out that you're 24 bytes off of where you wanted to be, or whatever.
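
Something like this contrived little sketch (my own example here, not the actual paging code) - pad the words around the one you care about with a recognisable pattern, and a raw dump immediately tells you how far off a bad pointer landed:

Code:
#include <cstdint>
#include <cstdio>

int main() {
    std::uint32_t page[16];
    for (std::uint32_t &word : page)
        word = 0xDEADBEEF;              // the padding you expect to see everywhere
    page[8] = 0x12345678;               // the word we actually want to reach

    std::uint32_t *bad = &page[8] + 6;  // pretend our pointer arithmetic was off

    // Dump a window around the bad pointer: everything reads DEADBEEF except the
    // target, so counting the DEADBEEF words shows we are 6 * 4 = 24 bytes off.
    for (int i = -8; i < 2; ++i)
        std::printf("%+4d: %08x\n", i * 4, (unsigned)bad[i]);
}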
 
  • Like
Reactions: jdb8167

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
I think the crux of the initial point that "C++ is more complicated" also really just comes down to it being such a big language with so much to learn.
Bjarne Stroustrup once said:
"C will let you shoot yourself in the foot. When we write C++ in a C++ style we try to engage the safety on the gun, but if we misfire we lose the whole leg"
or something to that effect.

C++ gives you a lot of tools for structuring your code better, but if you don't engage the safety, so to speak, you can shoot yourself just the same.
You can use a goto in either language. You shouldn't, but you can.

I was recently reviewing a very large amount of C code, and found multiple instances of goto.

I vomited a little bit in my mouth.
 
  • Like
Reactions: casperes1996

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
I think they can be simple to anyone once they get their mind around them. That said, they're not always simple to get right. It can be a tad tricky to debug where things go wrong when all you know is that you don't wind up at the correct memory address after following 4 pointers :p (trauma from an edge-case bug when implementing memory paging for an OS, haha). Pro tip: write a bunch of 0xDEADBEEF around the addresses you want to get through - it's easy to spot in a memory overview, so you can easily work out that you're 24 bytes off of where you wanted to be, or whatever.
On a long-ago embedded system that had loadable modules, I made the OS print 0xDEADBEEF to the console when the divide by zero interrupt vector (on an Intel 80186 that only had real-mode) was overwritten because the developers who were writing the loadable modules kept insisting that their code was correct and my OS was broken ;). I would ask them why they printed 0xDEADBEEF to the console then. They would grumble and go find their uninitialized pointer error.

Edit: I might be misremembering what int 0x00 on Intel real-mode is. I think I remember divide-by-zero but now I'm thinking it might be the weird non-maskable interrupt. I don't feel like looking it up.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
I was recently reviewing a very large amount of C code, and found multiple instances of goto.

I vomited a little bit in my mouth.
I'm very impressed you could contain it that well.
*Hides face in shame that I've written a few goto lines in the past year - though admittedly for projects that would never outgrow 120 lines total - just experiments*
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
C is a great language. It's great for teaching young kids as you can disassemble their programs to show them what it looks like in assembler. It's also great because it's really easy to screw up so that kids get exposed to crash dumps.
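
For example (just my own toy file and flags, nothing special):

Code:
// square.c - small enough that the generated assembly fits on a slide.
//
//   gcc -O1 -S square.c                            # writes the assembly to square.s
//   gcc -O1 -c square.c && objdump -d square.o     # or disassemble the object file
int square(int x) {
    return x * x;
}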
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
On a long-ago embedded system that had loadable modules, I made the OS print 0xDEADBEEF to the console when the divide by zero interrupt vector (on an Intel 80186 that only had real-mode) was overwritten because the developers who were writing the loadable modules kept insisting that their code was correct and my OS was broken ;). I would ask them why they printed 0xDEADBEEF to the console then. They would grumble and go find their uninitialized pointer error.

Edit: I might be misremembering what int 0x00 on Intel real-mode is. I think I remember divide-by-zero but now I'm thinking it might be the weird non-maskable interrupt. I don't feel like looking it up.
Haha that's great :p How'd you detect when it was overwritten? Did you just regularly check that the value of something matched expectation? Did you have clock interrupts so you could regularly check? Or did you just wait on sys calls?
And you were correct about 0x0 being div/0. 0x02 is the non-maskable interrupt :)
 

Joelist

macrumors 6502
Jan 28, 2014
463
373
Illinois
If you really want a version of C that protects programmers from themselves, then I would suggest C#, as it is the one that uses a managed framework. Of course, that comes with its own limitations...
 
  • Like
Reactions: Wizec

Joelist

macrumors 6502
Jan 28, 2014
463
373
Illinois
As far as Apple Silicon goes, remember it not only outperforms Intel and AMD in performance-per-watt terms, but for years it has been the fastest mobile SoC in existence, and not by a small margin. Basically, since the A7, what Apple has been doing is putting a desktop/laptop-class SoC into phones and tablets (as opposed to the claim that with M1 they are putting phone/tablet-class processors into laptops and desktops).
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
Haha that's great :p How'd you detect when it was overwritten? Did you just regularly check that the value of something matched expectation? Did you have clock interrupts so you could regularly check? Or did you just wait on sys calls?
And you were correct about 0x0 being div/0. 0x02 is the non-maskable interrupt :)

We used BADF00D for RAM returned by MALLOC to try to catch uninitialized memory segments.
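
Roughly this idea, in a sketch of my own (not the original system's allocator):

Code:
#include <cstddef>
#include <cstdlib>

void *debug_malloc(std::size_t size) {
    void *p = std::malloc(size);
    if (p != nullptr) {
        // Repeat the bytes 0B AD F0 0D so the pattern works for any allocation size;
        // uninitialized reads then show up as BADF00D in a debugger or memory dump.
        static const unsigned char pattern[4] = {0x0B, 0xAD, 0xF0, 0x0D};
        unsigned char *bytes = static_cast<unsigned char *>(p);
        for (std::size_t i = 0; i < size; ++i)
            bytes[i] = pattern[i % 4];
    }
    return p;                           // caller frees with free() as usual
}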
 

playtech1

macrumors 6502a
Oct 10, 2014
695
889
Gamer types always overestimate their relevance.
Nvidia, which is looking to buy ARM, was built on gaming revenue and it's still a huge part of its business. If the mainstream corporate world remains closed to Apple, I think Apple also passing on AAA gaming is leaving another big slice of the market to Windows.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,450
I was recently reviewing a very large amount of C code, and found multiple instances of goto.

I vomited a little bit in my mouth.
That's awful - real programmers use setjmp/longjmp.... :)

OTOH, back in the good old days we didn't have them new-fangled try/catch statements, so goto does have some uses.
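
The classic one being cleanup on error - a minimal sketch of my own (not anyone's real code), where a single goto label plays the role of catch/finally:

Code:
#include <cstdio>

// Copies the first line of src to dst; returns 0 on success, -1 on any failure.
int copy_first_line(const char *src, const char *dst) {
    int rc = -1;
    char line[256];
    std::FILE *in = nullptr;
    std::FILE *out = nullptr;

    in = std::fopen(src, "r");
    if (in == nullptr)
        goto cleanup;                   // "throw"

    out = std::fopen(dst, "w");
    if (out == nullptr)
        goto cleanup;                   // "throw"

    if (std::fgets(line, sizeof line, in) == nullptr)
        goto cleanup;                   // "throw"
    if (std::fputs(line, out) < 0)
        goto cleanup;                   // "throw"

    rc = 0;                             // success

cleanup:                                // the "catch"/"finally": release in reverse order
    if (out != nullptr) std::fclose(out);
    if (in != nullptr) std::fclose(in);
    return rc;
}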
 
  • Like
Reactions: bobcomer

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
OTOH, back in the good old days we didn't have them new-fangled try/catch statements, so goto does have some uses.
goto breaks the principles of structured programming, and you can no longer trust the lexical structure of the program to match the execution structure. Dijkstra wrote about the problems with goto all the way back in 1968. As I alluded to in a prior comment, I'll use goto if I'm writing some really short, temporary, experimental code and just need to test something out real quick and it's the fastest way to do so, but goto is a big "code smell".

I can see the case you're making for emulating exception behaviour with long jumps/goto, but really I'd prefer an approach that doesn't "throw" at all. Set errno and return a null pointer or a special value that indicates an error. I'd go as far as to say I'd prefer you malloc an int and give me the pointer to it, so null can indicate an error instead of returning an int directly, if errors can occur. It'll effectively make it a form of "optional".

Regardless of the approach, I feel like there's always a better alternative - or at the very least, almost always.
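
Roughly this sort of shape, as a sketch of my own (the function is made up, just to show the convention):

Code:
#include <cerrno>
#include <cstdlib>

// Parses a single decimal digit. On success returns a malloc'd int the caller
// must free; on failure returns nullptr with errno set - a crude optional<int>.
int *parse_digit(char c) {
    if (c < '0' || c > '9') {
        errno = EINVAL;                 // the error "channel"
        return nullptr;                 // the "empty" case
    }
    int *result = static_cast<int *>(std::malloc(sizeof(int)));
    if (result == nullptr) {
        errno = ENOMEM;
        return nullptr;
    }
    *result = c - '0';
    return result;                      // the "has a value" case
}

The caller then checks for nullptr before touching the value, much like unwrapping an optional.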
 