
poorcody

macrumors 65816
Jul 23, 2013
1,339
1,584
[...]
So you are always working with these constraints, and in theory you are always “correct by construction.” Except it was massively difficult to fight the rules of physics to actually meet the constraints. You’d move logic across cycle boundaries, mess around with clocks to give some pipeline stages a little more time than others, duplicate signals to avoid loading effects, duplicate logic, get very clever with logic design, sometimes create custom cells or custom circuits, force the metal router to do what you want by pre-routing a bunch of wires by hand, adjust where cells are located, etc.
When people ask what makes the M1 better than other chips, I think this story provides some good insight into why the quality of the chip-design team is still an essential component.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
When people ask what makes the M1 better than other chips, I think this story provides some good insight into why the quality of the chip-design team is still an essential component.

A lot of the folks who I worked with, for, or who worked for me are now at Apple. The entire Intrinsity team was formerly EVSX, which was a spinoff of Exponential, which was the first place I worked in CPU design. Lots of folks I worked with at AMD are now also at Apple. These folks know how to do this stuff by hand. A lot of the PA Semi folks were from DEC. DEC went even beyond what I was talking about, doing the entire design at the transistor level instead of the gate level. (A lot of DEC folks came to AMD, too, where they heavily influenced what we were doing on the Opteron/Athlon 64 project). The short of it being that many of the most competent designers in the world, with long track records of designing the fastest chips in the world, ended up at Apple.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Imagine doing that on a 16 billion transistor M1...

A lot of those 16 billion transistors are in regular arrays that make it not so difficult. But, yeah, I prefer to do as much as possible gate-by-gate. I don’t have scientifically-derived numbers, but I would guess that hand-optimizing the size, shape, and location of each transistor buys you maybe 10%. But the labor and schedule costs are very high. Hybrid approaches, like doing the ALUs by hand, make more sense, and for all I know Apple may be doing that. On the UltraSPARC I worked on at Sun (very briefly - I resigned after 2 months), I was working on doing the reservation stations at the transistor level, but doing the scheduling logic using gates. But they had no design methodology to speak of, so everyone was doing whatever they felt like.
 

ADGrant

macrumors 68000
Mar 26, 2018
1,689
1,059
Standards are important for programmers because code that currently performs correctly might actually be buggy, and stop working when you use a different compiler (version) or a different platform. The mostly unknown corner cases of the C/C++ standards, and the varying expectations among the different players, are the main reason the C/C++ ecosystem is such a terrible mess.

Yes, which is why C++ programmers should try to build their code bases using multiple compilers, and ideally on multiple platforms. On Linux you can build and test your code using both Clang and GCC. Adding a Windows build with the Microsoft C++ compiler can help flush out additional issues.
 

ADGrant

macrumors 68000
Mar 26, 2018
1,689
1,059
For example, I always assume that the system is little-endian. I don't spend any effort trying to support hypothetical big-endian systems I don't have access to – how would I even know if the code works on them? I assume that 64-bit integer operations are fast, and I only use smaller integers in limited situations. When I design a file format, I care more about making the data in a memory-mapped file directly usable than about reading and writing such files on unusual systems. And in order to even support those memory-mapped files, I already have to use system-dependent code.

The real portability challenge with C/C++ is linking to dependencies. { Linux, macOS } x { gcc, clang } x { x86-64, ARM } x { libstdc++, libc++ } is already 16 separate platforms. Trying to make your code fully portable and standards-compliant is a waste of effort, when it's already difficult to support all these common platforms at the same time.

I guess you never worked with Solaris. I used to work on a C++ code base on Solaris SPARC and Solaris x86 (parts were ported to Windows too). SPARC is big-endian and x86, of course, is little-endian. That caused some issues passing data between various processes where developers had not accounted for the difference.
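
A minimal Swift sketch of the kind of defensive decoding that avoids those issues (the 4-byte length field is hypothetical): compose wire values byte-by-byte in a declared order instead of reinterpreting raw memory.

```swift
import Foundation

// Hypothetical wire format: a peer wrote a 32-bit length field in
// big-endian (SPARC/network) order. Composing the value byte-by-byte
// is independent of the host's own byte order, so the same code is
// correct on SPARC and x86 alike.
func readLength(from data: Data) -> UInt32? {
    guard data.count >= 4 else { return nil }
    return data.prefix(4).reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
}
```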
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
I guess you never worked with Solaris. I used to work on a C++ code base on Solaris SPARC and Solaris x86 (parts were ported to Windows too). SPARC is big-endian and x86, of course, is little-endian. That caused some issues passing data between various processes where developers had not accounted for the difference.
To be fair, though, there are extremely few big-endian systems in the wild anymore. And if you work with one of them, you likely know. The vast majority of code written will only ever run on little-endian systems. PowerPC, z/Architecture, SPARC - all together they don't make up that big a market share. If your code is likely to run on a big IBM Z mainframe, big-endian is good to support, but most code isn't. So little-endian gets you both x86(-64) and ARM (ARM technically being both big- and little-endian, but yeah). Thus I think it's fair enough for JouniS to say they tend to just ignore it.
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
To be fair, though, there are extremely few big-endian systems in the wild anymore. And if you work with one of them, you likely know. The vast majority of code written will only ever run on little-endian systems. PowerPC, z/Architecture, SPARC - all together they don't make up that big a market share. If your code is likely to run on a big IBM Z mainframe, big-endian is good to support, but most code isn't. So little-endian gets you both x86(-64) and ARM (ARM technically being both big- and little-endian, but yeah). Thus I think it's fair enough for JouniS to say they tend to just ignore it.

If you are in a portable shop, then you should have macros or libraries to hide endianness.
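
In Swift terms, such a layer can be as small as this sketch (the helper names are made up); the point is that call sites declare byte order once and never branch on the host's endianness:

```swift
// A tiny sketch of a portability layer for byte order. Swift's
// FixedWidthInteger already provides bigEndian/littleEndian conversions
// that are no-ops on matching hosts and byte swaps otherwise.
extension FixedWidthInteger {
    /// Interpret `self`, as read from a big-endian wire format,
    /// in host byte order.
    var decodedFromWire: Self { Self(bigEndian: self) }
    /// Convert a host-order value to big-endian wire order.
    var encodedForWire: Self { bigEndian }
}

// Round-trips to the same value on any host, big- or little-endian.
assert(UInt16(0x1234).encodedForWire.decodedFromWire == 0x1234)
```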
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
Yep. But if you're specifically targeting, let's say macOS, then you don't really need to think about that :)

Yes. But the problem with assuming that you're only ever going to run on one platform is that things can change down the road.

I worked at a place that had some 50 different ports over several decades - so portability was really important. Sometimes hardware vendors would come to us and pay us to do a port and we weren't going to turn down the business.
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
Yes. But the problem with assuming that you're only ever going to run on one platform is that things can change down the road.

I worked at a place that had some 50 different ports over several decades - so portability was really important. Sometimes hardware vendors would come to us and pay us to do a port and we weren't going to turn down the business.
True that. On the flip side, though, there'll (almost) always be something not-so-portable in there anyway. Especially if you work with GUIs - although I guess there are libraries that try to do that in a cross-platform way too.
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
True that. On the flip side, though, there'll (almost) always be something not-so-portable in there anyway. Especially if you work with GUIs - although I guess there are libraries that try to do that in a cross-platform way too.

GUIs can be an issue.

Low-level stuff can be too, so you end up with some assembler in your portability layer.

With debugging, you're down to the lowest common denominator. You can't really make assumptions about what tools you'll have when you go to debug a problem on a non-core platform.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I guess you never worked with Solaris. I used to work on a C++ code base on Solaris SPARC and Solaris x86 (parts were ported to Windows too). SPARC is big-endian and x86, of course, is little-endian. That caused some issues passing data between various processes where developers had not accounted for the difference.
I think I may have written some Java code on Solaris, but that was over 20 years ago.

These days, the world is little-endian. You don't create a big-endian system or buy one unless you are deliberately trying to be weird. It's like having 6-bit bytes or 36-bit words. And if you are trying to be weird, rewriting the software so that it works on your system is your responsibility.

There may be some niche cases where supporting big-endian systems makes sense for the developers. I've never worked on such cases.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
A lot of those 16 billion transistors are in regular arrays that make it not so difficult. But, yeah, I prefer to do as much as possible gate-by-gate. I don’t have scientifically-derived numbers, but I would guess that hand-optimizing the size, shape, and location of each transistor buys you maybe 10%. But the labor and schedule costs are very high. Hybrid approaches, like doing the ALUs by hand, make more sense, and for all I know Apple may be doing that. On the UltraSPARC I worked on at Sun (very briefly - I resigned after 2 months), I was working on doing the reservation stations at the transistor level, but doing the scheduling logic using gates. But they had no design methodology to speak of, so everyone was doing whatever they felt like.
Can you think of any examples of what goes into hand-optimization? I’m curious.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Can you think of any examples of what goes into hand-optimization? I’m curious.
I’m not sure exactly what you’re asking? Do you mean what kinds of circuits are hand optimized, or do you mean how is it done?

On every chip I’ve ever worked on, certain things are optimized transistor by transistor. Things like PLLs (used to generate and synchronize clocks), off-chip I/O drivers and receivers, and RAM structures. You start out with a schematic that you draw, transistor by transistor, showing the connections and the W/L ratio of each transistor, and you simulate it in SPICE, which is a dynamic circuit simulator. When you get it the way you want, you start laying out the structure (drawing the polygons), extract the exact transistor parameters and parasitics (resistance, capacitances) from the layout, feed that back into your simulations, and see how you are doing. You keep that cycle repeating until you’re happy.

At DEC, they did this for almost everything - ALUs, etc. At the places I worked, we did it for various other structures as well, but not for things like the ALUs (multipliers, adders, shifters) as far as I can recall.

Of course, the standard cells are also hand-optimized, but once you have a 2-input NAND gate with a certain drive strength, you use it wherever it is needed - you don’t create a new one each time, unless there is some special need, in which case you add the variation to the library. It does happen, but it’s rare.

For things that are not on the critical timing paths, like random control logic, you almost never bother hand optimizing it.
 

ChrisA

macrumors G5
Jan 5, 2006
12,919
2,173
Redondo Beach, California
Yes I have. You're a talented person but a lot of devs couldn't handle C pointers. In C++ that became easier. Even in Objective-C people struggled with pointers in array elements. It's no coincidence most of these became structs in Swift.
Yes, in the world, uneducated amateurs outnumber those with degrees in computer science. But are these amateurs writing the apps most people use? Are they writing the game engines? Web browsers? Device drivers? No. I have worked in this business since before C was invented (I bought a K&R book when it was new), and in all that time I never once worked with a developer who did not understand pointers. Yes, there is much low-end work done for small web sites and such, but I've not seen such incompetence in "real work".
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
Yes, in the world, uneducated amateurs outnumber those with degrees in computer science. But are these amateurs writing the apps most people use? Are they writing the game engines? Web browsers? Device drivers? No. I have worked in this business since before C was invented (I bought a K&R book when it was new), and in all that time I never once worked with a developer who did not understand pointers. Yes, there is much low-end work done for small web sites and such, but I've not seen such incompetence in "real work".

Serious question, because I don’t know - do people who professionally develop only in Swift and haven’t coded in any other language know about pointers? I figure by now there must be a bunch of such programmers.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Serious question, because I don’t know - do people who professionally develop only in Swift and haven’t coded in any other language know about pointers? I figure by now there must be a bunch of such programmers.
I don’t know about Swift but I’ve definitely worked with talented developers who don’t have a clue about pointers and the various memory addressing modes of low level programming. These were JavaScript/Java developers and they are quite good programmers within their domain.
 

grandM

macrumors 68000
Oct 14, 2013
1,520
302
Serious question, because I don’t know - do people who professionally develop only in Swift and haven’t coded in any other language know about pointers? I figure by now there must be a bunch of such programmers.
Swift also has reference types.
 

pshufd

macrumors G4
Oct 24, 2013
10,151
14,574
New Hampshire
You used to start out in college with a course in data structures, with linked lists and hash tables and whatnot, with extensive use of pointers. Those courses were often used as weedout courses: they were so hard that anyone who couldn’t handle the mental challenge of a CS degree would give up, which was a good thing, because if you thought pointers were hard, wait until you try to prove things about fixed point theory.

All the kids who did great in high school writing pong games in BASIC for their Apple II would get to college, take CompSci 101, a data structures course, and when they hit the pointers business their brains would just totally explode, and the next thing you knew, they were majoring in Political Science because law school seemed like a better idea. I’ve seen all kinds of figures for drop-out rates in CS and they’re usually between 40% and 70%. The universities tend to see this as a waste; I think it’s just a necessary culling of the people who aren’t going to be happy or successful in programming careers.



We had an MIT EECS grad who was given a small task involving simple C string operations. He looked at it for a few days, couldn't figure it out, and asked the project leader for help. That was quite a surprise to me, but then I started using C back in the 1980s, I think. I still have my ancient K&R and an updated version. It was very easy to find on the shelves in the office, as this was way back before the public internet, when we had shelves of books and manuals for reference.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Swift also has reference types.

Which is far from being the same as pointers...

Pointers do show up in Swift when dealing with APIs that require data transfer, such as Metal, but in general I wouldn't expect someone who grew up on Python, Java or Swift to have a good understanding of close-to-the-metal programming.
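
A minimal sketch of the difference (the Counter class is made up): a Swift class gives shared, ARC-managed reference semantics, while an actual pointer is a raw address with manual lifetime management.

```swift
// Reference semantics: two names for one ARC-managed object.
final class Counter { var value = 0 }
let a = Counter()
let b = a            // same object; lifetime handled by ARC
b.value += 1         // a.value is now 1 as well

// An actual pointer: a raw address you must manage yourself.
let p = UnsafeMutablePointer<Int>.allocate(capacity: 1)
p.initialize(to: 0)
p.pointee += 1       // explicit load/store through an address
p.deinitialize(count: 1)
p.deallocate()       // skip this and you leak; touch p after and you crash
```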
 

casperes1996

macrumors 604
Jan 26, 2014
7,599
5,770
Horsens, Denmark
It has pointers, too, but I imagine most Swift programmers don’t use them, or use them only to interface with Objective-C (and maybe don’t know how they work?)
I don't know. As with any language, I think there are people who work with it professionally at all sorts of skill levels, but to be really good with Swift you need to understand what happens behind the scenes too. While I've programmed in all sorts of languages, including a lot of C, Swift is my primary language, and I feel like understanding these things is quite necessary at times.

While Swift is primarily an app-development-centric language, it can in theory also be used for systems development. I wouldn't advise it, but it is (or at least used to be) in the mission statement for the language. It also has quite good interop with C, and while its pointer syntax is deliberately heavy to discourage using pointers where they aren't necessary, to fully comprehend the language's possibilities you need to know how to use UnsafeMutableRawBufferPointer.

Furthermore, if you work with Metal (which, fair enough, also brings in the Metal Shading Language, which is sort of C++), you'll have to deal with "pointers". I put that in air quotes because for Metal, Swift has some types that make it a bit more than just a pointer, but it is effectively still pointers, and you need to think about byte alignment across CPU/GPU boundaries and such.

I think that, as of right now, a lot of Swift developers are old Objective-C developers, so I'm not sure there are that many who are "only" Swift developers, honestly.
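
For anyone curious, here's a small sketch of what UnsafeMutableRawBufferPointer looks like in practice (values purely illustrative):

```swift
import Foundation

// Allocate four raw bytes aligned for UInt32, fill them byte-by-byte,
// then read them back as a typed value.
let buf = UnsafeMutableRawBufferPointer.allocate(
    byteCount: MemoryLayout<UInt32>.size,
    alignment: MemoryLayout<UInt32>.alignment
)
defer { buf.deallocate() }

for (i, byte) in [0x01, 0x02, 0x03, 0x04].enumerated() {
    buf[i] = UInt8(byte)
}

// What you read back depends on host byte order - which ties into the
// endianness discussion earlier in the thread.
let word = buf.load(as: UInt32.self)
print(String(format: "0x%08x", word))  // 0x04030201 on little-endian
```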
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
I’ve used Swift code to interact directly with memory-mapped hardware registers on the Pi for a small project around the house. I actually thought it was pretty straightforward coming from C/C++. There’s definitely some stuff Swift does that isn’t well documented, but if you are aware of it, it can lead to some surprisingly clean code if you can properly define your types. There are definitely some cool quirks for systems development in the language that even let you start leaning on modern type-safety features, even if they aren’t the focus.
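
Roughly along these lines, presumably (a hedged, Linux/Pi-specific sketch: the base address, register index, and bit below are placeholders rather than real BCM283x values, and error handling is minimal):

```swift
#if canImport(Glibc)
import Glibc   // open/mmap/munmap on Linux
#endif

let pageSize = 4096
let periphBase: off_t = 0xFE20_0000   // placeholder peripheral base

let fd = open("/dev/mem", O_RDWR | O_SYNC)   // needs root
guard fd >= 0 else { fatalError("open(/dev/mem) failed") }
defer { close(fd) }

guard let base = mmap(nil, pageSize, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, periphBase),
      base != MAP_FAILED
else { fatalError("mmap failed") }
defer { munmap(base, pageSize) }

// View the mapped page as an array of 32-bit registers and poke one.
let regs = base.bindMemory(to: UInt32.self, capacity: pageSize / 4)
regs[7] = 1 << 16   // register index and bit are illustrative only
```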

But yes, almost everyone I’ve worked with in Swift is also an Obj-C developer and/or a C/C++ developer. But in the space I’m in, it seems like large companies want to TypeScript all the things. :(
 