
ArtOfWarfare

macrumors G3
Original poster
Nov 26, 2007
I am currently of the impression that C's macros are borderline useless (you will almost never see a #define which is good code) while Lisp's macros are pretty freaking sweet when you need a DSL.

A typical example people give when talking about Lisp's macros is that you can create new loop syntax resembling Python's. Is it similarly possible to create a C macro which allows for more powerful loop syntax?

My feeling right now is that the best possible macro in C for this may as well be a function. I haven't actually tried writing such a thing; I'm just imagining that's how it would turn out.
 
A typical example people give when talking about Lisp's macros is that you can create new loop syntax resembling Python's. Is it similarly possible to create a C macro which allows for more powerful loop syntax?

My feeling right now is that the best possible macro in C for this may as well be a function. I haven't actually tried writing such a thing; I'm just imagining that's how it would turn out.

Macros in C are handled by cpp, so they are pretty much just search and replace (with a few extras, of course). Consequently you can do all sorts of things if you put your mind to it, but overuse of the preprocessor is rarely a good idea. Perhaps you need to have a careful think about how to approach your problem.
 
Macros in C are handled by cpp, so they are pretty much just search and replace (with a few extras, of course). Consequently you can do all sorts of things if you put your mind to it, but overuse of the preprocessor is rarely a good idea. Perhaps you need to have a careful think about how to approach your problem.

This is purely a theoretical question. I'm wondering if there's some way to write something like:

Code:
for i in array {
    // Do stuff...
}

And have that be valid C code after it has been run through the CPP. I imagine the CPP would have to do something like...

Code:
#define foreach(i, array) for(int counter = 0, i = array[counter]; i < sizeof(array)/sizeof(int); counter++, i = array[counter])

Which is then used as

Code:
foreach(i, array) {
    // Do stuff...
}

And... yeah, does that as I wrote it work? Can Xcode 6's playgrounds handle C and CPP, or are they restricted to Swift?

The advantage of what I wrote over just having a function is that it looks easier to read, I think (if it even works).
 
C macros are crap.

What comes somewhat close to approximating the power of Lisp macros is C++ templates.

Look at the Boost library to see the kind of things that can be done.
 
You posted a macro definition of foreach(). I think the simplest way to find out if it works is to compile it and see what happens.
 
You posted a macro definition of foreach(). I think the simplest way to find out if it works is to compile it and see what happens.

Python's REPL and Swift's Playground have made me lazy. I can type code, but actually run a compiler? That's, like, 10 key presses. Too many.

I guess I probably won't be using C... possibly ever again. Is C still going to be around in a few years? I know its demise has been forecasted for a long time, but I feel like Apple's shift from C to Swift and Google's shift from C to Go are going to be the deathblows. Embedded work seems to be shifting from C to C++, from what I've seen.
 
I can type code but actually run a compiler? That's like, 10 key presses. Too many.

That's strange. In all IDEs I've ever worked with, compiling (and running) is just a single keypress. BTW: Your macro won't work as written: the loop condition compares the element value i against the array length instead of comparing counter, and the final i = array[counter] reads one element past the end of the array.

Is C still going to be around in a few years? I know its demise has been forecasted for a long time, but I feel like Apple's shift from C to Swift and Google's shift from C to Go are going to be the deathblows. Embedded work seems to be shifting from C to C++, from what I've seen.

My guess: yes, it will be. It's a better question whether Swift and Go will still be around by then :)

For (tiny) embedded stuff (think microcontrollers) it seems C just won't die. We're developing some new hardware boards here with various types of MCUs on them and each one is programmed in C.
 
I guess I probably won't be using C... possibly ever again. Is C still going to be around in a few years? I know its demise has been forecasted for a long time, but I feel like Apple's shift from C to Swift and Google's shift from C to Go are going to be the deathblows. Embedded work seems to be shifting from C to C++, from what I've seen.

I think C will be around for a while yet. If Linus Torvalds ever decides to let C++ code into the Linux kernel I'll take that as a turning point (but he won't, since he hates C++ with a passion - there is a rant of his somewhere where he basically called all C++ programmers idiots, in true Torvalds diplomatic style).

Plus, for small embedded systems such as 8-bit AVR chips, C++ is just too heavy. Even in C it feels like you can only use a subset of the features. I guess assembly and C are going to be very hard to displace in this market.

As for normal development, the vast majority of open source code still uses C. Just look at Linux, OpenBSD, FreeBSD, NetBSD, most of the software in the GNU project etc etc.

I also can't think, off the top of my head, of any currently popular operating system that is written in C++, or whose code is mostly C++. I think even the Windows kernel is primarily C, from what I have heard.

C will be around for a long time yet. It has some advantages over C++ (and some disadvantages, granted) and those are going to make sure it sticks around. Plus it's a cool language! I love mucking around in C. It feels quite pure to me. But maybe that's just me.
 
I think C will be around for a while yet. If Linus Torvalds ever decides to let C++ code into the Linux kernel I'll take that as a turning point (but he won't, since he hates C++ with a passion - there is a rant of his somewhere where he basically called all C++ programmers idiots, in true Torvalds diplomatic style).

http://harmful.cat-v.org/software/c++/linus

Huh - never realized Linus had such strong opinions.
 
Just what the !@#$%^& is a DSL, amigo? That would make your post more valuable to us programmers out here in TV land.

You're a dilettante. You dabble in many programming languages but you are a master of none. What you want to do doesn't make much sense in C - a low-level but very effective language when used by someone who has mastered it.
 
Just what the !@#$%^& is a DSL, amigo? That would make your post more valuable to us programmers out here in TV land.

You're a dilettante. You dabble in many programming languages but you are a master of none. What you want to do doesn't make much sense in C - a low-level but very effective language when used by someone who has mastered it.

You should be careful with acronyms. They are subject to multiple interpretations.

I apologize for calling you a "dilettante". That was uncalled for.

I've actually written many things like what you are trying to do. Parsers are not as difficult to write as you might think; many of them are recursive. I've also written lots of these for tagged markups (SGML - Standard Generalized Markup Language, HTML - HyperText Markup Language, and XML - Extensible Markup Language). These are easy for machines to deal with but hard for humans. Markups like these define the structure of data but not its appearance. They store data in trees. These are often called fuzzy trees because, for example, they can have multiple heads, a node can be a child of its own type, and nodes can have varying numbers of children. In XML, the specifiers for what children a node may have are given in a DTD (document type definition). Trees are not terribly difficult programming problems.

Recursion and trees are common and these are covered in books.
 