
PKBeam · macrumors regular · Original poster · Jul 24, 2015 · NSW, Australia
I'm quite new to coding (~1 week) but I think I'm making good progress on Python. When I get a new Mac later in the year I want to immediately get into iOS/OS X programming as a hobby - to create useful or entertaining apps. I understand that Swift is different to the C variants but maintains compatibility.

My question is, as someone with no prior experience in either, would it be better for me to learn Objective-C before Swift or vice versa?
 
My opinion is that unless you plan on working within ObjC for some reason, you should learn Swift. It's much more marketable, and it's clearly the way forward for Apple. That's not to say that ObjC isn't important, but start with the most modern language, and work backwards.
 
I'm quite new to coding (~1 week) but I think I'm making good progress on Python. When I get a new Mac later in the year I want to immediately get into iOS/OS X programming as a hobby - to create useful or entertaining apps. I understand that Swift is different to the C variants but maintains compatibility.

My question is, as someone with no prior experience in either, would it be better for me to learn Objective-C before Swift or vice versa?


For new Mac apps Swift is the way to go. That said, learning to write non-trivial software is not about learning a programming language.

As an analogy, learning to be a good fiction writer, to write novels or movie scripts, is not the same as learning French or Spanish. Good story and plot writing skills can be used in any language. Same with software. So stick with your now-native language, Python, until you can write stuff that other people would actually want to use. You will find that AFTER you get to that point, learning a new programming language is easy.

My first programming language was FORTRAN back in the mid-1970s, and then whatever was popular at the time: PL/I, COBOL, Algol, C, C++, Tcl/Tk, and now Python. Most of the learning is about how to construct systems that are easy to build, MAINTAIN, and test. Coding is the smaller part of any non-trivial software project.

You can write software in Python on a Mac. What you don't want to be is the guy who knows English, French and Chinese but can't write anything worth reading in any language.
 
Thanks for all the advice. I'm currently concentrating on familiarising myself with Python. I'll start looking at Swift when I get a new Mac, and after getting better at Python, I'll work on Swift harder.
 
One of the issues with programming for any device is the API involved.

Learning a programming language is relatively easy when compared with how one must write to interface with a device and its user.

If you plan on writing for OS X and iOS then you must learn the language(s) which are primarily used for that OS. The next step is learning all the calls and "tricks" one must use in order to have a really user-friendly, functioning program. Every new programmer can write "Hello World" programs, but to design a program/app where the user can see their world is a whole lot more complicated.

Personally, if your end game is to design and write OS X/iOS stuff, my recommendation is to drop Python and start working on either Objective-C or Swift. Of course, before you get much further you will also have to learn Apple's Xcode development tool and its nuances. Basically the days of using a simple text editor, compiler and linker are history.
 
I'm with ChrisA on this. Same era. MASTER one language and the subsequent ones will come easily. It also helps to work on substantial problems, because building things that are reusable elsewhere is a good way to leverage your skills on bigger projects.

My first three were PL/I, Assembler (in school) and Basic for a summer job. The substantial problem was chess, which a couple of guys and I worked on for a couple of years at Duke.

That knowledge base led to job offers in NYC and Chicago in 1973, but I think the key to getting in the door was showing how I attacked the substantial problem.
 
Here's another vote for finishing learning Python. You've already started, which is farther than a lot of people get. And there's a vast amount to learn on the path to being able to create and debug full-featured programs or apps in Python. After you learn to code in Python (including some OOP), learning Swift should be a lot easier.
 
Basically the days of using a simple text editor, compiler and linker are history.
I am going to guess that you are talking about an Integrated Development Environment vs individual tools there, since editing and compiling/linking/building are pretty much what Xcode does (although it definitely isn't simple).

As already mentioned, the particular programming language(s) used these days is just a small part. You also have coding styles and best practices, the various APIs such as Cocoa, Cocoa Touch, Android, etc, and then there is that monstrosity known as Xcode, which is a topic itself. No matter what languages(s) are used, you will spend quite a bit of time just looking stuff up on the web and in the documentation, so being able to at least read code snippets and samples becomes fairly important. Depending on what you are doing, this could involve everything from XML/HTML to C/Objective-C, in platforms/environments from Android to OS X, so learning some general principles will help even more.

As for me, I'm not a professional programmer, nor do I really intend to be one, but I am interested in a wide variety of topics and have been dabbling with these new-fangled machines since about 1978. My programming environment is apparently historical and tends to lean toward simpler tools that I don't have to spend a lot of time keeping up with: a text editor (BBEdit, although there are several about), the Terminal, a scripting language that can also scale to more serious applications as needed (not sure what the hubbub is about Python, I prefer Ruby), RubyMotion (a toolchain that uses the compiler/linker provided by the Xcode developer tools), and a few additional tools such as Dash.
 
Basically the days of using a simple text editor, compiler and linker are history.

Actually, it's fairly straightforward to use Vim and xcodebuild (or even a makefile!) to create OS X apps (both command-line and GUI; Swift, Objective-C, or wrapped JavaScript/HTML5, etc.).

But you are right that the big difference isn't the tools; it's the need to learn event-driven control flow, human-usable UX/UI design, and some usable subset of the vast number of APIs available. But those can be learned separately from the basics of problem solving using code and debugging the result. And thinking in code is the actual hard part, IMO.
 
Basically the days of using a simple text editor, compiler and linker are history.

Don't know where you're getting that. I do this literally all the time. If you have Xcode, you have standard command line build tools available, so you should be able to do all you need to do from a terminal. Whether you prefer to or not is a different story.
 
Fire and red get it and 46 explains the changes.

When I read questions about which language to learn, I get the feeling the real question is something else. I think many are interested in finding that one golden egg which will make them a coding success forever. The very fact that today's coding world has so many different languages available should scream loudly that it ain't that simple.

The world of coding is huge and complex these days, and simply knowing how to do a for loop or allocate some memory won't get one very far. The tools, which include debuggers, playgrounds, simulators, and more, are far more robust and complex. Back in the day, our debuggers were printf statements, which required the program to stop and wait for a simple keystroke before moving on.

Learning the logic and syntax of coding is just a start, the real work begins when one has to actually produce a product which can be used by a person who doesn't have a clue what you were thinking.

If you want to be good (not even great), the learning and growing is a lifetime process.
 
If you want to be good (not even great), the learning and growing is a lifetime process.

I am a definite NOVICE programmer... I learned FORTRAN, BASIC and PASCAL back in the 1980s while in high school and college (aerospace engineering degree), and then I did some rudimentary MATLAB and MATHEMATICA programming in biomedical engineering grad school. All programming came to a screeching halt when I started my 3rd year of medical school in 1996 and all of my time was spent working at the hospital and memorizing medical trivia just to survive.

But now, as an internist in his late 40s with an 8-year-old daughter who loves math and science, I am FASCINATED by what computers can do and what computer programming has become. What used to be a handful of common languages that people used has turned into dozens of commonly used languages (and hundreds of less commonly used languages) for purposes ranging from simple "Hour of Code" projects where you get BB-8 to roll around the desert and pick up scrap metal to full iOS and Mac OS X application development.

I'm learning Python right now, mostly because I asked around like the OP about what I should learn first, and people usually told me to learn Python, Ruby or JavaScript FIRST, then move on to C and/or Java, and THEN to either Swift or Objective-C. And the more I've read on this and other forums, the more online courses I've taken, the more books I've read and the more tutorials I've done, the more I realize that bjet767 is absolutely correct... like all disciplines, it's a lifelong learning process, and the people who are great are the ones who are fully committed to learning and practicing all they can. Sure, you can probably learn to COPY what other people are doing and apply it to your own CLONE of some other simplistic app, but are you really going to ever make anything great doing that? Also, will you ever be able to solve any original problems if you only learn by mimicking other programmers? It seems to me that the trick is to devote at least an hour or two per day, EVERY DAY, to learning and practicing programming. Master one language as much as you can (I've chosen Python), to the point where you can solve most programming problems you encounter. THEN, when you feel you've achieved relative mastery, move forward and try to learn three more things: Xcode, Swift and Objective-C. But if you look for shortcuts and ways to cut corners, I think you will only be cheating yourself, and your knowledge and understanding will suffer. Study it like you'll be taking some kind of licensure exam at the end... then you will UNDERSTAND it, and that's really what this whole thing is all about.

Sorry for the stream of consciousness, but bjet767's comment really struck a chord with me, as I believe it's true for every discipline, not just programming.
 
Thanks for all the advice. I'm currently concentrating on familiarising myself with Python. I'll start looking at Swift when I get a new Mac, and after getting better at Python, I'll work on Swift harder.

Good plan.
90% of Python concepts will translate directly to Swift. A couple of years down the line you could probably start with Swift, but I think it's still maturing (there are a few incompatible changes in the pipeline). The Swift "playground" idea, which continually runs the code while you edit it, could be a lovely learning environment... but I quickly found that it has a few warts (e.g. if you even in passing create an infinite loop, it hangs Xcode). There's lots of beginner's support for Python, but Swift support currently seems to concentrate on people switching from ObjC or other languages, and some of the foundation libraries are still a bit ObjC-tinged.
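To give a feel for the claim that most Python concepts carry over, here is a small Python sketch with comments noting the rough Swift analogue of each construct. The Swift snippets in the comments are illustrative from general knowledge, not taken from this thread.

```python
# Variables with inferred types -- Swift: `var total = 0`
total = 0

# A function with a default argument -- Swift: `func greet(name: String = "world") -> String`
def greet(name="world"):
    return "Hello, " + name

# A for loop over a range -- Swift: `for i in 0..<3 { total += i }`
for i in range(3):
    total += i

# A possibly-absent value -- Swift models this with optionals, e.g. `var n: Int? = nil`
maybe_n = None
if maybe_n is None:
    maybe_n = 42

# A simple class with state and a method -- Swift: `class Counter { var count = 0; ... }`
class Counter:
    def __init__(self):
        self.count = 0

    def tick(self):
        self.count += 1
        return self.count
```

The syntax differs, but the concepts (type inference, functions, loops, optional values, classes) map across almost one to one, which is why the second language comes so much faster than the first.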

Forget Objective-C: it has its strengths, but it takes its object-oriented programming model from Smalltalk, which is very different from Python, Java, etc. Although technically it is a cross-platform language, its use is predominantly for iOS and OS X programming, and Apple's focus has now shifted to Swift.
 
Definitely stick with Python for now. I 100% agree with others saying that once you master one language, the others become easy. My first exposure to programming was learning Java. That's what taught me the basics such as primitive types and object-oriented techniques. When I learned Python a while later, it was amazing how quickly I ramped up on it. It was basically just learning the syntax and the nuances of Python, because I already understood the basic concepts.
 
Just perusing the threads and thought I would comment on this:

you should be able to do all you need to do from a terminal. Whether you prefer to or not is a different story.

My question is, why would anyone want to go back to using command line tools?

I remember writing batch files, grep, and other command line stuff, and the integrated tools are far better today. These kinds of tools (integrated suites) go way back to the days of the MS C and Borland C products and have really matured. Command line stuff was used in the days of K&R coding.

However, knowing how to write batch files makes understanding the build tools a whole lot easier.
 
If you already have some experience with a C-like language, learning Objective-C might be interesting and useful – specifically, you might end up feeling it gives you some insights into the backstage workings of Swift. Swift is very tidily object-oriented and it's really safe because a lot of potential runtime errors are defined away by the language's syntax.

I don't see Objective-C ever having a practical use in the future for anyone able to learn Swift, though – so I'd only learn Objective-C at this point if you feel like your nerdiness compels you.

Objective-C's syntax is very unlike any other language I'm aware of. It was my own first language for native app development, and that made its unusualness seem normal to me at the time (and made C-like languages eventually look all foreign and messy to me), so it could also be a fun exercise in clearing your mind, the way a person learns an actual second language.
 
My question is, why would anyone want to go back to using command line tools?

If you know what you're doing, using the command line can actually be faster than IDE command keys and/or taking your fingers off the keyboard to try and accurately point at some tiny graphical icons.
 
... Objective-C's syntax is very unlike any other language I'm aware of. ...
Its message-send syntax was patterned after Smalltalk. Smalltalk uses square brackets for a completely different purpose (block closures); the enclosing brackets around an Obj-C message-send are purely an Obj-C addition, for the compiler's benefit.

The enclosing square brackets around a message-send happen to simplify parsing. First, in plain C, they're only used for subscript references. Second, it turns out that distinguishing message-send syntax from subscript-reference syntax is pretty straightforward; there's no possibility of ambiguity, because a pointer-expression must occur before a subscript-ref, and a pointer-expression CAN'T occur before a message-send (doing so would be incorrect syntax). This means a C compiler will reliably fail to compile all Obj-C message-sends, rather than silently compiling something unintended and wrong.
 
My question is, why would anyone want to go back to using command line tools?

...because they can be extremely useful and efficient when you hit the limits of what your IDE can do automatically. Graphical user interfaces are fine up to a certain level of complexity, beyond which they get a bit cumbersome and limited, and it's nicer to have a script or text-based config file that can be annotated, have sections commented out, etc.

Even if your IDE can handle your basic edit/compile/debug cycle there might be other steps you want to automate (building directory structures, creating data files, creating/initialising databases, building packages for distributions) and script files (using command line tools) are ideal.
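A minimal sketch of that kind of build-step automation, written here in Python rather than shell to match the OP's language. All the file and directory names (`build/`, `settings.txt`, `myapp.zip`) are invented for illustration.

```python
import os
import shutil
import tempfile
import zipfile

def package_build(root):
    """Create a tiny build layout under `root`, generate a data file,
    and package the result as a zip. Returns the zip's path."""
    build_dir = os.path.join(root, "build")
    os.makedirs(os.path.join(build_dir, "data"), exist_ok=True)

    # Generate a config/data file as one automated build step.
    with open(os.path.join(build_dir, "data", "settings.txt"), "w") as f:
        f.write("debug=false\n")

    # Package the build directory for distribution.
    zip_path = os.path.join(root, "myapp.zip")
    with zipfile.ZipFile(zip_path, "w") as z:
        for dirpath, _, filenames in os.walk(build_dir):
            for name in filenames:
                full = os.path.join(dirpath, name)
                z.write(full, os.path.relpath(full, build_dir))
    return zip_path

# Usage: run the whole pipeline in a throwaway directory.
tmp = tempfile.mkdtemp()
artifact = package_build(tmp)
```

The same steps are equally natural as a shell script or a makefile target; the point is that they live in a plain text file you can version, comment, and run anywhere, independent of any IDE.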

Plus, IDEs are fine if you are, say, doing an OS X or Windows app in Xcode or Visual Studio and can keep all the build details in a proprietary project file, but if you're writing for the web or cross-platform you can't rely on always having the correct IDE available to do the build. For web development you're often deploying to a remote server with no GUI.
 
Go with Swift, it's a new language that's evolving very fast.
There are tons of ways to learn it, but the one I found to be the best for a beginner is a course by Rob Percival at Udemy.com
 
My opinion is that unless you plan on working within ObjC for some reason, you should learn Swift. It's much more marketable, and it's clearly the way forward for Apple. That's not to say that ObjC isn't important, but start with the most modern language, and work backwards.
I wouldn't say it's more marketable. Still going to need Objective-C for a bit, methinks.
 