There is no right or wrong answer really, and a lot depends on the person.
Me, I started with BASIC on my Commodore VIC-20 (and then on an Amstrad CPC464, something us Brits used to be able to get). That was pretty much self-taught from the manuals that came with the machines.
Moved on to Pascal at school when I was 15-18. That taught me how to write proper procedural code (though, to be fair, I was already doing a reasonably good job of writing pseudo-procedures in BASIC).
Pascal led on to Object Pascal/Delphi in my first year at university.
C for my second year at university (plus a bit of Prolog). That's where I learnt about memory management.
I decided to use Java (1.0 and 1.1) for my third year project.
In the meantime, at work, I've done Java, C, Visual Basic and PL/SQL.
The point is, though... those later workplace languages were relatively easy to pick up, because I'd already learnt about procedural languages, OO languages, memory management, bits of GUI work and the basics of good algorithms.
Could I be a better programmer if I'd done it some other way... who knows.
My recommendations? Hmmm, I think there's an advantage to starting in an Interpreted Language* rather than a Compiled one. Ruby would be a good start, because Ruby is well supported on OS X, and it's a better language than the BASIC I started with.
If the original poster is willing to stick with it, then it's time to move on to something compiled*. At that point you're looking at Java or C/Objective-C. My gut feeling is that if you've got the gist of how to program, say from experience with Ruby, then it wouldn't be that difficult to learn Objective-C (and thus C at the same time).
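Just to give a flavour of the difference (only a sketch, and I'm assuming the Apple developer tools are installed, which give you gcc in the Terminal): with a compiled language you write the source to a file, compile it into a binary, and then run that binary, rather than just running the script the way you would with Ruby.

/* hello.c - the classic first C program, here just to show the compile/run cycle */
#include <stdio.h>

int main(void)
{
    /* printf writes a line of text to standard output */
    printf("Hello from a compiled language!\n");
    return 0; /* returning 0 tells the shell everything went fine */
}

You'd build and run it from the Terminal with something like "gcc hello.c -o hello" followed by "./hello", whereas a Ruby script is run directly ("ruby hello.rb") with no separate compile step.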
Of course, YMMV!
* I've supplied Wikipedia links so the original poster knows what's being talked about...