
gnasher729

Suspended
Nov 25, 2005
17,980
5,566
You can indeed separate the implementation from the interface in Java - write the interface as an interface ("public interface MyAPI { ... }") and the implementation as "public class MyAPIImpl implements MyAPI { ... }" - so I'm not sure what forces you to see the code (versus the API alone). Yes, there are a lot of programmers who don't create the interface separately from the implementation, or don't even think much about what the interface means, or about its importance in keeping these concerns separate and the interface as hardened and well-defined as possible, but that's by choice. The language does let you keep the API separate from the implementation (and a lot of good source code I've seen, e.g. JBoss AS, does exactly this for every class it ships).

Java does not insist you mix the code w/the interface - the programmer does however :) (unwittingly or not).
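The separation described above can be sketched roughly like this (MyAPI and MyAPIImpl are the names from the post; greet() is a hypothetical method added for illustration):

```java
// Callers depend only on the interface; the implementation can change freely.
interface MyAPI {
    String greet(String name);
}

// The implementation class is the only place the "code" lives.
class MyAPIImpl implements MyAPI {
    @Override
    public String greet(String name) {
        return "Hello, " + name;
    }
}

public class Demo {
    public static void main(String[] args) {
        // Client code is written against MyAPI, never MyAPIImpl.
        MyAPI api = new MyAPIImpl();
        System.out.println(api.greet("world"));
    }
}
```

A client compiled against MyAPI alone never needs to see MyAPIImpl's source at all.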

You can have Java interface with either C or C++, without a problem, via JNI, and it works quite well.
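As a rough sketch of what the Java side of a JNI binding looks like - the class and method names here are hypothetical, and the matching C implementation (plus the System.loadLibrary call that would load it) is omitted so the sketch stands alone:

```java
// Java side of a JNI binding (hypothetical names; the C side is omitted).
public class NativeClock {
    // Implemented in C via JNI; the generated header would expect a
    // C function named Java_NativeClock_nativeTime.
    public static native long nativeTime();

    public static void main(String[] args) {
        // A real binding would call System.loadLibrary("nativeclock") in a
        // static initializer before touching nativeTime(); that is left out
        // here so this sketch compiles and runs without the native library.
        System.out.println("native method declared: nativeTime");
    }
}
```

This is the extra layer the reply below objects to: the native method declaration, the generated header, and the C glue function all have to be kept in sync with the library being wrapped.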

The first point means I have to add an abstraction layer that doesn't serve any purpose at all except creating an abstraction layer (I think there is a book about "anti-patterns" that would have a word or two to say about that).

And JNI forces me to put another implementation layer between the caller and existing libraries, serving no purpose other than making accessible code that I could already use at no cost from Objective-C.

In both cases, there is another set of maintenance tasks added, since every change in any interface now requires more changes in more places than it used to.
 

jsw

Moderator emeritus
Mar 16, 2004
22,910
44
Andover, MA
I complain that few people are learning assembler because they are also not learning how computers truly work, and have very little respect for optimization and how to properly organize code to get the job done without constantly faulting.
Certainly, this is important in many areas. I wouldn't want to hire IDE jockeys to do RTOS work and whatnot. But in a world with multiple 2+GHz CPUs and multiple GB of RAM, optimization just isn't nearly as important as it used to be.
You're missing something there... It's true that I don't know how to do most of those things, and I can make good toast; HOWEVER, I am not in the business of producing toasters. If you are making toasters, you probably know quite a bit about those things. Similarly, programmers SHOULD understand how to get to their end result.

Remember, tools are nice, but if you replace your education/knowledge with your tools and drop the first part, your worth and ease of getting a better job goes way down.
I didn't mean to imply that poor programmers become good programmers via IDEs. All I meant was that it's the end result that matters. If someone depends on an IDE to recall all the packages and appropriate method parameters, yet they produce good results, I see no problem with that at all.

It used to be that my interview questions included lots about language specifics, to see how well someone knew the language. However, now that IDEs do so much of the detail work as well as provide semi-automated testing, I'm far more interested in the approach someone would take - the high-level, nearly language-agnostic approach - because I find that to be a better predictor of ultimate results than someone's detailed knowledge of memory usage.

I believe that one needs to be a critical thinker and good problem solver as well as good communicator to be a good programmer. I simply disagree that the amount someone depends on an IDE to do language-specific things usually matters for most projects. I think you can take a good Language A programmer and turn them into a good Language B programmer quickly with the right IDE if they understand the basics of good design instead of just the intimate details of Language A.

So all I really meant was that - except for cases where intimate language knowledge is important - IDEs allow more people to program than before and do a better job of it.

I know it's the manly thing to use vi and never see anything except as text on the console screen, but a good IDE can make someone immensely more productive. Refactoring is cake, for example.

But, absolutely, if someone can't come up with the right way to do something, all the IDE will do is let them come up with a language-acceptable piece of junk code.
 

chelsel

macrumors 6502
Original poster
May 24, 2007
458
232
I know it's the manly thing to use vi and never see anything except as text on the console screen, but a good IDE can make someone immensely more productive. Refactoring is cake, for example.

No joke, as a matter of fact I use Eclipse and viPlugin http://www.satokar.com/viplugin/ "'cause editing still is the way to code"

Great tool for Eclipse developers who still like to edit with vi. I see way too many programmers reaching for the mouse to highlight text, press the copy keys, move the mouse, press paste... if you know how to use bookmarks and a few simple commands you can be very productive with just the keyboard.
 

Macgenie

macrumors newbie
Jan 8, 2008
14
0
Suffolk, UK
With Xcode and Objective-C we can all create a quick and dirty app that runs OK - it could leak memory like a sieve but still work; after all, once the app stops running, OS X cleans up and all is well, so why worry?

The point is that a programmer who knows and understands the language should write code that obeys the rules, controls memory use, and generally behaves itself - using reusable code such as the Cocoa/AppKit frameworks helps greatly to reduce errors and conflicts, since we are building on stuff that has been tested and shown to work. I see that the latest version of Xcode with Leopard features garbage collection, so we move ever closer to that golden day when we don't have to worry about memory at all.

With the inclusion of the bindings system, delegate methods activated from within the system, dynamic runtime binding, etc., there has never been a better time to get involved with Objective-C - all software systems should be written this way (if we ever get true multi-platform capability, everyone will discover the benefits of Objective-C and planet Earth will never be the same again!)

Most of us can remember the bad old days of Windows machines forever needing software reloads and PCs crashing due to hardware conflicts (ever tried playing with the Registry? What a nightmare that was). The point of Objective-C is that it builds on a solid foundation - when you subclass, you can be sure that your class instance will work, and if there are known bugs, methods can be overridden with your own anyway; you just have to keep up to date with the documentation.
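The same override mechanism exists in Java, so the idea of working around a known bug by subclassing can be sketched there too (Formatter and its off-by-one bug are invented purely for illustration):

```java
// Base class with a (deliberately invented) known bug in pad().
class Formatter {
    String pad(String s, int width) {
        // Buggy: pads to width - 1 instead of width.
        return String.format("%" + (width - 1) + "s", s);
    }
}

// Subclass overrides the buggy method with a corrected version;
// all other inherited behaviour is kept as-is.
class FixedFormatter extends Formatter {
    @Override
    String pad(String s, int width) {
        return String.format("%" + width + "s", s);
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Formatter f = new FixedFormatter();
        System.out.println("[" + f.pad("ok", 4) + "]");
    }
}
```

Callers keep using the base type; only the construction site needs to know about the fixed subclass.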

New versions of software can never be completely bug-free, since there isn't enough time to try out all the things that can go wrong - but in the main, Macs have fewer problems than some systems I could mention, and I would sooner play around with Objective-C and the AppKit any day - what a tremendous amount of power and flexibility you have under your fingertips!

As for not learning assembler any more - the whole point of a language like Objective-C is that it frees us from low-level I/O issues, hugely complicated port/chip architecture issues, and procedural work at a very low level of abstraction (i.e. you would have to know a machine's innards absolutely like the back of your hand) - I could go on and on here. Intel chips these days are vastly more complicated than the humble old 6502 CPU; by the time I had absorbed all the implications of assembler on today's machines I would probably have a long grey beard.

Back when I started programming it would take a couple of weeks to write a 2K (yes, roughly 2,000 bytes) assembler program for a 6502 CPU - you would have to know the CPU architecture inside out, as well as the configuration of I/O ports etc. - the thought of having to grapple with the innards of a Mac in this manner would send me deeeeply back into my cave.
Long live Objective C !

Having said that, starting out with 6502 assembler does give you an appreciation of today's level of technology. If you study a high-level language like Objective-C and learn it as you should, then there's no reason why anyone should write shoddy software.

:)
 