I don't see how it's about the same? Android can be coded on much cheaper computers, and the dev environments are free or cheaper too. You need a fairly modern Apple computer to do iOS development; Android can be written on a $300 laptop.

Plus, to publish iOS apps you need to be a registered Apple developer and that costs money too.

I AGREE!

I have spent more than $5000 over a short time to get fully set up for iOS development. That said, in my opinion Android is easier and cheaper...
 
Note that you may need to spend a lot more on test devices for Android if you want to test your app on at least the most popular 50% of your potential customers' device models. Developing for iOS, you can probably get by with access to around a half dozen different iOS devices; for Android, you might need several dozen. That significantly increases the cost and the test/QA time for Android.
 
As I understand it, there are 3 tiers with iOS:

Free - download Xcode and run in the simulator

Paid - $99/yr in order to have test devices and be able to upload to app store

Paid Enterprise - $??/yr in order to put your app on devices in-house for a company

.........
Enterprise is $300/yr
https://developer.apple.com/programs/ios/enterprise/

As to why you have to pay $99/yr: FWIW, it does help ensure better-quality apps are out there. However, many App Store developers are hard pressed to break even on this (you'd need about $142 in sales to net $99 after Apple takes their 30% cut). This isn't attractive to me. I'll be going with Android as my first foray into mobile development.

I would not let "which is cheaper to program" make the choice of which platform I will be using.

You need to pick the platform/hardware that fits your needs the best and go from there!:cool:
Let it be a combination of them, then? While this is certainly true, it doesn't do you any good if the tools are something only a small or larger business could afford (a few thousand dollars, or even five-figure tool costs).

If the app is for in-house use, then there is no need to publish on the app store, hence no need for a registration fee.
I don't believe this is correct. I've heard from several Apple devs that you need to pay to get your apps onto iOS devices. Period. (as opposed to Android where you can test it out on devices, and if it pans out, pay your $25 to Google Play and put it up for sale). Being able to do what you said would circumvent this...
https://developer.apple.com/programs/ios/enterprise/

And if I'm right, for Android you only pay $25.00 one time, not annually like iOS. :)
Not quite true. You don't even need to pay Google anything. You can always sell your apps without Google Play, and do so on your own via something like your own website. However, many folks would prefer GP since it's more trusted, and makes installing apps a breeze.

This is how Palm OS apps were sold (and still are, as some of the ESD's and personal webpages still allow you to purchase them!), as well as with Windows Mobile apps (that came out around 2000 to 2008... NOT to be confused with Windows Phone of today).

You can also sell on the Amazon Appstore, but they charge $100 a year and take a 30% cut. So it's like the worst of both worlds (paying Apple-style fees, but being on an Android marketplace)? :confused::eek:


The $99/annum is also a small percentage of the cost of a recent Mac plus an iOS device (or two), so it doesn't really make that much difference to someone who can afford a Mac and is serious about app development. Not a big deal in a metro area where a moderately experienced dev can charge in the neighborhood of $100+/hr, or $35k+ per iOS app.
Those who want to try their hand at iOS development do have a higher barrier to entry. I talked with one person I've known for years who took a week-long course in iOS development to get started... it IS an investment.

OTOH, an episode of The Tablet Show suggested getting a $400 to $600 or so MacBook (one that can handle iOS development, of course) and trying it out. If it doesn't work out, you're out $99 for the first year's developer fee, and you sell the MacBook at about a $100 loss. Not too large a price to pay to see if iOS development is something you'd like to do. And yeah, they suggested NOT bothering with OS X emulators unless you've got a "Mac whiz kid" who can help troubleshoot that environment for you.
 
No, never built any games except some Cocos2d tutorials. I've heard they [xbox...] charge thru the nose.
There used to be a charge of $10K per update on an Xbox Live Arcade game*. 1st one was free. I hear Sony also had something similarly outrageous. Dungeon Defenders was used as a case study for one article, and this was cited as why the Steam version got more updates, and ended up getting most of the extra characters (6 more of them IIRC), extra levels, items, and other content.

It wouldn't explain why the Android and iOS versions also lagged behind, but I suppose they had their reasons.


*Too little, too late, I hear. One podcaster talked with an indie XBLA developer, whose quote was: "Microsoft didn't drop the ball. They threw it."


I guess the whole thing works, except the discovery of new apps, but this would have been the case with or without the app store.
What would being charged to put up apps have to do with this?

If I were to have someone write an application for me, is it cheaper to have them code it for iOS or Android in general, or is it about the same? :apple:

I'm thinking the actual cost will be greatly influenced by developer experience and tools on hand.
 
What would being charged to put up apps have to do with this?

In theory, if the cost of putting an app on the app store were higher, people would be less willing to put up apps that are lower quality.

Consider: if Apple charged $10,000.00 per app per year + 50% ... developers would think hard about a simple "copy cat" app. When the cost is so cheap, people put something out there just to see how it does, hoping to get lucky and go viral.

Much like the web during the '90s ... businesses put up web pages because they saw it as free (cheap) advertising.

Example: I've been working on a new business that involves manufacturing. The cost of a die for injection molding is about $10K ... and that would be for one of about four that I would need. This made me rethink the whole thing: if I do the work and spend the money, will some company knock it off, how will I protect it (if I can protect it), and how much would that cost? Costs make people think (usually).
 
In theory, if the cost of putting an app on the app store were higher, people would be less willing to put up apps that are lower quality.

Consider: if Apple charged $10,000.00 per app per year + 50% ... developers would think hard about a simple "copy cat" app. When the cost is so cheap, people put something out there just to see how it does, hoping to get lucky and go viral.
Even though the point still stands, at that rate they wouldn't get many takers, since only large companies could really afford to be devs. Worse yet, the little guys would then consider developing for Android or any other competitor. From my observations, Palm OS and Windows Mobile (again, NOT to be confused with Windows Phone) died out partly because some of the ESD (electronic software distribution) sites took a 65% cut of all apps!

AFAIK, some of the annual cost of being an iOS developer may get offset by how the App Store appears to be more of a "go-to" place for apps than the Android platform. That attracts more of those types of devs.
 
Even though the point still stands, at that rate they wouldn't get many takers, since only large companies could really afford to be devs. Worse yet, the little guys would then consider developing for Android or any other competitor. From my observations, Palm OS and Windows Mobile (again, NOT to be confused with Windows Phone) died out partly because some of the ESD (electronic software distribution) sites took a 65% cut of all apps!

AFAIK, some of the annual cost of being an iOS developer may get offset by how the App Store appears to be more of a "go-to" place for apps than the Android platform. That attracts more of those types of devs.

IMO, it's really an issue of balance. Apple and Android have flooded their app stores. In the rush to have big numbers of apps, they've left out quality. Although Apple is known for having far fewer crapps, its store is still flooded. 65% of users download zero apps in a month.

Developers can't get discovered without first-rate marketing or real luck, which in itself is a hidden cost of having a profitable app. Many are looking for every trick in the book to find cheap, effective marketing.

IMO, Apple introduced Swift to attract new developers, yet the odds of a profitable app keep getting slimmer and slimmer.
 
I don't see how it's about the same? Android can be coded on much cheaper computers, and the dev environments are free or cheaper too.

Many of the highly experienced Android developers I've talked to recently use MacBooks anyway, so it's the same cost either way. At least one, who does lots of multi-platform mobile development for startups, charges more for Android than iOS apps because he finds performance tuning and testing takes a bit longer. But it's not cheap either way. Up-front dev tool costs are insignificant compared to several weeks/months of $50 to $200 per hour consulting fees.
 
IMO, it's really an issue of balance. Apple and Android have flooded their app stores. In the rush to have big numbers of apps, they've left out quality. Although Apple is known for having far fewer crapps, its store is still flooded. 65% of users download zero apps in a month.

Developers can't get discovered without first-rate marketing or real luck, which in itself is a hidden cost of having a profitable app. Many are looking for every trick in the book to find cheap, effective marketing.
Never mind the junk... it appears there are too many good pieces of software for them all to get noticed. Pundits and commentators alike say that even good apps fail to make it, if nothing else because they just couldn't get noticed. Nothing to do with their quality (those in question were good).

When I first got my IpT3 back in 2010, I was on a "honeymoon phase" where I spent a lot of $$ on apps. $120 the first year, $100 the 2nd year (and yeah, I did record all of my purchases b/c I knew it could get out of hand otherwise). Now I've been on an IpT5 for about 1.5 years. I've probably purchased about $10 worth of stuff this year. Has the quality of apps gone down? I wouldn't say so. However, I just have so many games in queue that unless they offer it for free, it's not even worth spending $1 on something unless I REALLY want it, as $1 here and there on apps you won't touch (pun intended I suppose) is still wasted $$.

IMO, Apple introduced Swift to attract new developers, yet the odds of a profitable app keep getting slimmer and slimmer.
I thought the bigger reason for Swift is that Apple now has a dev tool completely their own, and with that, they'll leverage the appropriate advantages. IIRC, even though there's a learning curve (as with any new programming language, really), I hear it's easier to learn and work with than Objective-C?


Many of the highly experienced Android developers I've talked to recently use MacBooks anyway, so it's the same cost either way. At least one, who does lots of multi-platform mobile development for startups, charges more for Android than iOS apps because he finds performance tuning and testing takes a bit longer. But it's not cheap either way. Up-front dev tool costs are insignificant compared to several weeks/months of $50 to $200 per hour consulting fees.
I've been keeping watch on Android vs. iOS apps around the time period from 5 years ago to 2 years ago. Whenever a game got released for both platforms, I'd compare their price points and features. I'd lean towards iOS since my IpT is primarily for gaming (not to mention using up a gift card), whereas my Galaxy s2/4 is for the essential stuff. If the former runs out of battery, then no more fun collection of games. However, if the latter runs out of battery, no more cellphone nor GPS, which can be a big deal.

I've noticed the discounts on Android tend to be better... $4.99 down to $0.99 on iOS vs. $5 down to 25c on Android. I also bought several indie games that were available on both platforms, and the Android ones were either the same price or $1 to $2 cheaper. The developers who did respond to my inquiry cited the lower cost of developing for Android as what let them charge less there. AFAIK, for Android dev there may be more hardware and software to cover, but you can easily prune down those environments by targeting Android 4.x. Even back in mid-2013, 50%+ of devices were on Android 4.x; now it should be even better. If you can dev for 2.3 without much work, then go for it. Otherwise, these days, those still on pre-4.x devices probably got their phones for free and wouldn't have been interested in your apps anyway.
 
First of all, regarding licensing:
I really can't stand people who complain about $99/year to test their apps. This is a VERY good deal. The simulator is good enough for you to try out your app and know it works. Allowing you to build an .ipa without the $99 fee would mean thousands of applications finding their way onto devices as spam, and potentially thousands of devices repaired under Apple's warranty. Here's the issue: Android doesn't warranty your Galaxy SX Alpha Mega whatever; Samsung does. Google doesn't require you to install apps exclusively through the Play Store (it's manufacturer discretion), but Apple does, and for good reason. Their lock on the developer gateway to device installation is the reason revenue-generating iOS applications punch out 85% higher revenue on a per-app basis. Despite having 50% fewer downloads per quarter, iOS generates nearly DOUBLE the revenue of Google Play.

Apple is not just the software system, it's the hardware system. And they can't jeopardize their hardware repair costs to satisfy developers that can't even afford $9 a month.

So if you put some crazy, spammy, badly written .ipa on your own iPhone under your $99 developer account and your device crashes, no problem. It's an isolated incident, and the $99 cost ensures that you're a serious developer. But remove the $99 cost? Now thousands and thousands of people are loading projects of varying quality onto their devices, mostly people who don't want to spend $99 on a developer account. If one theater charged $14.99 for a movie screening and another was "free", which place would be a bigger mess?

So now Apple has hundreds of iPhones and iPads coming into their stores every day from amateur developers who didn't want to pay the $99 developer fee, blew up their iPhones with memory leaks, poor optimization, and crashes, and Apple has to foot the bill because those users did an iTunes restore to hide their trail. To combat that, Apple would have to invest hundreds of thousands of dollars into some kind of flagging system tied to an IMEI/serial number so the Apple Store knows a device was running experimental software. And all for what? So that a bunch of cheap people don't have to pay for a developer account?

And to the people saying "they still require a name and contact information" as a response to the potential spam: are you seriously going to sit there and tell me that requiring a NAME is as viable a defense as a credit card charge? Seriously? You believe that people on the internet don't use FAKE names?

There's no reason for Apple to do this except so Apple haters can stop comparing Android SDK to iOS SDK. Well here's a news flash for you-

I, and the rest of the iOS developers who publish our software, don't mind the cost and don't care if you don't want to pay for the developer license. It's a small price to pay, and it's necessary.

Apple's iOS, despite both OSes being around for about 7 years, is much more profitable than Android. iAds revenue on free apps alone is roughly double Android's ad revenue on average.

I'm poor. I owe the government around $50,000 for student loans and $15,000 for my car loan. And I have no issue paying $99 a year, because I know the entry fee means there's a LOT more oversight and protection of the ecosystem. I wish Google would implement a lock on installing APKs from internal memory so that Android developers could have better protection from piracy. Until then, I'm iOS only.
 
Never mind the junk... it appears there are too many good pieces of software for them all to get noticed. Pundits and commentators alike say that even good apps fail to make it, if nothing else because they just couldn't get noticed. Nothing to do with their quality (those in question were good).

Exactly right, with all the development going on, even if 80% were junk, the others are still lost in the flood.

IMO, this is very sad because now it's an issue of development and marketing and maybe more marketing than development.

I thought the bigger reason for Swift is that Apple now has a dev tool completely their own, and with that, they'll leverage the appropriate advantages. IIRC, even though there's a learning curve (as with any new programming language, really), I hear it's easier to learn and work with than Objective-C?

IMO the language really doesn't matter that much. It's all the same constructs (loops, selection, iteration) that end up calling the APIs. ObjC has a great runtime and all the advanced functionality you need. Most consider Swift easier to learn, but the truth of the matter is that developing most apps involves much more than just the language. If someone were halfway through learning ObjC, there'd be little reason to switch to Swift.

It could be that Swift is better suited for some types of apps; IIRC, Apple said it was set up better for game dev.

I see computer languages like written languages. You can write a novel in English, French, or Spanish... the story isn't in the language of the book.

This doesn't address run-time VM languages vs native as far as power/speed goes, just language in general.

I've heard Swift apps are HUGE compared to ObjC; this might be a movement towards a VM-type language, like Java.

One strike against Swift is that clearly Apple had a chance to be more platform independent, yet they opted to go with a new language that does nothing toward code-reuse on other platforms.

We as developers can't ignore Android forever, some opted to go with other products that offer cross-platform, Apple ignored this need.

Apple clearly wants the great apps on their platform and to keep them off Android. This comes at a cost to the developers.

Imagine if Dell/Compaq/IBM/Acer PCs all required different code sets...:rolleyes:
 
Exactly right, with all the development going on, even if 80% were junk, the others are still lost in the flood.

IMO, this is very sad because now it's an issue of development and marketing and maybe more marketing than development.



IMO the language really doesn't matter that much. It's all the same constructs (loops, selection, iteration) that end up calling the APIs. ObjC has a great runtime and all the advanced functionality you need. Most consider Swift easier to learn, but the truth of the matter is that developing most apps involves much more than just the language. If someone were halfway through learning ObjC, there'd be little reason to switch to Swift.

It could be that Swift is better suited for some types of apps, IIRC, Apple said it was setup better for game dev.

I see computer languages like written languages. You can write a novel in English, French, or Spanish... the story isn't in the language of the book.

This doesn't address run-time VM languages vs native as far as power/speed goes, just language in general.

I've heard Swift apps are HUGE compared to ObjC, this might be a movement towards VM type language like Java.

One strike against Swift is that clearly Apple had a chance to be more platform independent, yet they opted to go with a new language that does nothing toward code-reuse on other platforms.

We as developers can't ignore Android forever, some opted to go with other products that offer cross-platform, Apple ignored this need.

Apple clearly wants the great apps on their platform and to keep them off Android. This comes at a cost to the developers.

Imagine if Dell/Compaq/IBM/Acer PCs all required different code sets...:rolleyes:

This doesn't really apply to game developers, as most Indie developers use Unity or Unreal, which have their own language systems and support exporting to various systems. I develop on Unity and I've only used Swift for maybe one or two samples so I can learn it for when I import a Unity project.

It's hardly anything game breaking. (See what I did there?)
 
IMO the language really doesn't matter that much. It's all the same constructs (loops, selection, iteration) that end up calling the APIs. ObjC has a great runtime and all the advanced functionality you need. Most consider Swift easier to learn, but the truth of the matter is that developing most apps involves much more than just the language. If someone were halfway through learning ObjC, there'd be little reason to switch to Swift.

I disagree. Syntax is one thing, but languages handle things differently.

The main languages I program are C, Objective C, Java, Perl and Swift.

Objective C required developers to understand memory management and reference counting so there weren't memory leaks or null references.

Swift goes the Java route where developers are not concerned with those details. It's a lot easier to code in Swift and Java. It's like driving an automatic car vs a stick-shift car. Let the car/language do the dirty work of shifting gears/managing memory.

In Objective-C, for example, you'd have to know the naming conventions of functions to realize that some functions hold references to objects and some don't, all depending on the name. Then you'll run into code where the developer named things contrary to convention, and you've got memory issues. Swift avoids all that.
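A minimal Swift sketch of that automatic-vs-stick-shift point (the Owner/Gadget types and the deallocation counter are made up purely for illustration): ARC does the retain/release bookkeeping for you, and a single "weak" keyword takes the place of the old convention-based ownership rules.

```swift
// Hypothetical types, just to show ARC's ownership rules in Swift.
var deallocations = 0  // counts deinit calls so we can observe ARC working

class Owner {
    var gadget: Gadget?            // strong reference: keeps the gadget alive
    deinit { deallocations += 1 }
}

class Gadget {
    // `weak` tells ARC this back-reference must NOT keep the owner alive.
    // Make it strong instead and you'd have a retain cycle: neither
    // deinit would ever run, i.e. a memory leak.
    weak var owner: Owner?
    deinit { deallocations += 1 }
}

var owner: Owner? = Owner()
owner!.gadget = Gadget()
owner!.gadget!.owner = owner
owner = nil                        // ARC frees both objects here
// deallocations is now 2 -- no manual retain/release anywhere
```

No retain/release calls appear anywhere; the compiler inserts them, which is exactly the "automatic transmission" being described.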

Some languages are type-safe, others aren't. Swift is much more type-safe than Objective-C, sometimes strict to a fault, and that gives Swift coders headaches.

Languages like Perl are a different beast altogether. Say you're dealing with two values: one contains a 5, the other the letter D. What you get by combining them depends on the operator: Perl's . (concatenation) operator produces "5D", while the numeric + operator quietly treats "D" as 0 and gives you 5. Most strictly typed languages will refuse to mix a number and a letter at all!
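For contrast, here's roughly how that same "5 and D" situation plays out in Swift (the variable names are invented for illustration): the compiler refuses to mix the types implicitly, so you have to spell out which combination you mean.

```swift
let number = 5
let letter = "D"

// `number + letter` simply doesn't compile in Swift: there is no implicit
// conversion between Int and String, unlike Perl's coercion rules.
// You must state your intent explicitly:
let concatenated = String(number) + letter   // "5D", like Perl's `.` operator
let numeric = number + (Int(letter) ?? 0)    // Int("D") is nil, so this is 5
```

The price of that safety is the extra ceremony (String(...), Int(...), nil handling), which is the "headaches" part mentioned above.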
 
Apple clearly wants the great apps on their platform and to keep them off Android. This comes at a cost to the developers.
To be fair, Google has set things up where developing Android apps has its perks and its conveniences, but they too make it difficult for you to stray away from them. I suppose I can't say I blame either of them.

Imagine if Dell/Compact/IBM/Acer PC's all required different code sets...:rolleyes:
There's always the analogy of if your car requires a specific type of gasoline that isn't readily available at most gas stations. And no, not talking about octane levels.

First of all, regarding licensing:
I really can't stand people that complain about $99/year to test their apps. This is a VERY good deal.......

$99/yr isn't that much. It isn't trivial either. Simply take it into your consideration as part of the bigger picture if you go into development.
[shrug]
 
This doesn't really apply to game developers, as most Indie developers use Unity or Unreal, which have their own language systems and support exporting to various systems. I develop on Unity and I've only used Swift for maybe one or two samples so I can learn it for when I import a Unity project.

It's hardly anything game breaking. (See what I did there?)

You're right, Unity really (from what little I know of it) hits the mark! One code set, many platforms. Maybe someday I'll dig into games just to check it out :cool:

----------

I disagree. Syntax is one thing, but languages handle things differently.

The main languages I program are C, Objective C, Java, Perl and Swift.

Objective C required developers to understand memory management and reference counting so there weren't memory leaks or null references.

Swift goes the Java route where developers are not concerned with those details. It's a lot easier to code in Swift and Java. It's like driving an automatic car vs a stick-shift car. Let the car/language do the dirty work of shifting gears/managing memory.

In Objective-C, for example, you'd have to know the naming conventions of functions to realize that some functions hold references to objects and some don't, all depending on the name. Then you'll run into code where the developer named things contrary to convention, and you've got memory issues. Swift avoids all that.

Some languages are type-safe, others aren't. Swift is much more type-safe than Objective-C, sometimes strict to a fault, and that gives Swift coders headaches.

Languages like Perl are a different beast altogether. Say you're dealing with two values: one contains a 5, the other the letter D. What you get by combining them depends on the operator: Perl's . (concatenation) operator produces "5D", while the numeric + operator quietly treats "D" as 0 and gives you 5. Most strictly typed languages will refuse to mix a number and a letter at all!

I can't claim to be an expert, but as I understand it, manual ref counting is pretty much a thing of the past with ARC. However, there's still all the strong, weak, unowned, etc...

Part of the point is that each program has an amount of code that does work and an amount that just calls API functions/methods.

In other words, you can do a pretty nice program with not too advanced coding in the language, just making API calls and the language just controls the calls.

Other programs can require very advanced knowledge of the language because the developer wants to do things a certain way that's not covered by the API or some "settings".

This seems to change over time. I can remember writing "button" routines many years ago... now buttons are drag-n-drop and check a few boxes.

IMO, this is really an issue of what you want the program to do vs what comes stock in the APIs.

But I do agree that ObjC has more to learn than most, it didn't seem to catch on till iOS caught on. I really can't compare to Swift, as I've only read part of one book.
 
If I were to have someone write an application for me, is it cheaper to have them code it for iOS or Android in general, or is it about the same? :apple:

I believe the question is too vague to answer.

The platform is not the only factor that determines the cost. As a developer myself, I would charge differently if asked to implement a fancy address book vs. a 3D game. There's a different degree of difficulty in each, and there are also different areas/fields each developer feels more comfortable with.

The size of your project also plays an important role. If your project requires months of implementation, then the rest of the costs mentioned in this thread (Apple subscription, or buying a mac) would be insignificant compared to the development cost.

If you provide more details I think you have a better chance to get more helpful answers.

Hope that helps.
 
While I've heard Unity helps with making cross-platform apps for Android and iOS from the get-go (especially now that the modules to export to them are free), I've heard that for non-game apps it may not be the way to go.
Worth noting: you'll still need access to an OS X environment and the iOS developer subscription.
 
One issue I haven't heard discussed here is a comparison of the APIs. Does that mean the Android and iOS APIs are about equal in their power/ease of use?

I've heard scathing reports about Android's dev environment, but I don't hear anyone mentioning that here either.

Maybe someone who's made an app for both platforms can chime in on the native tools offered (Eclipse vs. Xcode).
 
One issue I haven't heard discussed here is a comparison of the APIs. Does that mean the Android and iOS APIs are about equal in their power/ease of use?

I've heard scathing reports about Android's dev environment, but I don't hear anyone mentioning that here either.

Maybe someone who's made an app for both platforms can chime in on the native tools offered (Eclipse vs. Xcode).

I think it is difficult to make a comparison of the APIs since this means different things for each application (i.e. window kits APIs might be different, opengl APIs are pretty much the same etc)

AFAIK the Android environment was Eclipse until last year, but now they have moved to their own IDE (Android Studio), which is based on IntelliJ IDEA. I have experience with Java and Eclipse, but not Android Studio.

Eclipse is a general-purpose Java dev environment and has plugins to handle the specifics of a framework. Xcode, on the other hand, is a tool focused specifically on iOS and Mac applications. It really depends on what each developer is used to and their level of experience.

I am about to start porting my iOS game to Android so I would probably have more personal experience on this soon :)
 