Swift apps bundle the whole runtime with them, so OS updates will not break them. This will get better once the language is stable.
One of the problems with large app size is that devices have limited storage and people don't like large downloads. There's an advantage to a quick "I'll check it out" versus a long download that might be on someone's data plan.

I don't think it'll ever happen, but a shared runtime like Windows would really cut down on the app size. Maybe the OS can have more API stuff in it.
 
But I searched for differences and found that Swift actually seems to be slightly slower in execution than ObjC.

Swift is now open source. If the inputs to the two benchmarks are actually logically identical (they might not be), and you found a place where one compiler lacks a given optimization, you can contribute a fix to the compiler to make the results identical (same machine code). This will continue to happen over time (via other programmers, if you can't do it yourself).

And if you think the two languages and programs are logically identical, look extremely carefully at the precise definition of each language token and statement; the odd, quirky corner cases are usually what slow down a compiler and its output.
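To make that concrete, here is one such corner case, sketched in Swift (the comparison with C semantics is my own illustration, not taken from the benchmark above): Swift defines ordinary `+` to trap on integer overflow, so the compiler must emit an overflow check that plain C unsigned arithmetic doesn't need, while the wrapping operator `&+` opts out of that check.

```swift
// Swift's `+` traps at runtime on integer overflow, so every unchecked
// addition carries a hidden branch; `&+` wraps around like C unsigned
// arithmetic and needs no check. "Logically identical" programs can
// therefore compile to different machine code.
let x: UInt8 = 250
let wrapped = x &+ 10   // 260 mod 256 = 4
print(wrapped)
// `x + 10` would trap here rather than wrap.
```

Subtle token-level differences like this are exactly why two "identical" benchmarks may not produce identical code.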

A stable ABI might be on the roadmap for Swift 3.0 or 4.0. But Apple is leery of certain types of shared libraries due to security and version dependency hell issues.
 
That's exactly the problem: security and version dependency. I can't see Apple doing that. The good news is that devices will probably get more storage and people will get higher data-transfer speeds, so that'll cover things.
 

From my experience, most users do not care much about an app's size as long as it is reasonable. If they run out of space, they'll focus on generated content plus apps of 100 MB+, since those are presented to them first in the Usage menu; most won't spend the time to wipe a gig's worth of 40 MB apps. At the same time, app sizes are shrinking while data caps, bandwidth, and speeds are going up.

There are a few practical reasons that generally outweigh the need for space conservation in the Swift API, one of the bigger ones being that the rapidly changing API would wreak havoc on existing apps in the Store during changes. The impact would be devastating given the current App Store review process, and muting those changes would basically kill the essence of Swift.

The storage limitations are being presented like the RAM limitations were during the birth of iOS, and I think features like App Thinning, while not all-encompassing, were aimed at providing solutions for that convenience given to developers.
 
I think you're right, it's more about what they want than it is about the size.
The only impact I can see is that some larger unknown apps might be passed over because of size, but that's no big deal because most unknown apps get passed over anyway :D
 
The Swift runtime weighs around 8 MB (compared to around 120 KB for Objective-C), but the generated Swift binaries are smaller. 8 MB is the size of a modern web page filled with ads, so it is not that big of a deal. Still, smaller is better.
 
Hold it!

Didn't another poster state that the problem with Objective-C was the runtime, and now another points out that Swift has a huge runtime when compared to Objective-C?

This thread points out the problem with opinions. Objective-C is a very mature, robust, and widely used language, while Swift is becoming the next coding language pushed on people for a multitude of reasons.

Here's what I am reading:
Swift seems to be about corporate agreements (IBM and Apple).
It really isn't a better or more efficient language, just another good one.
It is marketing-based, meant to draw newer and hipper young coders into the fold.
In the end, coding is easy while design is very difficult.
Most of us know enough about the inner workings of the two OSes, OS X and iOS, to be dangerous but not accurate.

So keep calm and code on.

I've learned a bit more about stuff through this thread, thank you all.
 
@KarlJay

You were 100% correct in your initial assessment of Swift.

Swift would be a fine replacement for backend languages like Python, Java, C#, etc., but it doesn't belong anywhere near Cocoa. It simply is not flexible enough; it doesn't have the dynamism and power of ObjC (it was written by a compiler guy, what do you expect?). The "seamless integration between Swift and ObjC" is anything but seamless. I would say that it is safer to write a program in Objective-C, because then you don't have to worry about the unexpected behavior of the seamless Swift-ObjC bridge.

There is no way that Cocoa could have been built without ObjC. If you tried to build the equivalent of Cocoa in Swift, the end result would not be very appealing, because of the way the language tries to limit what you can do (kind of like having excessive government regulation, if you can follow that analogy).

ObjC was fast enough to create a very nice UI experience on the original iPhone which only had 128 MB of RAM. I don't know that the same could be said of Swift. People don't seem to realize how bloated iOS has become with iOS7+.
 
Now that scares me, as I'm hoping that Android adopts Swift as a first-class language. There have been rumors to that effect, and I'd LOVE to see one language work natively on both Android and iPhone. I'd like it to be Cocoa/ObjC, but Swift would be fine too.

I've been keeping track of the job market, and it seems that Swift is gaining a pretty solid following. It's sad to hear that it's not all hammered out by this point.

I've been using ObjC for a while now, and I'm not really looking forward to adopting a new language that isn't both strong and stable.

I'm wondering why it's having trouble, given that it's open source now. Being an open-source language that could cross between the two most popular mobile platforms should make it a great project for the open-source community.
 
but it doesn't belong anywhere near Cocoa. It simply is not flexible enough...

Flexibility is no longer the priority. Cocoa is mature and has more built-in. And even the Apple Watch is far more powerful than the original iMac or iPhone. Safety from exploitable bugs and optimized performance for battery life are now more important to Cocoa and Cocoa Touch apps, and Swift is at least slightly better at those requirements than ObjC. It may also become easier to learn, once the language becomes stable.

All programming languages s*ck compared to a programming language you know well. True even if the only language you currently know is BASIC or Forth. Maybe not true for Asm.
 
There does seem to be a pattern with programmers and the language they know.

IMO, it has to do with the amount of time it takes to learn one system vs. another. I hated ObjC (still not my favorite) because I came from the C/C++/C# world and was used to that. Didn't like (still don't) named parameters.

It sucks to go from being functional in one language and have to start over with some other language.

Imagine learning English, then having to switch to Latin, then to French. Each time having to learn all the new rules.

I pray for the day when Android and iPhone use the same native-code language. Is that too damn much to ask for?

We've landed on Mars and we can't make a native language for both Android and iPhone? Who's in charge here?
 

I can't remember if I said it in this thread, or another... I still think that with the way Swift is designed, we're eventually going to get all-new system frameworks that aren't designed particularly for OOP. Or at the very least, frameworks that are like a new family of Cocoa. Someone else disagreed with me.

The Obj-C bridge introduces problems and other annoyances that seem odd to carry forward indefinitely. I suspect that because of this reality, and all of the other features and intentions surrounding Swift, Apple will eventually make a break from the past and completely reimagine many of the concepts, ideas, and design of their existing frameworks.

It could be Swift-native/friendly versions of Cocoa. It could be an entirely new concept inspired from the thinking of frameworks like React.

As an aside: months later, and I still think that in many ways Swift is a bag of hurt. If you have REALLY looked into the language in depth (read a book on it or completed a video course), it can get pretty complex, and at times I thought the type system was becoming a ridiculous impediment to writing sensible code.

Slightly off topic, have any of you been following the "sealed class" Swift controversy that just unfolded?
 
The more I hear from you and others, the more I question the wisdom of the whole Swift rollout.

IMO, it's been rushed, and that's odd coming from Apple. They've been known for great rollouts, yet Swift seems to have been a "we have to hurry up and put something out there before we lose any chance of getting traction."

Somewhat simple things, like the interface with Cocoa: how could that be overlooked?

They fumbled with the Maps deal back when, they clearly needed to cut the cord from Google maps, but it flopped then flew.

It's a bit different with a programming language that controls the apps and the apps are such an important part of the whole package.

You mentioned OOP and how Swift introduced function-based programming back into the mix. This is interesting because I started with function-based programming and had no problem with it. It was actually very easy to pass data around, with none of the issues with calling functions/methods that you have in OOP.

I've always seen OOP as a set of rules that the programmer makes using classes and private/public. The compiler then enforces those rules. I see this as a solution to the crappy programs of the past that were poorly written and hard to follow.
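As a small illustration of that idea (my own sketch, with made-up names, not from any poster's code): access control turns the rules the programmer writes down into checks the compiler enforces on every caller.

```swift
// The programmer states the rule (`balance` is private); the compiler
// enforces it everywhere, so callers must go through `deposit`.
struct BankAccount {
    private var balance = 0
    mutating func deposit(_ amount: Int) {
        precondition(amount > 0, "deposits must be positive")
        balance += amount
    }
    func currentBalance() -> Int { balance }
}

var account = BankAccount()
account.deposit(50)
print(account.currentBalance())
// account.balance = 1_000_000  // compile error: 'balance' is inaccessible
```

The commented-out last line is the "red light" being enforced: the rule is stated once and the compiler stops everyone from running it.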

However, we've traded complex source code for a complex, non-standardized solution that many find confusing.

Hard to know if we've actually made things better or not.

At least Swift is open source; someday it might become great and be adopted on many platforms, like C/C++.
 
Somewhat simple things, like the interface with Cocoa: how could that be overlooked?

I suspect that what went on is that Swift was a little internal skunkworks project. Apple compiler engineer(s) made it as an experiment, or for something like yearly performance goals. The team kept building on it and playing around with it to suit this environment (which didn't need to consider Cocoa at all), and it just morphed into their new language. They later tacked on some interoperability with Objective-C when it was decided that that was needed in order to ship it as a language useful to Apple devs.

My imagining of it, at least.

They fumbled with the Maps deal back when, they clearly needed to cut the cord from Google maps, but it flopped then flew.

Yep!

It's a bit different with a programming language that controls the apps and the apps are such an important part of the whole package.

Agreed 100%.

You mentioned OOP and how Swift introduced function-based programming back into the mix. This is interesting because I started with function-based programming and had no problem with it. It was actually very easy to pass data around, with none of the issues with calling functions/methods that you have in OOP.

I've always seen OOP as a set of rules that the programmer makes using classes and private/public. The compiler then enforces those rules. I see this as a solution to the crappy programs of the past that were poorly written and hard to follow.

It's more than just public/private and subclassing/inheritance. Object-oriented programming is more about object design patterns than anything else. Take NSNotificationCenter: that is a design pattern known as "pub-sub" or "observer." Delegation, as you see in UITableView and NSTableView: another design pattern. Protocols (interfaces in other languages like Java and C#): yet another design pattern. MVC, MVVM, MVC-N, singleton, class clusters... the list goes on. There must be 20-30 well-known patterns by now.
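For instance, delegation, the pattern behind UITableView's delegate and dataSource, reduces to very little Swift. This is a hedged sketch with hypothetical names (Downloader, DownloadDelegate, Screen), just to show the shape of the pattern:

```swift
// Delegation: the Downloader doesn't know who its delegate is,
// only that it conforms to the protocol. (Hypothetical names.)
protocol DownloadDelegate: AnyObject {
    func downloadDidFinish(_ result: String)
}

final class Downloader {
    // `weak` avoids a retain cycle, as with Cocoa's delegate properties.
    weak var delegate: DownloadDelegate?
    func start() {
        // Real work elided; report completion to whoever is listening.
        delegate?.downloadDidFinish("done")
    }
}

final class Screen: DownloadDelegate {
    var lastResult: String?
    func downloadDidFinish(_ result: String) { lastResult = result }
}

let screen = Screen()
let downloader = Downloader()
downloader.delegate = screen
downloader.start()
print(screen.lastResult ?? "nil")
```

The decoupling is the point: Downloader can be reused with any conforming delegate, which is exactly what Cocoa does across its frameworks.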

Cocoa is just like, all design patterns. If you are a Cocoa programmer, or interested, check this book out: https://www.amazon.com/Cocoa-Design...9156731&sr=8-1&keywords=cocoa+design+patterns

(I read most of that book, it's pretty good.)

In languages like C++ with a true, static typing system, object design patterns are needed in order to achieve decoupling and abstraction, and to let you work around the typing system.

Swift takes the static route (academic), like C++. Objective-C took the dynamic route (pragmatic) like Python and JavaScript. People on this thread (and others) and blogs have been arguing about this since Swift's release... whether Swift is more academic than pragmatic and useful.

Slightly off topic, there was a recent (big) blowout about the Swift static typing system, with older programmers remembering the "bad old days" where they had to route click events through switch statements, and debating back and forth on this issue. And even more recently there was the whole "sealed class" thing I posted about above.

However, we've traded complex source code for a complex, non-standardized solution that many find confusing.

Hard to know if we've actually made things better or not.

Agreed. We have traded one set of problems for another. I think at this point, the only thing that can be argued is which set of problems is easier to deal with.
 
As an Amazon Associate, MacRumors earns a commission from qualifying purchases made through links in this post.
One of the issues that doesn't seem to be talked about much is that OOP and MVC (and others) are set up in such a way that you have to take some time to understand not only how to use them, but why they are designed the way they are.

The result of this "additional learning" needed to really understand things is that you have a group of "I have a million-dollar idea for an app; do I have to program to make an app?" people. They see these rules like some see traffic laws.

Traffic laws are an example of having both the solution and the problem at the same time. In advanced problem solving, there's a condition where you hold both the solution to a problem and the problem itself.

Consider: traffic laws are designed to stop accidents, yet we still have accidents. Why? Many accidents are caused by humans not obeying the traffic laws.

Just the same, you have rules for how to design classes, methods, and databases. Some don't want to be "bothered" with these rules because they have a "million-dollar idea for an app" mindset. Much like the guy who doesn't want to sit at the red light, so he runs it.

I've read many books at this point, and I saw one not long ago that looked like a great painting in how the classes/methods were laid out. I looked at it for a good while and thought, wow, I wish I had known that before :D

Given that, I wasn't impressed when I learned that the Swift people were removing the C-style "for" loop because it "made things easier"... It's a signal that they are going the wrong way. It's not about the syntax; it's about making the rules easier to understand. The for loop has been around for a long, long time and it works. Focus on making a stable and easier-to-understand implementation of OOP.
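For reference, the change in question (Swift Evolution proposal SE-0007) removed the C-style `for ;;` loop in Swift 3; the range-based replacements look like this:

```swift
// Removed in Swift 3 (SE-0007):
//   for var i = 0; i < 5; i += 1 { ... }
// Replacement: range-based loops for the common case...
var squares: [Int] = []
for i in 0..<5 {
    squares.append(i * i)
}
print(squares)

// ...and stride(from:to:by:) for non-unit steps.
let evens = Array(stride(from: 0, to: 10, by: 2))
print(evens)
```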

Catering to the "I have a million-dollar app idea; do I have to program?" people might draw in more people, but that's like saying more people should drive cars, and we end up with gridlock.

In this we have the most popular graveyard in the world... the App Store. The place people go to have their hopes and dreams crushed.

http://www.readygadgets.com/the-apple-app-store-graveyard/

 
whether Swift is more academic than pragmatic and useful.

Funny how everything goes around in circles. Pity that most people don't stick around long enough to recognise (and appreciate) them.

We had these very same sorts of debates 30 years ago. Back then it was PL/1, Fortran, Modula-2, Pascal, C,...

PL/1 had a built-in function for every conceivable computing problem at the time. Fortran had a code library for every conceivable computing problem at the time. Modula-2 and Pascal filled the programming text books. C filled the machines with code.
 
I started a software company using Borland's Turbo Pascal when I was in college. I stumbled on a "make com file" option in the menu and realized I could put a program onto a disk and sell it to people.

I used to go around and sell custom business software written in Pascal, then moved over to a dBase compiler (Clipper) and did well with that for a number of years. Those were the solutions of the time.

I remember when Microsoft said Visual FoxPro would be the business development platform of the future.

I'd say that it really hasn't gotten easier over the years. We can now "drag and drop" table browsers, but the people always want more.

I'd be happy if they'd just focus on universal OOP. I can't see a benefit to one language having a different set of rules for private/public method creation or for communication between classes.

Time spent learning the ins and outs of one language is time that could have been spent making better products.
 
There comes a time when you learn a new programming language either instinctively, or not at all (because you disagree with the engineering approach of its creator).

Turbo Pascal 3 was a watershed moment in programming for me; it defined the moment when programming became so much fun that it was addictive and never tiring. The integrated edit-compile-edit-compile-edit-compile-run cycle was just too cool. I got a Z80 card for the Apple II just so I could run Turbo Pascal.
 
I actually had to buy both the Mac and IBM versions because I had a customer who wouldn't switch. Never got the sale; I was too busy to complete the job.

Yeah, they used to brag about how fast it was, hence the name Turbo. It's pretty amazing that a company like Borland would end up dying off after being a true competitor to Microsoft. Just shows how quickly things can change.
 
One of the issues that doesn't seem to be talked about much is that OOP and MVC (and others) are set up in such a way that you have to take some time to understand not only how to use them, but why they are designed the way they are.

The result of this "additional learning" needed to really understand things is that you have a group of "I have a million-dollar idea for an app; do I have to program to make an app?" people. They see these rules like some see traffic laws.

Agreed! And that's why I cringe every time Apple execs get on stage telling the world how easy it is to program and develop apps. I guess it's good for hype though.

Throw in an article or two like this: http://www.wired.com/2014/07/apple-swift/ ... and make even more hype. I stopped reading Wired magazine after they published that one.

Given that, I wasn't impressed when I learned that the Swift people were removing the C-style "for" loop because it "made things easier"... It's a signal that they are going the wrong way. It's not about the syntax; it's about making the rules easier to understand. The for loop has been around for a long, long time and it works. Focus on making a stable and easier-to-understand implementation of OOP.

Yep! Despite Objective-C's strange-looking syntax, it does have a pretty small rule set. I've long held the notion that Swift's syntax and rules are there mostly because they make language purists feel good when looking at it. I doubt Swift's engineers and developers at Apple are considering the cognitive side of things for Swift developers (new ones, and those experienced in other languages). An internal culture of "Works/makes sense to me!"

(I could be wrong on that last point, but I would be incredibly surprised!)

In this we have the most popular graveyard in the world... the App Store. The place people go to have their hopes and dreams crushed.

Turns out developing applications and selling them is just as hard as trying to be a successful musician or actor. I think people have lost sight of that and think the App Store is a magical place where these realities don't apply. It's not always so easy or so clean.

Time spent learning the ins and outs of one language is time that could have been spent making better products.

Again, agreed!
 

When I was learning ObjC, I never actually thought it was a bad language. I was coming from a C background, but I was also interested in Lisp. I understood the syntax fine but not the "philosophy." I started using a modified Nu (by Tim Burks), a Lisp interpreter written in ObjC, to build apps, because I thought it would let me develop on the phone live without having to recompile. Then I made a JavaScript-ObjC bridge using JavaScriptCore (it's actually really easy to do) and tried using that to make apps. Then I started to dig into the ObjC runtime and finally realized how flexible it was: I could write my own scripting language in maybe a few hundred lines of code. I learned that ObjC was inspired by Smalltalk, which I didn't really understand at first. I now see Lisp as a high-level assembly, but I really like the design of Smalltalk, and I became interested in Alan Kay.

I've never had any of these thoughts while learning Swift. I remember thinking why is it so hard to process JSON? How do I do Key-Value Observing? I already unwrapped that optional, you mean I have to unwrap it again? Why is the \() string interpolation syntax so hard for me to read? Why can't I use a String object as my delegate? Why can't I use a Class object as my delegate? Why is a String a struct and not a class? Why do I always seem to have to look up the name of every single Cocoa method I want to use? Swift doesn't make my life easier, but I bet it makes the compiler writer's life easier.
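To make the JSON complaint concrete, here is a sketch (with made-up data, and showing the hand-rolled style being described rather than any particular library) of how every dictionary lookup and downcast produces yet another optional to unwrap:

```swift
// Hypothetical payload; each subscript returns an optional and each
// cast can fail, so conditional binds pile up at every level.
let json: [String: Any] = ["user": ["name": "Karl"] as [String: Any]]

var name: String?
if let user = json["user"] as? [String: Any],
   let value = user["name"] as? String {
    name = value
}
print(name ?? "missing")
```

One `if let` chain per level of nesting is the cost of the type system refusing to trust the data's shape.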

I think flexibility should always be a priority, because without flexibility there is no progress or innovation. Jailbreaking would never have become as popular as it did if it were not for the dynamic nature of ObjC and the ability to swizzle methods and perform introspection, because that's what allows you to create tweaks. Swift doesn't have that ability.
 

Interesting point about making life easier. I learned a very harsh lesson many years ago. I wrote programs that were "correct", meaning they did the job they were supposed to do. The problem was that the users didn't like them, because they were too hard to use. If a person made a mistake, it was too hard to fix the mistake. It was a harsh lesson for me because I had "done my job" and yet the customer wasn't happy.

I realized that there is a balance between the work the programmer does and the work the end user has to do to use the program. Later, I started doing a lot more data validation at the point of entry and showing more information on the screen so that people didn't have to remember so much.

This is an important area of software development that is often NOT talked about. People are taught how to make the program do this and that, but not so much about making the program work well for the people who use it.

Look at all the "To Do" list apps that hit the app store, yet few really made any difference at all.

I downloaded a PDF reader from Adobe years ago; every time I went from one page to the next, the zoom level would reset and I had to set it again. I found another reader that wouldn't show my documents unless I sent them to it by email or set them up in iTunes, and the instructions sucked.

Point: there's much more to programming than just "how do you make it do this or that". Same thing with a language.

From what I hear, Swift doesn't support swizzling and doesn't play well with the runtime and Cocoa. That would be Apple's fault. It looks like they might have tried too hard to get more programmers into the mix.

Another issue is that as a company becomes more and more advanced, they get further and further from the core of the language as they write more and more routines that bring them to a higher level.

It's like writing sort algorithms: you probably shouldn't be writing sort algorithms at this point, you should be working at a higher level.
 

I think this is one of the things that Steve Jobs understood so well; others may understand the concepts, but they cannot seem to duplicate the results.

I don't know, there's that communication aspect of using a computer. If the computer is doing something and doesn't respond to you, it gets really frustrating, because it's not communicating. I think that's one thing I really like about Smalltalk: you don't call functions, you send an object a message, and I find that to be a powerful distinction. Functions are rooted in math, and math is also a language, but not many people like to speak it. I feel like the programming languages of the future should be more like natural language than like math. If you really want to expand the user base of programming, I say you have to make it more like a natural language, but it still has to be precise, like a legal document. Factor in speech recognition, and I think this is the way to go.

Looking back, iOS 6 really was a major turning point for Apple. I kind of wonder if Swift would have been adopted had Scott Forstall not been ousted; I tend to think we would still be using ObjC, and my 4S would still be fast when running the newer versions of iOS. Now I look at screenshots of iOS and Android and it is difficult to tell the difference. iOS won't even tell me if something is a button anymore, and I find myself having to use Google to find out how to do simple things. It's frustrating, because for a while it seemed like every decision Apple made was the correct one, but now it's completely different. The Apple of today is quite different from the Apple of 5 years ago. I feel like abandoning ship and trying to recreate iOS 6 and Snow Leopard on Linux. I'm not sure if I am going to upgrade OS X to macOS; it seems to be getting worse and flaky, I get weird crashes and my computer stops responding at times, but resumes if I wait it out. I've been waiting forever to buy another MacBook Air to replace my Air from 2012; all Apple had to do was release the same exact laptop but with a retina display and I would have bought it without thinking about it, but it looks like they never will. It's just frustrating.
 

I wondered about iOS changing to a flat look. I kinda like the blurred stuff, but I like the 3D realism look more. It may be that Apple felt it had to do something. That's the problem when you come out with an awesome phone/OS: you then have to come out with a better one next year, and an even better one the year after.

For the first time in 13 years, Apple didn't grow. It gets to the point where the camera, battery life, OS, chip speed, etc. aren't enough to make you upgrade. Apple's business model is mostly based on people upgrading. Once that upgrade cycle slows, they have a problem.

I think part of Apple's plan was to push the iPhone over the iPod Touch and the Mini. I prefer the iPod Touch or Mini for development, because they become outdated and I don't need another phone. Apple shortchanges those, yet charges a premium price compared to other offerings.

Maybe some of the OS updates are just to do something different, good or bad, just make it new and different to get people to buy more and keep the ball rolling.
 
Sometimes the hardest thing to do is to do nothing. And sometimes that's the correct decision.

Perhaps they felt a need to create something they could call their own, so in that sense it had to be different. But I think they went way overboard. It became an emotional response; they weren't able to remove their own emotion from the equation. And people in positions of power generally aren't able to admit that they may have made a mistake until it's too late.

One thing I've learned about software, is that it doesn't always get better.
 