Was Objective-C really a hindrance to Apple software development? [on hold]
I have heard stories from some of the greybeards I have met on the Internet that Objective-C was by all accounts a nightmare to work with. Was that just a thing about all the low-level languages of its era or is there some sort of feature that made working with it hard?
Did Objective-C hamper software development for Apple software, or is this just the random experiences of someone on the Internet?
programming apple
put on hold as primarily opinion-based by Raffzahn, tofro, isanae, Brian Tompsett - 汤莱恩, pipe Apr 24 at 7:51
Many good questions generate some degree of opinion based on expert experience, but answers to this question will tend to be almost entirely based on opinions, rather than facts, references, or specific expertise. If this question can be reworded to fit the rules in the help center, please edit the question.
7
Whatever various people's anecdotes may claim, it's hard to argue with the timing evidence. Objective-C was introduced at Apple when Steve Jobs came back and started stuffing it down everyone's throats, and when it was announced that he had terminal cancer, Apple didn't even wait for him to be dead before they started working on a replacement for it! It's difficult to draw any other conclusion than that Obj-C was something that Jobs personally loved and most of the rest of the company hated.
– Mason Wheeler
Apr 23 at 15:30
4
Probably because Jobs wasn't a programmer :)
– dashnick
Apr 23 at 15:32
4
@MasonWheeler Hmm, looking at Swift, it is noteworthy that the Objective-C style of message-based dynamic linking was kept, while the C-style parts were dropped. Seems like Objective-C's merits did outlast Jobs' time on the planet.
– Raffzahn
Apr 23 at 15:59
4
@Raffzahn Yeah, they kind of had to keep support for the infrastructure that all the OS APIs were built on...
– Mason Wheeler
Apr 23 at 16:04
2
Just as a counter argument, the fact that development was not hindered by Obj-C is self-evident just by looking at the massive success of the App Store. After all, the great mass of Obj-C developers are NOT Apple employees; they are 3rd party iOS developers.
– Brian H
Apr 23 at 16:21
edited Apr 23 at 21:35 by Warren Young
asked Apr 23 at 12:07 by Neil Meyer
10 Answers
Swift was introduced only in mid-2014 so I think perhaps some of those people's beards have greyed out very rapidly! That aside, Objective-C attempts to fuse two different languages: Smalltalk and C. So it's a compiled language, like C, that for object types also supports dynamic dispatch and introspection, like Smalltalk.
It's actually a strict superset of C: all standard C libraries are directly callable from Objective-C, and many of the very central parts of Apple's libraries are written directly in C.
Object types are dynamic enough that you can look up available methods, declared properties, and their types at runtime, and access either by name if desired. This functionality is central to Apple's UI libraries: e.g. to tell a button which action to perform when pressed, you tell it the identity of the object it should call plus the name of the method, and the runtime does the necessary method routing. So things like the UI designer don't generate any code. There's no hidden mapping file full of comments warning that it was automatically generated and please don't edit.
There's at least one problem that stopped being a problem a long time ago: well into the OS X era, memory was managed manually — you were responsible for remembering to take a small number of idiomatic steps to ensure proper memory allocation and deallocation. But they were so idiomatic that little thought was really required, and indeed the compiler was able to assume responsibility for them circa 2010, with the introduction of automatic reference counting.
There were also problems of style: object syntax is almost LISP-esque in requiring matched pairs of outer brackets — square rather than round, but it still used to mean a lot of hopping back and forth on a line. This also improved a lot towards the end of Objective-C's primacy, as Apple started using Clang directly for in-IDE code processing, including predicting where to insert the opening bracket automatically.
But the main problem was this: at runtime, Objective-C provides duck typing. That is, you're free to pass any object type to any method of any other object type and things will just work if the code has been written to expect it. So e.g. there's only one array type, which can hold any mixed list of objects.
When the first versions of the framework were built for NeXTSTEP machines with only a few megabytes of RAM, that was a huge bonus for many of the complicated data types: e.g. there's also just one version of a dictionary, so having it be entirely typeless means having only one set of code in memory for all applications. Compare and contrast with a generics-based language like C++: each instantiation of a std::map has distinct code generated at compile time, specific to the types of the keys and values — faster and safer, but a greater burden on memory footprint.
There are 'lightweight generics' now in Objective-C that declare an intended type for each collection variable so that the compiler can provide probable-misuse warnings, but they're post-Swift and, honestly, primarily for its benefit — they help at the boundaries between the two languages, because the newer language prefers the safety of types.
Trying to draw this ramble back to a concrete point: I'd say that no, Objective-C was never much of a hindrance. It offers all of C plus a bunch of reflection that is useful for UI programming. There's also empirical evidence to back this up: the officially supported languages for building OS X applications from day one were Objective-C and Java. The latter was deprecated only a few years later, after market forces selected the former.
I think the language's major crime is its oddball syntax; it is also unfortunate that some of the problems that being typeless once solved are no longer problems, leaving typelessness undesirable as an axiomatic feature.
7
Oh, also from the anecdata pile: check out the feelings of Carmack et al. toward early-'90s-era Objective-C: overwhelmingly positive. The original Doom toolset, and the first version of the engine itself, were written within NeXTSTEP. The engine itself was then ported to DOS manually, rather than cross-compiled — I have no direct knowledge, but I'll wager it acquired some assembly sections.
– Tommy
Apr 23 at 14:25
4
... and further to the great-environment-that-history-moved-beyond meme: WorldWideWeb, Tim Berners-Lee's original browser/editor, was also a NeXTSTEP original.
– Tommy
Apr 23 at 14:28
3
NeXTSTEP offered an amazing development environment at the time. Anecdote: the "grey beards" at my Uni insisted on buying a lab full of Suns, but also allowed a solitary NeXT Cube. Guess which lab station students ended up competing for time on... I thought Obj-C was akin to most other "good" languages — easy to learn, hard to master.
– Brian H
Apr 23 at 14:49
2
"all standard C libraries are directly callable from Objective-C" - super useful, although the C++ side lacks
– Maury Markowitz
Apr 23 at 15:23
9
"1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic." -- James Iry, A Brief, Incomplete, and Mostly Wrong History of Programming Languages
– Mason Wheeler
Apr 23 at 15:27
Objective-C was by all accounts a nightmare to work with
I loved it. Loved it.
Some background: in the 90s I worked for a developer here in Toronto with a Mac and Win app. I wanted to work on the dev side, but I had no formal training, and I found the barrier to entry to be too high for my interest level. To do anything useful, you had to learn the OS, the IDE, the language and the library, each of which was some level of dismal. For instance, the text editor widget on the Mac couldn't handle more than 32k of text, and the various libraries just called it. If you wanted to edit more text, well, have fun!
In 1998 Apple sent me a copy of OpenStep, or as they called it, Rhapsody Preview. After some install issues (lack of drivers, had to replace the CDROM drive with one it knew) I had my first real program running in a day. Real program.
Because unlike the Mac or Win of that era, the OS was the library, and the library was f'ng amazing. Text editor? How about one that fully supported Unicode, was limited in length only by a 32-bit int, automatically paged data as needed (because that's how the whole system worked), did complex layout like columns and flowed text around graphics and such, and had a built-in spell checker. The entire library was like this: the base objects were super-powerful out of the box and tightly integrated with each other and the entire OS as a whole. I hate to use this word, but it had synergy that had to be used to be understood.
Contrast with, say, Win + MFC... gebus. It was like Lisp Machine vs. PDP-8. .Net helped, and C# is better than Obj-C (I'd say it's my favorite language), but it was decades before .Net got close to the OpenStep of the 90s, and even today its base objects still suck — why can't they get an array type right after 20 f'in years?! Every time I use it I end up wondering why some totally base object is missing some totally obvious feature, or why they have five objects to do the same thing, each with their own set of dumbness.
Obj-C was no worse than other languages, except perhaps in syntax (perhaps). It had two super-amazing advantages though. Extensions let you add code to existing compiled classes (so you could add spell checking to someone else's text editor, for instance), and the handling of nil was wonderful.
Swift... well, I like some things and don't like others. The whole class/struct thing they boast about is, to me, a crock. Yes, I know it's more efficient etc., but it really is much less flexible than just declaring a class and using it. I also hate hate hate putting the type after the declaration: int c=0 is simply easier to read than var c:Int=0, and int doSomething() is lightyears better than func doSomething() -> Int. Bah! Swift also lost the wonderful nil handling, and I can't for the life of me see an upside — everyone just puts ! on everything.
Overall, yes, Swift is an improvement. But Obj-C was pretty great too. At least in the 90s. It collected a LOT of cruft when it moved to Mac/iOS, and much of that was ugly and just totally bolted-on. So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer.
7
"So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer." — I think this is the crux of the issue. You'll get wildly varying answers depending on what timeframe people think of.
– Ruther Rendommeleigh
Apr 23 at 16:07
1
It's a little sad that NeXTSTEP was already 10+ years old by the time it started to go mainstream via OS X. I think a little longer and that Window would have been closed forever [pun intended].
– Brian H
Apr 23 at 16:13
4
I loved Obj-C, but to be fair, I love Swift too, and anyone who "puts ! on everything" is doing it completely wrong.
– par
Apr 23 at 18:03
2
In defence of Swift: you mention stating the type as var c:Int=0; in this case, you can do without the type completely and infer it by doing var c = 0. This is always the case when you declare & assign a value to a variable in a single line. Also, as @par mentioned, abuse of ! is a sign of poor coding standards, and it is definitely not the case that "everyone puts ! on everything", especially in enterprise code.
– Ferdz
Apr 23 at 20:09
See softwareengineering.stackexchange.com/questions/316217/… for reasoning behind putting the type last in languages like Swift. Go's reasoning is also very compelling, especially when talking about function types: blog.golang.org/gos-declaration-syntax
– Logan Pickup
Apr 23 at 21:34
It's a very subjective matter. Programming languages and programmers need to pair up: some programming languages are more suited to the way a programmer is thinking than others. So if a developer is working with a language that seems to get in their way, they surely do not like it.
I for one liked Objective-C when I started working with it back in 2007 (I already had almost 20 years of programming experience in various languages at that time). I still like it. Even back then, it had a lot of nice features and pretty consistent APIs, but its syntax is unusual within the C family of languages.
It was a hindrance insofar as Objective-C is almost solely used for iOS and macOS development, so you are unlikely to come across it when working with other OSs. This limits the number of people who have experience with it, and thus the available resources like documentation and source code, when compared to, say, Java, which is available everywhere. At the same time, this also leads to the advantage of providing a consistent experience for all developers who worked with Objective-C.
Almost equally important are the APIs (building blocks) provided to the programming language. The ones provided by Apple were pretty consistent even back then (with a few dark, dirty corners here and there) and have (mostly) improved; the need to coexist with Swift has helped in this regard. And like the programming language itself, if an API gets in the way of what a programmer is doing, they don't enjoy it. The APIs provided by Apple are very verbose, and some names can become very long. Some people love that, some people hate it.
Did Objective-C hamper software development for Apple software, or is this just the random experiences of someone on the Internet?
Do you really expect an objective answer here? Languages are a matter of heart and opinion, not really anything factual. Even more so when asking about the truth of an opinion like the one mentioned.
A short comparison might be useful
The main difference lies in the goal for which C got extended. Both C++ and Objective-C are meant to speed up execution compared with prior, more 'pure' OOP languages by using the rather simple static compile structure of C and extending it with OOP features.
Objective-C focuses on being a language extension, implementing ways for dynamic object and code handling (reflection) while keeping it a compiled language with a minimal runtime. It's generally geared toward making runtime decisions about linkage and message passing.
C++ replaces C by moving toward the use of standard classes. C++ targets completely static code: all decisions about linkage and message passing are made at compile time. Metaprogramming with templates tries to overcome this to some extent.
It can be said that Objective-C is a more basic and thought-through attempt on the language side, while C++ adds many features in less than coherent ways, inviting feature creep through the standard library in an often incompatible manner.
In general, Objective-C may be preferable for larger and more dynamic projects, especially if they are in use over a long time and across many instances. C++ has its merits when it's about closed projects and a small footprint.
So what about the 'hampering'?
Not really. C++ allows for much code mangling to get along on prior C knowledge plus acceptance of some aspects as helpful, whereas Objective-C requires truly switching to a cleaner OOP design.
Preferences may come down to the willingness of programmers to learn new ways or just muddle through — the latter are of course less expensive to hire and more readily available.
2
You should not answer a question while also voting to close. Why would you want to prevent everyone else from answering while sneaking in an answer yourself before it is closed?
– pipe
Apr 24 at 7:53
@pipe Do you really think this is about personal games, 'preventing others' and 'sneaking in' (3 hours later)? Seriously, such language? Do you think this is some ego game? Do a reality check. For example, by looking at the reasoning to close and the 'answer' I wrote, you may notice the consistent nature. It doesn't try to answer but adds background information. The question asks for opinion, something explicitly off-topic for RC.SE. It led exactly to a flood of less-than-welcome answers. RC.SE is not a web forum for lengthy chat. So stop trying to play politics and start to care for the site.
– Raffzahn
Apr 24 at 8:05
I do care about the site. That's why I voted to close the question, to prevent people from filling it up with answers — that's the whole point of closing a question. Yet here you are, adding yet another answer to the flood while still agreeing that it's a bad thing to do.
– pipe
Apr 24 at 8:07
1
@pipe Sorry to come back again, but the whole construct presented is a seriously twisted idea. I'm still puzzled about how twisted one has to be to come up with a concept of voting to close just for the purpose of having one's own answer 'protected'. Besides being an incredibly derogative assumption, it is also quite faulty: wouldn't a successful vote to close also invalidate every answer, including the one 'sneaked in'? Doing so would defy all logic, wouldn't it?
– Raffzahn
Apr 24 at 8:25
2
If you vote to close a question, it surely means you think it is not suitable for the site. Why, therefore, do you also answer it? I'm with @pipe, although I do not believe you had any malicious intent.
– JeremyP
Apr 24 at 10:16
Another thing worth noting (as a developer but also a programming instructor at the college and high school levels): Objective-C is INCREDIBLY simple to learn for people new to programming. I have also taught Python and more recently Swift, and despite my increased experience, Objective-C seems to be what new programmers pick up most rapidly. My guess is that this is because the language is quite verbose and intersperses arguments with the function name, so that function calls become more like sentences and are more relatable. By the same token, it is very different from, say, going from Java to C#, so people who already know one of the canonical languages can struggle, because it doesn't look the way that it "should".
As for ease of use, I would say that it is relatively easy; in most cases, when you make a language simpler you also make it easier to write very poor quality software, which can be a nightmare. This is best known from JavaScript and other weakly-typed languages, but Objective-C does provide more freedom than Java (and exposes abilities that are present but complicated to use in C), so it is possible for code to be of lower quality.
For my personal bias I really like Objective-C, but I understand how some people can hate it.
Seems like a very opinion-oriented question, but as someone who's programmed in a lot of different environments (including Objective-C)... IMO, Objective-C could indeed qualify as a nightmare when compared to, well, virtually anything else. Personally, it's like the worst parts of C and the worst parts of LISP combined, and I truly wish that they had gone with something else, really anything else... :-)
5
I would be really interested in what exactly you thought were the worst parts of C and LISP that they combined.
– Neil Meyer
Apr 23 at 13:35
How much time did you spend in Objective-C?
– Ed Plunkett
Apr 23 at 15:22
3
I've only done a little Objective-C (and all of it within the past few months) and I've found it to be decent enough.
– Lightness Races in Orbit
Apr 23 at 15:32
3
I'm not sure how it could be "the worst parts of C" since it's technically all the parts of C (i.e., a strict superset, and some Smalltalk style messaging). So, an equally valid way to put it is that it's "like the best parts of C." As best as I can tell, you don't care for some aspects of the syntax. If you truly feel like "anything" else would've been better, it seems like either your experience with Objective-C (and C) isn't particularly deep or your experience with other languages isn't particularly broad. (I sincerely don't mean that to be an insult).
– D. Patrick
Apr 23 at 16:57
It should be fairly evident that Objective-C has not hindered the growth of software in Apple's "ecosystem". For this, you only need to look at the success of the App Store.
Recall that iOS (originally, just OS X for the iPhone) started off as a closed development environment. The only official apps for iPhone were those internally developed by Apple. Of course, they were developed using the Cocoa Framework and Objective-C language brought over from OS X. A full year after the iPhone release, the App Store opened a floodgate of new developers adopting Cocoa and Objective-C. From Wikipedia:
The App Store was opened on July 10, 2008, with an initial 500 applications available. As of 2017, the store features over 2.1 million apps.
So, regardless of any developer's personal feelings on whether it is a nice development experience, or whether the language has serious shortcomings, the objective evidence proves that software was produced on a grand scale using this platform.
I've been programming professionally for 30 years and have worked with plenty of languages, and I hated, HATED, HATED Objective-C. It never made any sense to me. I tried to figure it out, but whenever I thought I had it, I didn't. Finally I gave up and moved on to something else. So, was Objective-C really a hindrance to Apple software development? Yes, it was. It certainly was for me.
A hindrance is not necessarily a barrier, however. The availability of other tools for doing iOS development, particularly with C++, has made learning Objective-C unnecessary. But I do believe that plenty of developers were scared off by Objective-C and never even investigated alternatives.
add a comment |
It's an excellent and extremely powerful language. The syntax needs a bit of time to get used to, but after a week or so you should have no problems whatsoever.
Named arguments are the best innovation of Objective-C. Many things that are awkward in C++, because a function call is not self-documenting, simply go away in Objective-C. There are observers built into the language: any property can be observed, that is, arbitrary code can say "I want to be notified when this property changes", which is great for achieving the weakest possible coupling between pieces of code. There are protocols (interfaces), so you are not restricted to subclassing. There are closures (blocks). And there are class extensions (categories): if you ever wished you could extend std::string (add methods to it, not subclass it), you can do exactly that in Objective-C.
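Two of those features can be sketched briefly. The following is a minimal illustration assuming only Foundation; the reversedString method is made up for the example, not part of any Apple API:

```objc
#import <Foundation/Foundation.h>

// A category: add a method to the existing, already-compiled NSString class.
@interface NSString (Reversal)
- (NSString *)reversedString;
@end

@implementation NSString (Reversal)
- (NSString *)reversedString {
    NSMutableString *out = [NSMutableString stringWithCapacity:self.length];
    for (NSInteger i = (NSInteger)self.length - 1; i >= 0; i--) {
        [out appendFormat:@"%C", [self characterAtIndex:(NSUInteger)i]];
    }
    return out;
}
@end

int main(void) {
    @autoreleasepool {
        // Named arguments: each parameter is labelled at the call site,
        // so the call documents itself.
        NSString *s = [@"hello" stringByReplacingOccurrencesOfString:@"h"
                                                          withString:@"j"];
        NSLog(@"%@", s);                   // jello
        NSLog(@"%@", [s reversedString]);  // ollej
    }
    return 0;
}
```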
It's an excellent language. Swift is better, after a non-trivial learning curve, but Swift had the benefit of 20-30 years' more language-design experience.
add a comment |
In my experience, the language itself is no more difficult to learn than any other language. Yes, it has a quirky syntax that many find unfamiliar, but it is not difficult to understand.
The system libraries for OS X and iOS, on the other hand, are like the menu at the Cheesecake Factory: very large and full of things you will never consume.
add a comment |
10 Answers
Swift was introduced only in mid-2014, so I think perhaps some of those people's beards greyed very rapidly! That aside, Objective-C attempts to fuse two different languages: Smalltalk and C. So it's a compiled language, like C, that for object types also supports dynamic dispatch and introspection, like Smalltalk.
It's actually a strict superset of C: all standard C libraries are directly callable from Objective-C, and many of the most central parts of Apple's libraries are written directly in C.
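Since any valid C translation unit is also valid Objective-C, plain C calls mix freely with messaging; a minimal sketch (the file is simply compiled as Objective-C):

```objc
#import <Foundation/Foundation.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    @autoreleasepool {
        // Ordinary C: string literal, strlen, printf.
        const char *c = "hello";
        // Objective-C object built from the C string, then queried.
        NSString *s = [NSString stringWithUTF8String:c];
        printf("%zu %lu\n", strlen(c), (unsigned long)s.length);  // 5 5
    }
    return 0;
}
```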
Object types are dynamic enough that you can look up the available methods and their types, and the declared properties and their types, and access either by name at runtime if desired. This functionality is central to Apple's UI libraries: e.g. to tell a button which action to perform when pressed, you give it the identity of the object it should call plus the name of the method, and the runtime does the necessary routing. So tools like the UI designer don't generate any code. There's no hidden mapping file full of comments warning that it was automatically generated and please don't edit.
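That name-based routing can be sketched without any UI framework; Greeter and sayHello below are hypothetical names for illustration:

```objc
#import <Foundation/Foundation.h>

@interface Greeter : NSObject
- (void)sayHello;
@end

@implementation Greeter
- (void)sayHello { NSLog(@"hello"); }
@end

int main(void) {
    @autoreleasepool {
        Greeter *g = [[Greeter alloc] init];
        // Build a selector from a plain string at runtime; this is
        // essentially what target/action wiring does under the hood.
        SEL action = NSSelectorFromString(@"sayHello");
        if ([g respondsToSelector:action]) {
            [g performSelector:action];  // ARC warns about unknown
        }                                // selectors here, but it runs
    }
    return 0;
}
```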
There's at least one problem that stopped being a problem a long time ago: well into the OS X era, memory was managed manually. You were responsible for remembering a small number of idiomatic steps to ensure proper allocation and deallocation. But the steps were so idiomatic that little thought was really required, so much so that the compiler was able to assume responsibility for them (with ARC, circa 2010).
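The idiomatic steps in question were a handful of ownership rules; a pre-ARC fragment (not a complete program), with the modern equivalent noted:

```objc
// Pre-ARC idiom: whoever alloc/copy/retains an object must release it.
NSString *s = [[NSString alloc] initWithFormat:@"%d", 42];
// ... use s ...
[s release];   // balance the alloc

// Convenience constructors return autoreleased objects; no release needed:
NSString *t = [NSString stringWithFormat:@"%d", 42];

// Under ARC (clang -fobjc-arc), the compiler inserts the retain/release
// calls itself, and writing them by hand is a compile error.
```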
There were also problems of style: object syntax is almost Lisp-esque in requiring matched pairs of outer brackets (square rather than round), which used to mean a lot of hopping back and forth on a line. This also improved a lot towards the end of Objective-C's primacy, as Apple started using Clang directly for in-IDE code processing, including automatically predicting where to insert the opening bracket.
But the main problem was this: at runtime, Objective-C provides duck typing. That is, you're free to pass any object type to any method of any other object type, and things will just work provided the code was written to expect it. So, e.g., there's only one array type, which can hold any mixed list of objects.
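A minimal sketch of that single, typeless array:

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // One array type; the elements need not share a class.
        NSArray *mixed = @[@"forty-two", @42, [NSDate date]];
        for (id item in mixed) {
            // Each object answers for itself at runtime.
            NSLog(@"%@ -> %@", [item class], [item description]);
        }
    }
    return 0;
}
```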
When the first versions of the framework were built, for NeXTSTEP machines with a handful of megabytes of RAM, that was a huge bonus for the complicated data types: there's also just one version of a dictionary, so having it be entirely typeless means having only one copy of its code in memory for all applications. Compare and contrast with a generics-based language like C++, where each instantiation of a std::map gets distinct code generated at compile time, specific to the types of the keys and values: faster and safer, but a greater burden on memory footprint.
There are 'lightweight generics' now in Objective-C that declare an intended element type for each collection variable so that the compiler can provide probable-misuse warnings, but they're post-Swift and, honestly, primarily for Swift's benefit: they help at the boundary between the two languages, because the newer language prefers the safety of types.
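A fragment showing the annotation; the type parameter is advisory, checked at compile time but erased at runtime:

```objc
// Lightweight generics: the element type is a hint for the compiler
// (and for Swift bridging), not enforced by the runtime.
NSArray<NSString *> *names = @[@"Ada", @"Grace"];

// Adding a non-string draws a compiler warning, not an error:
// NSArray *more = [names arrayByAddingObject:@42];
```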
Trying to draw this ramble back to a concrete point: I'd say that no, Objective-C was never much of a hindrance. It offers all of C plus a bunch of reflection that is useful for UI programming. There's also empirical evidence to back this up: the officially-supported languages for building OS X applications from day one were Objective-C and Java. The latter was deprecated only a few years later, after market forces selected the former.
I think the language's major crime is its oddball syntax; it is also unfortunate that some of the problems typelessness solved are no longer problems, which makes it undesirable as an axiomatic feature.
7
Oh, also from the anecdata pile: check out the feelings of Carmack et al towards early-'90s Objective-C: overwhelmingly positive. The original Doom toolset, and the first version of the engine itself, were written on NeXTSTEP. The engine was then ported to DOS manually rather than cross-compiled; I have no direct knowledge, but I'll wager it acquired some assembly sections.
– Tommy
Apr 23 at 14:25
4
... and further to the great-environment-that-history-moved-beyond meme: WorldWideWeb, Tim Berners-Lee's original browser/editor, was also a NeXTSTEP original.
– Tommy
Apr 23 at 14:28
3
NeXTSTEP offered an amazing development environment at the time. Anecdote: the "grey beards" at my uni insisted on buying a lab full of Suns, but also allowed a solitary NeXT Cube. Guess which lab station students ended up competing for time on... I thought Obj-C was akin to most other "good" languages: easy to learn, hard to master.
– Brian H
Apr 23 at 14:49
2
"all standard C libraries are directly callable from Objective-C" - super useful, although the C++ side lacks
– Maury Markowitz
Apr 23 at 15:23
9
"1986 - Brad Cox and Tom Love create Objective-C, announcing "this language has all the memory safety of C combined with all the blazing speed of Smalltalk." Modern historians suspect the two were dyslexic." -- James Iry, A Brief, Incomplete, and Mostly Wrong History of Programming Languages
– Mason Wheeler
Apr 23 at 15:27
show 2 more comments
answered Apr 23 at 13:26 by Tommy
Objective-C was by all accounts a nightmare to work with
I loved it. Loved it.
Some background: in the 90s I worked for a developer here in Toronto with a Mac and a Windows app. I wanted to move to the dev side, but I had no formal training, and I found the barrier to entry too high for my interest level. To do anything useful you had to learn the OS, the IDE, the language and the library, each of which was some level of dismal. For instance, the text editor widget on the Mac couldn't handle more than 32k of text, and the various libraries just called into it. If you wanted to edit more text, well, have fun!
In 1998 Apple sent me a copy of OpenStep, or as they called it, Rhapsody Preview. After some install issues (lack of drivers; I had to replace the CD-ROM drive with one it knew), I had my first real program running in a day. A real program.
Because unlike the Mac or Windows of that era, the OS was the library, and the library was f'ng amazing. Text editor? How about one that fully supported Unicode, was limited in length only by a 32-bit int, automatically paged data as needed (because that's how the whole system worked), did complex layout like columns and text flowing around graphics, and had a built-in spell checker. The entire library was like this: the base objects were super-powerful out of the box and tightly integrated with each other and with the OS as a whole. I hate to use this word, but it had a synergy that had to be used to be understood.
Contrast with, say, Win + MFC... gebus. It was like a Lisp Machine vs. a PDP-8. .NET helped, and C# is better than Obj-C (I'd say it's my favorite language), but it was decades before .NET got close to the OpenStep of the 90s, and even today its base objects still suck. Why can't they get an array type right after 20 f'in years?! Every time I use it, I end up wondering why some totally basic object is missing some totally obvious feature, or why there are five objects that do the same thing, each with its own set of dumbness.
Obj-C was no worse than other languages, except perhaps in syntax (perhaps). It had two super-amazing advantages, though. Extensions (categories) let you add code to existing compiled objects, so you could, for instance, add spell checking to someone else's text editor. And the handling of nil was wonderful.
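The nil handling in question: messaging nil is a safe no-op that returns zero or nil, so chains of calls need no step-by-step null checks. A fragment:

```objc
NSString *s = nil;

// Messaging nil is legal: the call does nothing and returns zero/nil.
NSUInteger n = [s length];   // n == 0, no crash

// Chains propagate nil quietly instead of throwing:
NSString *u = [[s uppercaseString] stringByAppendingString:@"!"];  // nil
```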
Swift... well, I like some things and don't like others. The whole class/struct distinction they boast about is, to me, a crock. Yes, I know it's more efficient, etc., but it really is much less flexible than just declaring a class and using it. I also hate, hate, hate putting the type after the declaration: int c = 0 is simply easier to read than var c: Int = 0, and int doSomething() is light-years better than func doSomething() -> Int. Bah! Swift also lost the wonderful nil handling, and I can't for the life of me see an upside: everyone just puts ! on everything.
Overall, yes, Swift is an improvement. But Obj-C was pretty great too, at least in the 90s. It collected a LOT of cruft when it moved to Mac/iOS, and much of that was ugly and just bolted on. So early Obj-C and Swift were pretty similar in ease of use, IMHO, while late Obj-C was indeed getting to be a real downer.
7
"So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer." - I think this is the crux of the issue. You'll get wildly varying answers depending on what timeframe poeple think of.
– Ruther Rendommeleigh
Apr 23 at 16:07
1
It's a little sad that NeXTSTEP was already 10+ years old by the time it went mainstream via OS X. A little longer, I think, and that Window would have been closed forever [pun intended].
– Brian H
Apr 23 at 16:13
4
I loved Obj-C, but to be fair, I love Swift too, and anyone who "puts ! on everything" is doing it completely wrong.
– par
Apr 23 at 18:03
2
In defence of Swift: you mention stating the type as var c: Int = 0. In this case, you can drop the type completely and let it be inferred by writing var c = 0. This is always possible when you declare and assign a variable on a single line. Also, as @par mentioned, abuse of ! is a sign of poor coding standards; it is definitely not the case that "everyone puts ! on everything", especially in enterprise code.
– Ferdz
Apr 23 at 20:09
See softwareengineering.stackexchange.com/questions/316217/… for the reasoning behind putting the type last in languages like Swift. Go's reasoning is also very compelling, especially for function types: blog.golang.org/gos-declaration-syntax
– Logan Pickup
Apr 23 at 21:34
show 2 more comments
Objective-C was by all accounts a nightmare to work with
I loved it. Loved it.
Some background: in the 90s I worked for a developer here in Toronto with a Mac and Win app. I wanted to work on the dev side but I had no formal training, and I found the barrier to entry to be too high for my interest level. To do anything useful, you had to learn the OS, the IDE, the language and the library, each of which was some level of dismal. For instance, the text editor widget on the Mac couldn't handle 32k, and the various libraries just called it. If you wanted to edit more text, well, have fun!
In 1998 Apple sent me a copy of OpenStep, or as they called it, Rhapsody Preview. After some install issues (lack of drivers, had to replace the CDROM drive with one it knew) I had my first real program running in a day. Real program.
Because unlike the Mac or Win of that era, the OS was the library, and the library was f'ng amazing. Text editor? How about one that fully supported Unicode, was limited only to 32-bit int in length, automatically paged data as needed (because that how the whole system worked), did complex layout like columns and flowed around graphics and such, and had a built-in spell checker. The entire library was like this, the base objects were super-powerful out of the box and tightly integrated with each other and the entire OS as a whole. I hate to use this word, but it had synergy that had to be used to understand.
Contrast with, say, Win + MFC... gebus. It was like Lisp Machine vs. PDP-8. .Net helped, and C# is better than Obj-C (I'd say it's my favorite language), but it was decades before .Net got close to OpenStep of the 90s, and even today its base objects still suck - why can't the get an array type right after 20 f'in years?! Every time I use it I end up wondering why some totally base object is missing some totally obvious feature, or why they have five objects to do the same thing, each with their own set of dumbness.
Obj-C was no worse than other languages, except perhaps in syntax (perhaps). It had two super-amazing advantages though. Categories (the feature Swift later recast as extensions) let you add methods to existing compiled classes, so you could add spell checking to someone else's text editor, for instance. And the handling of nil was wonderful: messaging nil is a safe no-op that returns nil/zero instead of crashing.
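Swift did keep the category idea, for what it's worth. A minimal sketch of adding a method to a type you don't own (the `wordCount` property is a made-up example, not a system API):

```swift
// Hypothetical example: bolting a word count onto String,
// without subclassing and without access to String's source.
extension String {
    var wordCount: Int {
        // Split on any whitespace; split drops empty fragments.
        split(whereSeparator: { $0.isWhitespace }).count
    }
}

print("the quick brown fox".wordCount)  // 4
```

The Obj-C version went further, of course: a category could be loaded into someone else's already-compiled app, which is how system-wide spell checking worked.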
Swift... well, I like some things and don't like others. The whole class/struct thing they boast about is, to me, a crock. Yes, I know it's more efficient, etc., but it really is much less flexible than just declaring a class and using it. I also hate, hate, hate putting the type after the declaration: `int c = 0` is simply easier to read than `var c: Int = 0`, and `int doSomething()` is light-years better than `func doSomething() -> Int`. Bah! Swift also lost the wonderful nil handling, and I can't for the life of me see an upside; everyone just puts `!` on everything.
Overall, yes, Swift is an improvement. But Obj-C was pretty great too. At least in the 90s. It collected a LOT of cruft when it moved to Mac/iOS, and much of that was ugly and just totally bolted-on. So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer.
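For concreteness, the Swift side of that comparison looks roughly like this (a minimal sketch; the values are illustrative):

```swift
// Trailing type annotations; the annotation can be dropped entirely
// when a value is assigned, because the type is inferred.
var c = 0                      // inferred as Int; same as: var c: Int = 0

func doSomething() -> Int {    // return type trails the declaration
    return 42
}

// Swift's replacement for Obj-C's silent nil: optionals must be unwrapped.
let parsed = Int("41")         // Int?; nil on bad input instead of a crash
let next = (parsed ?? 0) + 1   // nil-coalescing avoids the force-unwrap "!"
print(c, doSomething(), next)  // 0 42 42
```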
7
"So early Obj-C and Swift were pretty similar in ease-of-use IMHO, while late Obj-C was indeed getting to be a real downer." - I think this is the crux of the issue. You'll get wildly varying answers depending on what timeframe people think of.
– Ruther Rendommeleigh
Apr 23 at 16:07
1
It's a little sad that NeXTStep was already 10+ years old by the time it started to go mainstream via OS X. I think a little longer and that Window would have been closed forever [pun intended].
– Brian H
Apr 23 at 16:13
4
I loved Obj-C, but to be fair, I love Swift too, and anyone who "puts `!` on everything" is doing it completely wrong.
– par
Apr 23 at 18:03
2
In defence of Swift: you mention stating the type as `var c: Int = 0`; in this case you can drop the type completely and have it inferred by writing `var c = 0`. This is always the case when you declare and assign a value to a variable in a single line. Also, as @par mentioned, abuse of `!` is a sign of poor coding standards, and it's definitely not the case that "everyone puts `!` on everything", especially in enterprise code
– Ferdz
Apr 23 at 20:09
See softwareengineering.stackexchange.com/questions/316217/… for reasoning behind putting the type last in languages like Swift. Go's reasoning is also very compelling, especially when talking about function types: blog.golang.org/gos-declaration-syntax
– Logan Pickup
Apr 23 at 21:34
answered Apr 23 at 15:22
Maury Markowitz
It's a very subjective matter. Programming languages and programmers need to pair up: some languages are more suited to the way a given programmer thinks than others. So if a developer is working with a language that seems to get in their way, they surely won't like it.
I for one liked Objective-C when I started working with it back in 2007 (I already had almost 20 years of programming experience in various languages at that time). I still like it. Even back then it had a lot of nice features and pretty consistent APIs, but its syntax is unusual within the C family of languages.
It was a hindrance insofar as Objective-C is almost solely used for iOS and macOS development, so you are unlikely to come across it when working with other OSes. This limits the number of people who have experience with it, and thus the available resources like documentation and source code, compared to, say, Java, which is available everywhere. At the same time this also has an advantage: a consistent experience for all developers who worked with Objective-C.
Almost equally important are the APIs (building blocks) available to a programming language. The ones provided by Apple were pretty consistent even back then (with a few dark, dirty corners here and there) and have mostly improved; the need to coexist with Swift has helped in this regard. And like the programming language itself, if an API gets in the way of what a programmer is doing, they won't enjoy it. The APIs provided by Apple are very verbose, and some names can become very long. Some people love that, some people hate it.
edited Apr 23 at 15:38
answered Apr 23 at 15:25
DarkDust
"Did Objective-C hamper software development for Apple software, or is this just the random experience of someone on the internet?"
Do you really expect an objective answer here? Languages are a matter of heart and opinion, not really anything factual. Even more so when asking about the truth of an opinion like the one mentioned.
A short comparison might be useful
The main difference is the goal for which C got extended. Both C++ and Objective-C are meant to speed up execution compared with prior, more 'pure' OOP languages by using the rather simple static compile structure of C and extending it with OOP features.
Objective-C focuses on being a language extension, implementing ways for dynamic object and code handling (reflection) while remaining a compiled language with a minimal runtime. It is generally geared toward making run-time decisions about linkage and message passing.
C++ replaces C by moving toward the use of standard classes. C++ targets completely static code: all decisions about linkage and message passing are made at compile time. Metaprogramming with templates tries to overcome this to some extent.
It can be said that Objective-C is a more basic and thought-through attempt on the language side, while C++ adds many features in less-than-coherent ways, inviting feature creep through the standard library, often in incompatible ways.
In general, Objective-C may be preferable for larger and more dynamic projects, especially if they are in use over a long time and in many installations. C++ has its merits when it's about closed projects and a small footprint.
So what about the 'hampering'?
Not really. C++ allows a lot of code mangling to get along with prior C knowledge, plus accepting some aspects as helpful, whereas Objective-C requires truly switching to a cleaner OOP design.
Preferences may come down to the willingness of programmers to learn new ways or just mangle through; the latter are of course less expensive to hire and more readily available.
2
You should not answer a question while also voting to close. Why would you want to prevent everyone else from answering while sneaking in an answer yourself before it is closed?
– pipe
Apr 24 at 7:53
@pipe Do you really think this is about personal games and 'preventing others' and 'sneaking in' (3 hours later)? Seriously, such language? Do you think this is some ego game? Do a reality check: for example, by looking at the reasoning to close and the 'answer' I wrote, you may notice they are consistent. It doesn't try to answer, but adds background information. The question asks for opinion, something explicitly off-topic for RC.SE, and it led, exactly as expected, to a flood of less-than-welcome answers. RC.SE is not a web forum for lengthy chat. So stop trying to play politics and start to care for the site
– Raffzahn
Apr 24 at 8:05
I do care about the site. That's why I voted to close the question to prevent people from filling it up with answers - that's the whole point of closing a question. Yet here you are, adding yet another answer to the flood while still agreeing that it's a bad thing to do.
– pipe
Apr 24 at 8:07
1
@pipe Sorry to come back again, but the whole construct presented is a seriously twisted idea. I'm still puzzled about how twisted one has to be to come up with the concept of voting to close just for the purpose of having one's own answer 'protected'. Besides being an incredibly derogative assumption, it is also quite faulty: wouldn't a successful vote to close also invalidate every answer, including the one 'sneaked in'? Doing so would defy all logic, wouldn't it?
– Raffzahn
Apr 24 at 8:25
2
If you vote to close a question, it surely means you think it is not suitable for the site. Why, therefore, do you also answer it? I'm with @pipe, although I do not believe you had any malicious intent.
– JeremyP
Apr 24 at 10:16
answered Apr 23 at 15:57 – Raffzahn
You should not answer a question while also voting to close. Why would you want to prevent everyone else from answering while sneaking in an answer yourself before it is closed?
– pipe
Apr 24 at 7:53
@pipe Do you really think this is about personal games and 'preventing others' and 'sneaking in' (3 hours later)? Seriously, such language? Do you think this is some ego game? Do a reality check: for example, by looking at the reasoning to close and the 'answer' I wrote, you may notice the consistent nature. It doesn't try to answer but adds background information. The question asks for opinion, something explicitly off-topic for RC.SE. It led to exactly the flood of less-than-welcome answers. RC.SE is not a web forum for lengthy chat. So stop trying to play politics and start to care for the site.
– Raffzahn
Apr 24 at 8:05
I do care about the site. That's why I voted to close the question, to prevent people from filling it up with answers - that's the whole point of closing a question. Yet here you are, adding yet another answer to the flood while still agreeing that it's a bad thing to do.
– pipe
Apr 24 at 8:07
@pipe Sorry to come back again, but the whole construct presented is a seriously twisted idea. I'm still puzzled about how twisted one has to be to come up with the concept of voting to close just for the purpose of having one's own answer 'protected'. Besides being an incredibly derogatory assumption, it is also quite faulty: wouldn't a successful vote to close also invalidate every answer, including the one 'sneaked in'? Doing so would defy all logic, wouldn't it?
– Raffzahn
Apr 24 at 8:25
If you vote to close a question, it surely means you think it is not suitable for the site. Why, therefore, do you also answer it? I'm with @pipe, although I do not believe you had any malicious intent.
– JeremyP
Apr 24 at 10:16
Another thing worth noting (as a developer, but also a programming instructor at the college and high school levels): Objective-C is incredibly simple to learn for people new to programming. I have also taught Python and, more recently, Swift, and despite my increased experience, Objective-C seems to be what new programmers pick up most rapidly. My guess is that this is because the language is quite verbose and intersperses arguments with the function name, so that function calls read more like sentences and are more relatable. At the same time, moving to it is very different from, say, going from Java to C#, so people who already know one of the canonical languages can struggle, because it doesn't look the way it "should".
As for ease of use, I would say it is relatively easy; in most cases, when you make a language simpler you also make it easier to write very poor-quality software, which can be a nightmare. This is best known from JavaScript and other weakly-typed languages, but Objective-C does provide more freedom than Java (and exposes abilities that are present but complicated to use in C), so it is possible for code to be of lower quality.
As for my personal bias, I really like Objective-C, but I understand how some people can hate it.
answered Apr 23 at 22:54 – Christophe
This seems like a very opinion-oriented question, but as someone who has programmed in a lot of different environments (including Objective-C)... IMO, Objective-C could indeed qualify as a nightmare when compared to, well, virtually anything else. Personally, it's like the worst parts of C and the worst parts of LISP combined, and I truly wish they had gone with something else, really anything else... :-)
I would be really interested in what exactly you thought were the worst parts of C and LISP that they combined.
– Neil Meyer
Apr 23 at 13:35
How much time did you spend in Objective-C?
– Ed Plunkett
Apr 23 at 15:22
I've only done a little Objective-C (and all of it within the past few months) and I've found it to be decent enough.
– Lightness Races in Orbit
Apr 23 at 15:32
I'm not sure how it could be "the worst parts of C" since it's technically all the parts of C (i.e., a strict superset, and some Smalltalk style messaging). So, an equally valid way to put it is that it's "like the best parts of C." As best as I can tell, you don't care for some aspects of the syntax. If you truly feel like "anything" else would've been better, it seems like either your experience with Objective-C (and C) isn't particularly deep or your experience with other languages isn't particularly broad. (I sincerely don't mean that to be an insult).
– D. Patrick
Apr 23 at 16:57
answered Apr 23 at 13:26 – Brian Knoblauch
It should be fairly evident that Objective-C has not hindered the growth of software in Apple's "ecosystem". For this, you only need to look at the success of the App Store.
Recall that iOS (originally, just OS X for the iPhone) started off as a closed development environment. The only official apps for iPhone were those internally developed by Apple. Of course, they were developed using the Cocoa Framework and Objective-C language brought over from OS X. A full year after the iPhone release, the App Store opened a floodgate of new developers adopting Cocoa and Objective-C. From Wikipedia:
The App Store was opened on July 10, 2008, with an initial 500 applications available. As of 2017, the store features over 2.1 million apps.
So, regardless of any developer's personal feelings on whether it is a nice development experience, or whether the language has serious shortcomings, the objective evidence shows that software was produced on a grand scale using this platform.
answered Apr 23 at 16:28 – Brian H
I've been programming professionally for 30 years and have worked with plenty of languages, and I hated, HATED, HATED Objective-C. It never made any sense to me. I tried to figure it out, but whenever I thought I had it, I didn't. Finally I gave up and moved on to something else. So, was Objective-C really a hindrance to Apple software development? Yes, it was. It certainly was for me.
A hindrance is not necessarily a barrier, however. The availability of other tools for doing iOS development, particularly with C++, has made learning Objective-C unnecessary. But I do believe that plenty of developers were scared off by Objective-C and never even investigated alternatives.
answered Apr 23 at 22:11 – Mohair
It's an excellent and extremely powerful language. The syntax needs a bit of time to get used to, but after a week or so you should have no problems whatsoever.
Named arguments are the best innovation of Objective-C. Lots of things that are bad in C++ because a function call is not self-documenting go away in Objective-C. There are observers built into the language: any property can be observed, that is, arbitrary code can say "I want to be notified when this property changes", which is great for having the weakest possible coupling between code. There are interfaces, so you are not restricted to subclassing. There are closures. There are class extensions: if you ever wished you could extend std::string (add methods to it, not subclass it), you can do that in Objective-C.
It's an excellent language. Swift is better - after a non-trivial learning curve, but that's with 20-30 years more experience.
answered Apr 23 at 20:02 – gnasher729
In my experience, the language itself is no more difficult to learn than any other language. Yes, it has a quirky syntax that many find unfamiliar, but it is not difficult to understand.
The system libraries for OS X and iOS, on the other hand, are like the menu at the Cheesecake Factory: very large and full of lots of things you will never consume.
answered Apr 23 at 23:56
Justin Ohms
7
Whatever various people's anecdotes may claim, it's hard to argue with the timing evidence. Objective-C was introduced at Apple when Steve Jobs came back and started stuffing it down everyone's throats, and when it was announced that he had terminal cancer, Apple didn't even wait for him to be dead before they started working on a replacement for it! It's difficult to draw any other conclusion than that Obj-C was something that Jobs personally loved and most of the rest of the company hated.
– Mason Wheeler
Apr 23 at 15:30
4
Probably because Jobs wasn't a programmer :)
– dashnick
Apr 23 at 15:32
4
@MasonWheeler Hmm, looking at Swift, it is noteworthy that the Objective-C style of message-based dynamic dispatch was kept, while the C-style parts were dropped. Seems like Objective-C's merits did outlast Jobs's time on the planet.
– Raffzahn
Apr 23 at 15:59
4
@Raffzahn Yeah, they kind of had to keep support for the infrastructure that all the OS APIs were built on...
– Mason Wheeler
Apr 23 at 16:04
2
Just as a counter argument, the fact that development was not hindered by Obj-C is self-evident just by looking at the massive success of the App Store. After all, the great mass of Obj-C developers are NOT Apple employees; they are 3rd party iOS developers.
– Brian H
Apr 23 at 16:21