I am not an Apple user but I do enjoy the history of computers -- which (obviously) eventually reaches the likes of Steve Jobs and Apple. As much as people give high praise to Steve and the rise of Apple... I personally value him more for his work on NeXT.
To be honest, as someone who was into SGI computers as a kid, accepting the reality that I would NEVER be able to own one, I hoped one day I would get to use or try one in the workplace. Of course, by then, technology had moved on and these once-expensive machines are now affordable -- simply a Windows PC with a good graphics card!
Getting back to NeXT, I heard "NeXT" thrown about when reading up on id Software (Doom, Quake, etc.) but never really appreciated what this was until later on. When I finally spent the time to understand that this "NeXT" was a computer, I started to appreciate how cool it was, especially back in 1990! It made me appreciate how far ahead they were with programming/development tools. I mean.. Visual Studio finally caught up with that productivity.. like.. 10 years later!
It is great to see the history of NeXT continue -- not just by being "merged" into Apple but through OpenStep and others.
It also makes me appreciate lesser-known programming languages that could become "mainstream" at any point. Hardly anyone knew much about Objective-C, but it became a popular tool for Apple developers with OS X. That's why I always have a grin when people mock a lesser-known language because "hardly anyone uses it" or "hardly anyone knows about it" -- that could change, you know.
But the RAM requirements were crazy: 8MB of RAM in the late 80s. Understandable, though, given what the machine was offering.
Interface Builder is nice. I remember trying to develop GUIs on the Amiga with intuition.library and wishing there was a drag-and-drop program to build the source code. I'm running NeXTSTEP 3.3 in VirtualBox and trying to get it running on my early-2000s laptop :)
Now imagine being at university doing your graduation project: porting a visualisation framework from NeXTSTEP to Windows (Objective-C => C++), because it was clear NeXT wasn't going to survive, and some stations were already gathering dust after being replaced with PCs running Red Hat Linux.
If folks had guessed what turnaround NeXT would go through, my thesis would most likely have been something else.
Related:
- Oral History of Blaine Garst [PDF]: https://archive.computerhistory.org/resources/access/text/20...
- Oral History of Blaine Garst [Video]: https://www.youtube.com/watch?v=qtEIq7fe_KQ
- Oral History of Steve Naroff [PDF]: https://archive.computerhistory.org/resources/access/text/20...
- Oral History of Steve Naroff [Video]: https://www.youtube.com/watch?v=ljx0Zh7eidE
I highly recommend the Steve Naroff interview. It's 6 hours in two parts, but it's worth the investment.
Not listed here, but also available, is the oral history of Brad Cox and the later Objective-C paper authored by Naroff, Cox, Hsu (the author of the linked blog post) for HOPL IV. Someone else will have to dig up the links.
You can also find interviews with other NeXTsteppers on the CHM site, such as Avie Tevanian. I think the Naroff one is best.
I always found Objective-C really elegant, due to its Smalltalk influence. I once read about Brad Cox explaining that the initial prototype was just one or two pages of C code to add objects and messaging. It was interesting to discover that Erlang had a similar origin story, with a thin layer built on top of Prolog [1]. Does a similar description of early Objective-C exist? I couldn't find a pointer in the videos.
[1] https://www.labouseur.com/courses/erlang/history-of-erlang-a....
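Out of curiosity about what such a thin layer might look like, here's a minimal sketch in plain C -- purely my own guess at the shape, not Cox's actual code; all names are made up:

    #include <stdio.h>
    #include <string.h>

    /* A class is a method table, an object points at its class,
       and the messenger is a lookup plus an indirect call. */
    typedef struct object *id;
    typedef id (*IMP)(id self, const char *sel);

    struct method { const char *name; IMP imp; };
    struct class  { const char *name; struct method *methods; int nmethods; };
    struct object { struct class *isa; };

    /* Find the selector in the receiver's class and call the implementation. */
    id msg_send(id receiver, const char *sel) {
        struct class *cls = receiver->isa;
        for (int i = 0; i < cls->nmethods; i++)
            if (strcmp(cls->methods[i].name, sel) == 0)
                return cls->methods[i].imp(receiver, sel);
        fprintf(stderr, "%s does not respond to %s\n", cls->name, sel);
        return NULL;
    }

    static id greet(id self, const char *sel) {
        printf("hello from %s\n", self->isa->name);
        return self;
    }

    int main(void) {
        struct method methods[] = { { "greet", greet } };
        struct class Greeter = { "Greeter", methods, 1 };
        struct object obj = { &Greeter };
        msg_send(&obj, "greet"); /* prints "hello from Greeter" */
        return 0;
    }

A real runtime also needs inheritance, selector uniquing, method caching and so on; this only shows the core dispatch idea.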
https://dl.acm.org/doi/10.1145/3386332 - The Origins of Objective-C at PPI/Stepstone and its Evolution at NeXT by Brad J. Cox, Steve Naroff, and Hansen Hsu, from HOPL IV
Thanks for mentioning this! I watched the first part and just started on part 2. Very interesting!
> By the 1980s, the problems of cost and complexity in software remained unsolved.
We’re well into the 2020s and I’d say those problems have pretty much remained unsolved. ;)
I don't know how realistic this video is, but it was interesting:
https://www.youtube.com/watch?v=UGhfB-NICzg
A Sun vs. NeXT rapid-development challenge.
Ah, Objective-C. The language that combines the famous memory safety of C with the blazing-fast performance of Smalltalk.
Or the speed and flexibility of C, with the high level abstraction capability and malleability of Smalltalk!
And the readability of using both at the same time!
Hey hey, objc_msgSend's no longer C ABI compatible, [1] so make that the famous memory safety of assembly now…
[1] https://www.mikeash.com/pyblog/objc_msgsends-new-prototype.h...
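For the curious, a minimal sketch of what that means in practice on recent SDKs: objc_msgSend is declared without a usable prototype, so you must cast it to a function pointer matching the method's exact signature before calling it:

    #import <Foundation/Foundation.h>
    #import <objc/message.h>

    int main(void) {
        @autoreleasepool {
            // The caller supplies the prototype via a cast; get the types
            // wrong and you're back to debugging raw assembly frames.
            NSUInteger (*sendLength)(id, SEL) =
                (NSUInteger (*)(id, SEL))objc_msgSend;
            NSUInteger len = sendLength(@"hello", @selector(length));
            NSLog(@"%lu", (unsigned long)len); // 5
        }
        return 0;
    }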
Why do the words "Steve Jobs" and "programming" in the same sentence very slightly rub me the wrong way? Probably because one thing he wasn't, amongst all the superhuman things attributed to him by Apple zealots, is a programmer.
Steve Jobs was first a computer hobbyist; later he worked at Atari before co-founding Apple in 1976. He wasn't a non-technical founder, although he was certainly less technical than his co-founder. He could program, he could do digital design, and he knew his way around a breadboard and an oscilloscope.
He isn’t famous for being a programmer, but he could do it, and he certainly understood it.
He actually did know how to code. He and Woz created the classic video game Breakout while working at Atari. (I have no doubt Woz did most of the heavy lifting.) It's not that he didn't know how to program, it's just that coding was not his strong suit.
> He actually did know how to code. He and Woz created the classic video game Breakout while working at Atari. (I have no doubt Woz did most of the heavy lifting.)
And Jobs kept most of the money ($4,750), gave Woz $350 and went off to India to achieve enlightenment.
Note that creating an arcade game at that time wasn't writing a computer program, it was designing a finite state machine out of discrete logic chips.
NeXTstep was all about revolutionizing the programming experience. For the time, it was dramatically more powerful and far easier than the competition. We are basically still using that app environment today (on macOS and iOS).
The teams he assembled and worked with had a crazy impact on programmers, software development, computer platforms and end users. Even when he was not a developer himself, he looked for very bright people in software development to make it possible to develop new computer platforms: home computers, early graphical workstations, desktop publishing, UNIX workstations for students, next-generation operating systems for personal computers, ...
Elon Musk doesn't build rockets either, but it's fair to associate him with SpaceX's progress. We rightly blame the top guy when stuff goes pear-shaped, and we should be willing to let them get some credit when it goes well.
Also related, Alan Kay's Viewpoints Research Institute tried to build a system and application software in less than 20KLOC http://www.vpri.org/
This is very narrow history -- basically a history that excludes everything that isn't Jobs's walk to glory, perfection and domination. It ignores many important points, problems, accidents, alternatives and so on.
NeXT used 'Display PostScript', a display server that was basically an inferior copy of Sun's NeWS system. This was later changed because NeXT was too small and Adobe didn't want to support Display PostScript anymore. Sun of course killed NeWS because they wanted to be a 'standard'; NeXT didn't care about standards. They had fewer applications than CDE Unix, and far lower deployment in the 90s.
Objective-C is one of many languages you could use to build UI libraries on top of some display system. Objective-C wasn't the best or inherently better than many others; its adoption by NeXT was kind of a historical accident based on office location.
Having something VM-based for UI development isn't actually that much of an issue when the hardware manufacturer delivers the OS with the VM included -- and usually it is the hardware manufacturer that delivers the OS. A VM-based system can be integrated well with the core OS, object-oriented or not. And the claim that VMs are inherently too slow is also questionable, especially for UI apps, which can use C libraries and the display server for the most performance-relevant stuff.
Apple itself had a very nice system for UI development on Dylan that was arguably better in many ways than the NeXT system. But when Steve Jobs came and they had NeXT, that wasn't developed anymore.
What Jobs showed off in the late 90s wasn't exactly revolutionary stuff. But Jobs always presents everything as revolutionary.
iPhone development in 2010 working the same as NeXT development in 1990 is a sign of 'failure', not of success.
What current development environment and language would you like to put forward as a successor to/alternative of NeXTstep?
Some notable applications done in it:
- Altsys Virtuoso --- this became Macromedia Freehand
- Lotus Improv --- cloned to become Quantrix Financial Modeler
- Doom notably was developed on NeXTstep and the WAD/level editor was never ported (though alternatives were later developed)
- Glenn Reid's PasteUp and TouchType --- two of the nicest apps for working with documents and type I ever had occasion to use
And a number of ports were done quite quickly and were arguably the best versions ever -- WordPerfect for NeXTstep was done in six weeks, and FrameMaker on the NeXT stands out for an interface which I missed when using it on Mac or Windows.
In particular, I'd really like to have a successor to/replacement for Touchtype.app (and if that could be grown into a replacement for Macromedia Freehand, that would be great) --- what tool would allow developing this easily, and making press-ready files which perfectly match what is seen on-screen?
A further example, TeXview.app was a very nice interface for editing TeX documents on NeXTstep, and has since been succeeded by TeXshop.app on Mac OS X and arguably even surpassed (it has many more features, and I can't recall any which are missing).
Graphical/integrated .tex editing environments are a frequently encountered application type, and there seems to be at least one for pretty much every programming environment -- yet I'm not aware of a graphical editor which has feature parity with TeXview.app and which is similarly consistent in its interface (it did win an Apple Design Award back in the day). If that's the case, then how can it be that developing with Interface Builder and Objective-C and the NS objects was so bad? If that's not the case, what development platform would you like to put forward which should have an even better TeX environment?
I actually used the Apple Dylan environment. IIRC, it required 48MB RAM -- yes, M -- to launch, which was an extremely large amount at the time -- as in, you probably needed to buy more RAM to run it -- and was itself written in Macintosh Common Lisp instead of self-hosting. UI development? The thing could barely run on a contemporary Mac. Every efficiency claim that was made for it was unproven. Maybe it could have been great, but it certainly needed at least a couple more years of bake time. The NeXT runtime existed and its performance characteristics were understood.
Virtual memory was a huge issue for UI development with a retained-mode system like DPS; thrashing window contents is very not fun if you want a responsive UI. Apple spent years optimizing VM for this purpose after the NeXT purchase.
> UI development?
The interface builder was written in Dylan and running inside the Dylan runtime.
The early use of Dylan (actually its precursor language Ralph) was to develop software on a Mac for an external device, like tablets and handheld computers. The development system was an external mainboard attached to a Mac. Apple eventually released a first device with an ARM 610 20MHz CPU, 640KB RAM and 4MB ROM as a product. A few of the early internal development environments were written in Lisp on the Mac, using a lot of memory (precious and expensive megabytes!).
> Apple Dylan environment
That was a prototype of the Apple Dylan environment, later released as a technical preview. It was also never a product. Apple at that time often developed software prototypes, and the released product was then recoded as a more efficient tool. The technical preview was for the Mac and for developing software for the Mac.
Mac OS development introduced me to Objective-C. Quirky, but I grew to like it. I think it was the first time I 'got' object-oriented programming. I had done Windows programming before and while it was 'fine', writing Mac OS GUI code was so nice. Probably a credit to the frameworks as much as the language, though.
What other choice did NeXT have?
C++ was invented about the time NeXT started. Microsoft didn't even release MFC until 1992.
C...yikes.
Pascal. Would Steve have gone along with that?
> Pascal. Would Steve have gone along with that?
I don't see why he couldn't have. It's a far more elegant language, and it and other Wirth languages like Oberon have inspired several current programming languages. The standard programming language of the Lisa and Mac was Pascal, and Apple even added object-oriented features to it as Clascal (earlier than Wirth's Oberon and not compatible with it).
I did a bunch of iOS development for about 4 years. I found Objective-C way more productive than C++.
I'm not saying taking Objective-C was wrong.
There were many alternatives.
Pascal is a language family. Xerox did the Pascal-like Cedar at around the same time. Things like Modula-2 existed. NeXT could have done something along those lines. I'm sure there were commercial versions of that kind of stuff floating around.
Smalltalk was a big family at the time. Lisp OO systems were already common. There were lots of commercial versions of those they could have licensed.
Sun with NeWS had a very extended PostScript that basically allowed you to write most of the application in it.
There was a lot of stuff going on back then already. History always hides how much stuff was actually happening. Sadly most of it isn't open source, so tons of great stuff goes into a historical black hole.
My point is, with the amount of money NeXT was able to raise and invest in these systems, they could have gone a number of different ways. They made a reasonable decision at the time.
If you're going to bring Sun and NeWS into the conversation, you ought to also mention how after NeWS failed, Sun tried adopting OpenStep for Solaris.
Sadly Sun kind of went off the deep end and did itself no favors on the desktop. There is a reason they turned primarily into a server vendor. You could also run Xerox Cedar on Sun machines at the time.
It was the research into systems like Xerox Cedar, among others, that turned me into a believer of systems programming with automatic resource management languages.
Most of these systems failed not due to technical limitations, but rather due to poor management decisions that ended up killing their adoption.
Likewise in the UNIX world, my favourite ones are the ones that went out of their way to not be yet another UNIX V6 clone with X, like IRIX, NeWS and NeXTSTEP.
I think it's stretching the truth a little to say that Apple "had a very nice system for UI development on Dylan." The Dylan Eulogy site itself claims they hadn't even finished their Interface Builder [1] by the time it was canceled, which NeXT already had at the time.
As a side note, Apple Dylan seems incredibly limiting and opinionated for no reason. I find it interesting the website lamenting its death only shows screenshots [2][3][4] of the "advanced" editor, rather than any successful applications made using Dylan.
Also, how can I incrementally migrate Dylan into existing codebases, which were most likely C at the time? Does it have an FFI? And most software engineers' mental models at the time were imperative -- were they expected to learn the intricacies of functional and object-oriented programming at the same time?
That's not even to mention the logistics of running it:
> Keeping your source code in a fully version-tracked, object-oriented database with full metadata is an idea whose time has long since arrived, yet most of us still spend our time editing text files. Apple Dylan could export and import code from text files, but once you’ve used an IDE that takes advantage of fine-grained object storage of your source, you’ll never want to go back. [5]
Does this mean I'd need to shape my entire source control system around Apple Dylan? How would diffs and collaboration work? To use my source control system would I have to "export" it to plaintext every time? Text-based AppKit and Obj-C fit right into existing source control systems of the day.
Also, Obj-C and AppKit were already dogfooded heavily within NeXT. The system UI was built using them, as well as all the apps that shipped with the system. You can't say that about Dylan and Mac OS 9.
[1] https://opendylan.org/history/apple-dylan/screenshots/misc.h...
[2] https://opendylan.org/history/apple-dylan/screenshots/browse...
[3] https://opendylan.org/history/apple-dylan/screenshots/dynami...
[4] https://opendylan.org/history/apple-dylan/screenshots/index....
The platform Dylan was originally designed for, the Newton, had no C to begin with.
There were two teams fighting for delivering the OS, one using Dylan, the other using C++. Eventually the C++ team won the internal politics, even though the Dylan one was relatively ahead.
Newton was programmed in NewtonScript, prototype based OOP.
Eventually the SDK allowed for C++ native extensions, and on the later version of the OS, before Newton was killed, there was already a basic JIT in place.
> The platform which Dylan was originally designed for, the Newton had no C to begin with.
These platforms had no development tools. The firmware and software runtimes were created outside. I would guess that C was definitely involved for much of the firmware and that an on-device Dylan runtime had C and assembler code.
Assembler for sure, as even today it is unavoidable in low-level firmware coding. At the time Apple was an Object Pascal/C++ shop; usually the C-like APIs were extern "C" in many of the Toolbox API implementations.
The MCL kernel was written in C.
The discussion has various conflations, omissions, and errors. It's been a long time since Dylan on the Newton was what I worked on every day, and my memory is no doubt faulty, but I'll do my best to correct a few things.
"Apple itself had a very nice system for UI development on Dylan that was arguable better in many way then the Next system. But when Steve Jobs came and they had Next, that wasn't developed anymore."
This one is misleading in a couple of respects. First, there was more than one Dylan development system at Apple: Leibniz for the Newton and Apple Dylan for the Mac (I think it was called "Hula" internally, but I may be wrong about which project got that name). Both were written in Common Lisp using MCL, but they were distinct projects with different goals. Neither had tools like NeXT's Interface Builder. The Lisp-based application builder that rivaled (and in my view surpassed) Interface Builder was SK8, which was unrelated to Dylan. SK8 was also written with MCL.
Dylan was not canceled because Steve Jobs returned with Nextstep. The cancellation happened several years before Steve came back and was part of Apple's normal product and project churn.
"Apple Dylan seems incredibly limiting and opinionated for no reason."
It didn't seem that way to me, but that's about all I can say to a vague subjective impression of this kind.
"I find it interesting the website lamenting its death only shows screenshots [2][3][4] of the "advanced" editor, rather than any successful applications made using Dylan."
There weren't really any successful applications made using Dylan, if you mean shipping products. It never got that far along.
There were projects written with Dylan that were successful on technical merit, but never shipped for reasons that had nothing to do with their technical quality. For example, the final review of bauhaus told us that we had met and exceeded every metric of success, but Apple management just didn't want to ship a Lisp OS for Newton, full stop.
But yes, Nextstep was dogfooded to a degree that Dylan was not, and it was much farther along than Dylan ever was.
"There were two teams fighting for delivering the OS, one using Dylan, other using C++, eventually the C++ team won the internal politics, even though the Dylan one was relatively ahead."
This characterization is misleading in a few ways. I wouldn't say the teams were fighting, and I wouldn't say the Dylan team was ahead, except in maybe a couple of narrow areas.
Initially all Newton development was done in Dylan (called "Ralph" at that time), except for work on the microkernel, which was C++, and the 7 "bottleneck" graphics routines from QuickDraw, which were C.
The Newton team did substantial work on the software system in Dylan, and it worked, but there were some objections. Larry Tesler objected that it was too much like a desktop OS and needed to be redesigned for a handheld device with touch input. Meanwhile, some discussion took place with John Sculley during one of his trips to Japan and when he came back he ordered Larry to move main system development to C++. Larry did so, but reserved a small group of programmers (of which I was one) to continue experimenting with Dylan to see what we might accomplish.
We accomplished too much: we ended up making a whole second OS in Dylan. That's not what Apple management wanted, and they canceled our effort. That was no surprise to me; what surprised me was how long they tolerated our experiment before pulling the plug.
Then again, Apple in those days was a sort of constellation of skunkworks projects. Apple engineers would go off in random directions when not under threat of looming deadlines, and cook up all sorts of wacky things, and it was sort of tolerated because sometimes those wacky projects turned into things that made money.
"> The platform which Dylan was originally designed for, the Newton had no C to begin with. These platforms had no development tools. The firmware and software runtimes were created outside. I would guess that there definitely C was involved for much of the firmware and that an on device Dylan runtime had C and assembler code."
I don't know of a Newton platform for which C tools did not exist.
The Newton originally targeted the AT&T Hobbit (https://en.wikipedia.org/wiki/AT%26T_Hobbit), which was a RISC chip designed to run C efficiently. By the time I was recruited the boards were using ARM chips, but the Newton microkernel was already written in C++. Perhaps there was a time when a Newton existed without a supporting C compiler, but I never saw it.
Now it might be that you meant that no C compiler or development tools ran on Newton hardware, and that much is true. The Dylan development didn't run on the Newton hardware, either. All dev tools ran on Macs and cross-compiled to the ARM platform.
Thank you, Mike, for clarifying all of this and providing us with a first-person account of the events described.
I do want to apologize - I meant no disrespect in regard to the technical achievements of the Dylan team. In fact, I share the view that the Dylan project was ahead of its time in many aspects.
Hi, thanks for your comment. My comment about Dylan was because I had talked to somebody who used Dylan for Mac development and said it was a nice way to develop Mac applications.
I didn't want to suggest it was as far along as the NeXTSTEP Objective-C-based system.
That same person also suggested that post-Jobs, with the move to NeXT, this was gone. But it seems that development was canceled even earlier.
I didn't know about SK8, I will look more into that.
Thanks, Mikel, for the explanations!
I guess it would have been better to say that Apple was working on a nice system for Dylan.
NeXTstep had seen more development, existed earlier and was used more; that much is clear.
I am not saying they made the wrong decision sticking with it at post-Jobs Apple.
My broader point was just that the article leaves out a lot of other stuff happening at the same time.
No offense but this comes off as an equally narrow accounting of the historical record.
I didn't publish a long article claiming to go into the 'deep history'; I just added a few points. I initially had a longer comment but it got unwieldy.
Just imagine if NeXT had been based on the Acorn Archimedes platform, instead of M68k.
There would be no Java on Android, for sure.
As someone who has known Java since it came into existence, I fail to see the connection.
Additionally, as someone who ported a visualisation framework from NeXTSTEP to Windows, the Acorn would never have been able to run NeXTSTEP.
NeXTSTEP was ridiculously expensive for a reason, in terms of hardware capabilities for the time.
Wow, that's expensive.
There is no way that Steve Jobs could have talked Furber & Wilson into this.
http://www.shawcomputing.net/resources/next/hardware/ns31_co...
(2016)
So, this is why the abomination that is Obj-C is/was used for iPhone/Mac apps. I can't overstate how much I hate Obj-C. I'm so sooo happy Swift has pretty much entirely taken over.
Side note... I feel similarly about the Java to Kotlin transition. Sooo much better. Although, I don't hate Java NEARLY as much as Obj-C.
I'm back in Objective-C now for a large mixed codebase for a client and enjoying it. It was my first compiled language starting in 2002 so I've got about equal time with only it as I do with Swift alongside. And I'm spending more time these days in other projects (microcontrollers, mostly) using C so it's a breath of fresh air. I'm finding it generally faster than Swift in Xcode, which is a bonus.
I too am back in Objective-C after being away from it for a long while, and I'm also enjoying it. For the most part, I really like Objective-C. I think it's a very pragmatic language. My current project is macOS, but the vast majority of my prior work with Xcode and ObjC was iOS development.
To each their own. I'm convinced it's just a visceral reaction to the square bracket syntax. Obj-C remains my favorite language of all time (although I haven't written it in years). Having a high level language that allows you to seamlessly drop into C felt like magic.
I'm using Obj-C for the first time in years for a side project. I'm working on making a Wayland backend for cocotron on Linux.
It is magical the way I can just use C libraries with all the dynamic goodness of Obj-C.
There's even more magic: You can even use/include/write/mix and match ObjC with C++, within the same source file - C++ classes calling ObjC methods, ObjC methods instantiating C++ classes, it's all there. No other language comes close to this kind of interop with C++ and it is one feature that Swift will never be able to match (albeit intentionally).
Yes, there is interop at the linking level. I was however explicitly stating that you can mix and match both ObjC and C++ within the same source files, and as long as you don't violate the language rules of one with what's allowed in the other (e.g. giving your ObjC variable the name 'template'), absolutely any features, even language-wise, are supported. You can have C++ RAII destructors call ObjC methods. You can have C++ classes with an NSArray member. You can have a std::vector as a property of your ObjC class. You can mix and match libdispatch, std::call_once, ObjC blocks and C++ lambdas.
No other language comes close to this incredible source-level interop. Here be dragons, but there's also lots of opportunity (and fun, from a programming language theory perspective).
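To make the mixing concrete, here's a small sketch of an Objective-C++ file (compiled as a .mm) along the lines described above -- the class names are mine, for illustration:

    #import <Foundation/Foundation.h>
    #include <vector>

    // A C++ class with an Objective-C object as a member; under ARC the
    // compiler retains/releases it, and the RAII destructor calls into ObjC.
    class Logger {
        NSString *prefix_;
    public:
        explicit Logger(NSString *prefix) : prefix_(prefix) {}
        ~Logger() { NSLog(@"%@: logger destroyed", prefix_); }
        void log(NSString *msg) { NSLog(@"%@: %@", prefix_, msg); }
    };

    // An Objective-C class with a C++ container as an instance variable.
    @interface Samples : NSObject { std::vector<double> values_; }
    - (void)add:(double)v;
    - (double)mean;
    @end

    @implementation Samples
    - (void)add:(double)v { values_.push_back(v); }
    - (double)mean {
        double sum = 0;
        for (double v : values_) sum += v; // C++ range-for inside an ObjC method
        return values_.empty() ? 0 : sum / values_.size();
    }
    @end

    int main(void) {
        @autoreleasepool {
            Logger logger(@"demo");     // C++ object holding an ObjC string
            Samples *s = [Samples new]; // ObjC object holding a C++ vector
            [s add:1.0]; [s add:3.0];
            logger.log([NSString stringWithFormat:@"mean = %g", [s mean]]);
        }
        return 0;
    }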
Someone really ought to do an alternative "skin" for Objective-C. That is, define some reversible transformation that can be applied to any Objective-C program that only changes the surface-level details but otherwise results in mapping the exact same program into the later stages of the compiler (and the section of programmers' mental pipeline that comes after the part where their eyeballs and sense of taste are concerned). It's important that the reversible part be adhered to.
It's really kind of a bummer that we don't have well-known examples of "skinnable" programming languages already. The closest we get are the ones that let you choose your set of keywords for different locales, in case you don't want to work with an English-based PL, and those block-based languages that let you manipulate them as blocks or edit the textual form. I firmly believe that PL skinning would really benefit so many otherwise worthwhile languages that get short shrift because of negative reactions to surface-level details. I'm referring in particular to languages like those in the Pascal family. (You could also conceive of e.g. a skin for Golang that doesn't have leading caps be the determiner for whether a function is public or not. There are lots of applications for this, and the variations are a non-issue when you have an ecosystem with strong norms about applying a code formatter (like gofmt), which can dictate the format of the canonical on-disk representation.)
Apple did implement this around the Mac OS X transition (I guess Steve Naroff probably did it?) but it was not pursued, in favor of Java bindings (which funnily enough are back in fashion with swift-java).
surely after 20 years this already exists, someone posted a header the other day that makes c syntax python-like, macros are enough to accomplish what you're describing? i don't perceive it as efficient to force codebase contributors to relearn their ingrained language syntax, possibly more efficient for those having newly learned a language tho ...
> i don't perceive it as efficient to force codebase contributors to relearn their ingrained language syntax
That's why it's important that the transformation be reversible and what the gofmt-like code formatters are for.
The Four Stages of Objective-Smalltalk
https://blog.metaobject.com/2019/12/the-4-stages-of-objectiv...
Its first two stages are just that, but it has moved beyond, for good reasons.
For personal use, the only issue I take with Obj-C is how it's considerably less "batteries included" relative to Swift.
For collaborative projects on the other hand, it can become a handful to maintain if everybody contributing isn't staying on top of hygiene with null checks, type checks, etc. Swift helps to keep all of that under control.
Interesting, I guess that part was lost on me since I only ever really used it for iPhone apps and never really had a need to use C directly.
Also, you're 100% right. The square brackets are what immediately repulsed me and continued to befuddle me even after years of experience with it. Also, everything just feels "backwards" to me if that makes any sense. Coming from Java/C#/JavaScript, everything just seemed unintuitive to me at all times. I think this was heavily compounded by using Xcode, which (at the time) was incredibly laggy. So, I'd mess up the Obj-C syntax and the IDE wouldn't tell me for what felt like forever. Often I'd make a change and hit "play" before the syntax highlighting caught up, and that always felt infuriating.
I last used Xcode about 4 years ago and it was still an issue then (even with Swift).
I've been a Mac and iOS engineer for over a decade, and none of this makes any sense to me. Everything you listed (besides the brackets) is worse in Swift than in Objective-C (Swift has real problems with live error checking in Xcode in particular, due to the additional compilation complexity, since it's a significantly more complicated language).
I've observed some folks have a visceral reaction to having to use Xcode; I don't really understand it myself. I can understand being annoyed at having to use a specific IDE to write iOS and Mac apps -- e.g., it's harder to bring your own text editor like you usually can, and it's going to make your life a lot harder if you try to avoid using Xcode. But comparing Xcode to IDEs like the JetBrains ones I've used (mainly the discontinued AppCode), Android Studio (also JetBrains under the hood), or other similarly complex development environments like Unreal or Unity, I don't see any of these as a clear winner. Personally I'd prefer using Xcode to any of those. I suspect this comes down to whether you like native Mac apps or not: Xcode is a native Mac app, and if you like that aesthetic then you'll like Xcode. I suspect most of the dislike for Xcode is really just folks who dislike the Mac platform (e.g., the UI toolkit) overall.
I think it comes from folks that aren't really into Apple culture; rather, they buy Apple because of UNIX, or want to target iDevices without a cultural background in the ecosystem.
Bingo. I'm tired of people claiming that macOS (XNU more specifically) is just BSD under the hood. It's called "X is Not Unix" for a reason!
I feel like so much awesome engineering in XNU around security and portability, as well as innovations like IOKit, is swept under the rug because "it's just FreeBSD."
I still think it's a shame that more people don't take advantage of the Mach side of XNU more. Launchd and Mach ports are a powerful combination IMO.
>"Also, everything just feels "backwards" to me if that makes any sense."
Because it is. Obj-C comes from the Smalltalk lineage by way of Alan Kay, using message passing [0] versus method invocation. It's a subtle difference with huge implications for how you design systems. Method invocation won out mostly because of Java and C++, but there was a time when it wasn't clear which was the better OO paradigm.
Message passing belongs up there with lisp, forth and pure functional programming as paradigms that are worth learning for "the profound enlightenment experience you will have when you finally get it." But I often see that my peers in the profession lack the kind of growth mentality that enables a person to see past the alienness of less algol-y languages.
Quote from "How To Become a Hacker" by Eric S. Raymond: http://www.catb.org/esr/faqs/hacker-howto.html
see my comment above: https://news.ycombinator.com/item?id=42132104 i can't tell the difference and therefore i don't see what profound enlightenment experience i am supposed to have.
Java is heavily based on Smalltalk and Objective-C, even if it has a C++-like syntax for mainstream adoption.
https://cs.gmu.edu/~sean/stuff/java-objc.html
Even Java EE was actually a rebooted Objective-C-based project done internally at Sun during the OpenStep days, aka Distributed Objects Everywhere.
i have learned smalltalk, common lisp, java, python, ruby, perl, php, C, javascript and a few other languages, and i still do not understand the difference between message passing and function invocation. they look the same to me. according to wikipedia the difference is
"In contrast to the traditional technique of calling a program by name, message passing uses an object model to distinguish the general function from the specific implementations. The invoking program sends a message and relies on the object to select and execute the appropriate code."
> Method invocation won out mostly because of Java and C++
but according to the wikipedia article java uses message passing.
supposedly the distinction is that i can have a generic method that gets called if a named method can not be found. in smalltalk that's doesNotUnderstand: in ruby it's method_missing. javascript used to have __noSuchMethod__, in php i can overload __call, in pike i do the same with defining a method called `(), and many more.
so are they all using message passing? and then if java is supposed to use message passing and javascript removed __noSuchMethod__ it seems that alone can't be the distinction.
if there is a distinction at all then it looks more like an implementation detail that does not actually affect what kind of code you can write, and more importantly, based on that it is not at all clear that method invocation won out.
This is another example of where you've got to read Wikipedia with a skeptical eye. That one in-passing mention of Java is incorrect. I don't know why it's in there, WikiBlame indicates that it's been there since 2004, which was the early days of Wikipedia when it was particularly unreliable.
So, the gist of the difference is this: object-oriented programming is, at its core, about late binding. Specifically, delaying decisions about what code will run when until run-time. But there's still some wiggle room to decide how late certain decisions are made. Most mainstream object-oriented languages like Java and C# more-or-less wait until the start of run-time to decide, but at that point the mapping from argument type to which code is run is pretty much settled. (This isn't necessarily 100% true, but it's the general rule.)
In a system that uses message passing, it's pushed even later, to method invocation time. Basically, each object (actor, whatever) gets to decide what code will be executed to handle a message every time it receives a new message, even for messages of the same type. In practice, most of the time it's always the same code. But the point is that this level of dynamism is a first-class language feature and not just a thing you can accomplish with hacks.
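To make that concrete in Objective-C terms, here's a small sketch using the runtime's forwarding hook, where the receiver re-decides the destination on every send (the class and property names are made up for illustration):

    #import <Foundation/Foundation.h>

    @interface Switchboard : NSObject
    @property (nonatomic, strong) id primary;
    @property (nonatomic, strong) id fallback;
    @end

    @implementation Switchboard
    // Consulted each time a message arrives that this object has no
    // method for: the routing decision is made per send, not at startup.
    - (id)forwardingTargetForSelector:(SEL)sel {
        return [self.primary respondsToSelector:sel] ? self.primary
                                                     : self.fallback;
    }
    - (BOOL)respondsToSelector:(SEL)sel {
        return [super respondsToSelector:sel]
            || [self.primary respondsToSelector:sel]
            || [self.fallback respondsToSelector:sel];
    }
    @end

    int main(void) {
        @autoreleasepool {
            Switchboard *sb = [Switchboard new];
            sb.primary  = @"a string";
            sb.fallback = @[@1, @2, @3];
            // 'length' ends up at the string, 'count' at the array;
            // each message is routed at the moment it is received.
            NSLog(@"%lu", (unsigned long)[(id)sb length]); // 8
            NSLog(@"%lu", (unsigned long)[(id)sb count]);  // 3
        }
        return 0;
    }

This is the same mechanism behind the doesNotUnderstand:/method_missing catchalls mentioned elsewhere in the thread, surfaced as an ordinary overridable method.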
i understand your explanation, but i still don't see how i am supposed to tell the difference as a programmer. to be a first class language feature it has to be something the programmer can see and use consciously. the only feature there is the ability to have catchall methods, but while the existence of such a feature is an indication that very late binding happens, the absence of it does not indicate otherwise. so it goes back to pretty much all dynamic languages with a runtime probably use message passing because they either do have such a feature or could have it. but i don't see that claim supported anywhere. everyone just seems to talk about how smalltalk is somehow different, but i can't see how it is different from javascript, php, pike or other dynamic languages.
and i believe what you say about java and wikipedia. it just shows again that the distinction is not obvious.
i found this discussion on stackexchange: https://softwareengineering.stackexchange.com/questions/3352...
but reading that doesn't help me to tell the distinction between java and smalltalk either. the point appears to be that all OO is message passing, and the only distinction as far as i can tell is that java doesn't have extreme late-binding of all things, but that is something i can't know just by looking at the language. it's a hidden implementation detail. not being able to tell how the message is processed is another feature that message passing is supposed to have, btw.
the stackexchange answer however also shows why i am not seeing any revelation when using smalltalk. if all OO is supposed to be message passing then it's no wonder, it all looks the same to me.
note that i don't want to argue either way. i don't know enough about this to make any kind of argument. and i am trying to figure out the right questions to ask so i can learn and understand more. your comment did help me move forward at least. thanks.
If the "message passing" implementation blows the stack when something sends a message to itself, then you know that they are calling function calls "message passing".
Bona fide message passing is asynchronous. A message is sent and the send operation immediately returns, without blocking for a reply.
Nothing else should be called "message passing".
Nah...IMHO message-passing is a generalization.
The point is that it encompasses all of these things: synchronous, asynchronous, local, distributed, late-bound, early bound, point-to-point, broadcast, multicast.
And ideally, you should be able to select which kind you want on a case-by-case basis. With ease.
Rather than having to choose a different programming language.
i agree with that definition, but then neither smalltalk nor any other language is using message passing for regular method calls, because they are not asynchronous unless explicitly asynchronous methods are used.
The difference is NSInvocation. A function pointer doesn't capture the arguments to the function, but a message does.
Also, you can change the destination of a message send at runtime, but you can't change the destination of a function call unless you're dtrace and can do code patching.
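A short sketch of what that reification buys you: the captured message is an ordinary object you can store, retarget, and replay (illustrative only; note that getReturnValue: wants an __unsafe_unretained pointer under ARC):

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Capture the message "stringByAppendingString: @" world"" as an object.
            SEL sel = @selector(stringByAppendingString:);
            NSMethodSignature *sig =
                [NSString instanceMethodSignatureForSelector:sel];
            NSInvocation *inv =
                [NSInvocation invocationWithMethodSignature:sig];
            inv.selector = sel;
            NSString *suffix = @" world";
            [inv setArgument:&suffix atIndex:2]; // indices 0 and 1 are self and _cmd
            [inv retainArguments];

            // The same captured message, sent to two different destinations.
            __unsafe_unretained NSString *result = nil;
            inv.target = @"hello";
            [inv invoke];
            [inv getReturnValue:&result];
            NSLog(@"%@", result); // "hello world"

            inv.target = @"goodbye";
            [inv invoke];
            [inv getReturnValue:&result];
            NSLog(@"%@", result); // "goodbye world"
        }
        return 0;
    }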
ok, so that means message passing needs a runtime, and it can be assumed that pretty much every language that has a runtime uses message passing. more evidence that it is message passing that won out.
and the profound enlightenment ( https://news.ycombinator.com/item?id=42130432 ) is not specific to message passing but about a dynamic language runtime. being able to intercept messages/function calls is just one of the benefits of that.
Not sure if objc_msgSend is a runtime, but it does require malloc, so there are limits to how low level it can be.
> objc_msgSend [...] does require malloc,
Are you sure about that?
    ]pwd
    objc4/runtime/Messengers.subproj
    ]rg malloc
    ]

The two variants I implemented way back when also did not require malloc. And of course NeXT used objc_msgSend() in the kernel, to good effect, so that's pretty low-level.
The square brackets can be a huge impediment when first working on Obj-C (it was to me). I found that after some period of time to acclimate they eventually felt 'normal'.
Metal is implemented in a mix of Objective-C (CPU) and C++ (Metal shaders); Swift only has bindings.
Maybe one day they will rewrite the Objective-C parts, who knows.
Kotlin transition is a Google thing, outside Android barely anyone notices that it exists.
“Kotlin transition is a Google thing, outside Android barely anyone notices that it exists.”
That’s not true.
It certainly is, one just needs to check any programming language market share analysis that omits Android deployments.
Kotlin hardly does more than 10% of JVM workloads, and it isn't as if any JVM vendor is replacing Java with Kotlin on their JVM implementation.
Besides Google with ART, which anyway isn't a proper JVM, not being fully TCK-compliant to start with.
At least at my company we are using more and more Kotlin, and that happens in many places. This is server-side JVM, not Android. And I know other places too.
Congratulations, you are part of those 10%.
10% is a large number for a new language on a VM.
It certainly is, but it isn't the "Rewrite It In Kotlin" that is so common to hear the community talk about, especially when its creators are quite open that it is a way to sell IntelliJ licenses.
I can provide you the public statement on that, if you don't feel like searching for it yourself.
Did you start with Swift or Objective-C?
I started with Objective-C and loved it but I imagine that wouldn’t have been the case if I started with Swift.
Same. I realize that Objective-C is dated by contemporary standards. But when I first learned it 20-ish years ago it felt like a superpower. I bet it was even more impressive when it first appeared another 20 years further back.
I think that one of the tragedies of older programming languages is that they survive long enough for people to forget the technical - and technological - constraints that influenced their design. Swift is great, but there are reasons why there weren't any Swift-like languages being developed in or around 1984.
Similar feelings about Java. It is definitely not my favorite programming language. (I suppose Kotlin isn't either, but it's in the top n.) But it's hard for me to actually hate it. Java walked so that Kotlin could run.
I started iOS development with Objective-C and hated the language. The GP's comment is pretty much my feeling as well. I wonder if I'd enjoy Swift as much if it weren't in contrast to how utterly horrible Objective-C was. These days I still need to revisit legacy Obj-C codebases occasionally or make Obj-C wrappers to interface with C++ code, and every time it feels like diving into a cesspit.
I was first introduced to Objective-C in 2010 when I worked on my first iPhone app. I've come back to it a half dozen times over the 15 years since then and always hated it. I only discovered Swift a few years ago (when I had to do some iPhone stuff) and was so happy to not need to use Obj-C.
I did a few things in Obj-C, and the only thing I regretted was the memory handling stuff. I really liked the performance, the reflection capabilities (reverse-engineering classes is almost trivial) and the Smalltalk-like style. I wouldn't choose it for a new project, but it gave me a taste of the options that existed beyond C/C++.
There were plenty of options back in the day; it wasn't guaranteed that C and C++ would take over everything as they did during the late 1990s.
Even the Object Pascal to C++ transition at Apple was mostly caused by increased UNIX adoption and Apple programmers wanting to offer similar experiences; thus MPW was born.
The same happened on other platforms, until C and C++ eventually won everywhere -- at least until several new kids decided to challenge them on various fronts.
One thing you seldom see nowadays, outside of games, is pure GUI frameworks using C or C++.