"Why do you jackasses use these inferior linguistic vehicles when we have something here that’s so
precious, so elegant, which gives me so much pleasure? How can you be so blind and so foolish?"
That debate you’ll never win, and I don’t think you ought to try.
- Alan Perlis, 1978
In the late 1970s, researchers at Xerox PARC invented modern computing. Of course, there were others
elsewhere - but PARC made a vastly disproportionate contribution.
A large part of that was done in, and based upon, the Smalltalk programming language. Forty years
ago, Smalltalk's dynamic update and reflection capabilities were more advanced than in any
mainstream language today. The language leveraged those capabilities to provide an IDE that in
many ways still puts the eclipses, black holes, red dwarfs and other travesties that currently
masquerade under that term to shame. The Smalltalk image provided a much better Docker than
Docker.
Smalltalk and Smalltalkers invented not only IDEs, but window systems and their related paraphernalia
(pop-up menus, scroll bars, the bit-blt primitives that make them possible) as well as GUI builders,
unit testing, refactoring and agile development (ok, so nobody's perfect).
And yet, today Smalltalk is relegated to a small niche of true believers. Whenever two or more
Smalltalkers gather over drinks, the question is debated: Why?
The answer is unknowable, since we cannot run parallel universes and tweak things to see which
makes a difference.
I did describe such an alternate universe in a talk in 2016; it may be the best talk I ever gave.
Nevertheless, I think we can learn something from looking into this question. I'll relate parts of history
that I deem relevant, as I know them. I'm sure there are inaccuracies in the account below.
There are certainly people who were closer to the history than I. My hope is that they'll expand on my
comments and correct me as needed. I'm sure I'll be yelled at for some of this. See if I care.
On with the show.
Lack of a Standard. Smalltalk had (and still has) multiple implementations - more so than much
more widely used languages. In a traditional business, having multiple sources for a technology
would be considered an advantage. However, in Smalltalk's case, things proved to be quite different.
Each vendor had a slightly different version - not so much a different language, as a different platform.
In particular, Smalltalk classes do not have a conventional syntax; instead, they are defined via
reflective method invocation. Slight differences in the reflection API among vendors meant that the
program definitions themselves were not portable, irrespective of other differences in APIs used by the
programs.
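To make this concrete: in most Smalltalk descendants, a class is created not by a dedicated declaration syntax but by sending a keyword message to its prospective superclass. The exact selector varies by dialect, which is precisely the portability problem; the sketch below follows Pharo's convention, while older systems expect additional keywords such as poolDictionaries: and category:.

```smalltalk
"A class definition is an ordinary message send to the superclass,
 evaluated reflectively - there is no declaration syntax.
 This selector follows the Pharo convention; other dialects differ."
Object subclass: #Point
	instanceVariableNames: 'x y'
	classVariableNames: ''
	package: 'Example-Geometry'
```

Because the program's own definition is a series of such API calls, code written against one vendor's reflection API would not load cleanly into another vendor's image.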
There were of course efforts to remedy this. Smalltalk standardization efforts go back to the late 80s,
but were pushed further in the 90s. Alas, in practice they had very little impact.
Newspeak, of course, fixed this problem thoroughly, along with many others. But we were poorly funded
after the 2008 crash, and never garnered much interest from the Smalltalk community.
The community's lack of interest in addressing weaknesses in the Smalltalk-80 model will be a
recurring theme throughout this post.
Business model. Smalltalk vendors had the quaint belief in the notion of "build a better mousetrap
and the world will beat a path to your door". Since they had built a vastly better mousetrap, they
thought they might charge money for it.
This was before the notion of open source was even proposed, though the Smalltalk compilers, tools
and libraries were provided in source form; only the VMs were closed source.
Alas, most software developers would rather carve their programs onto stone tablets using flint tools
held between their teeth than pay for tools, no matter how exquisite. Indeed, some vendors charged
not per-developer-seat, but per deployed instance of the software. Greedy algorithms are often
suboptimal, and this approach was greedier and less optimal than most. Its evident success speaks
for itself.
In one particularly egregious and tragic case, I'm told ParcPlace declined an offer from Sun
Microsystems to allow ParcPlace Smalltalk to be distributed on Sun workstations. Sun would pay a per
machine license fee, but it was nowhere near what ParcPlace was used to charging.
Eventually, Sun developed another language; something to do with beans, I forget. Fava, maybe?
Again, dwell on that and what alternative universe might have come about.
Performance and/or the illusion thereof.
Smalltalk was and is a lot slower than C, and more demanding in terms of memory. In the 1980s and
early 1990s, these were a real concern. In the mid-1990s, when we worked on Strongtalk, Swiss
banks were among our most promising potential customers. They already had Smalltalk applications
in the field. They could afford to do so where others could not. For example, they were willing to equip
their tellers with powerful computers that most companies found cost-prohibitive - IBM PCs with a
massive 32MB of memory!
It took a long time for implementation technology to catch up, and when it did, it got applied to lesser
languages. This too was a cruel irony. JITs originated in APL, but Smalltalk was also a pioneer in that
field (the Deutsch-Schiffman work), and even more so Self, where adaptive JITs were invented.
Strongtalk applied Self's technology to Smalltalk, and made it practical.
Examples: Self needed 64MB, preferably 96, and only ran on Sun workstations. Strongtalk ran in 8MB
on a PC. This mattered a lot. And Strongtalk had an FFI; see below.
Then, Java happened. Strongtalk was faster than Java in 1997, but Strongtalk was acquired by Sun;
the VM technology was put in the service of making Java run fast.
The Smalltalk component of Strongtalk was buried alive until it was too late. By the time I finally got it
open-sourced, bits had rotted or disappeared, the system had no support, and the world had moved on.
And yet, the fact that the Smalltalk community took almost no interest in the project is still telling.
Imagine if all the engineering efforts sunk into the JVM had focused on Smalltalk VMs.
It's also worth dwelling on the fact that raw speed is often much less relevant than people think.
Java was introduced as a client technology (anyone remember applets?). The vision was programs
running in web pages. Alas, Java was a terrible client technology. In contrast, even a Squeak
interpreter, let alone Strongtalk, had much better start up times than Java, and better interactive
response as well. It also had much smaller footprint. It was a much better basis for performant client
software than Java. The implications are staggering.
On the one hand, Netscape developed a scripting language for the browser. After all, Java wouldn't cut
it. Sun gave them permission to use the Java name for their language. You may have heard of this
scripting language; it's called Javascript.
Eventually, people found a way to make Javascript fast. Which people? Literally some of the same
people who made Strongtalk fast (Lars Bak), using much the same principles.
Imagine if Sun had a workable client technology. Maybe the Hot Java web browser would still be
around.
On the other hand, the failure of Java on the client led to an emphasis on server side Java instead.
This seemed like a good idea at the time, but ultimately commoditized Sun's product and contributed
directly to Sun's downfall. Sun had a superb client technology in Strongtalk, but the company's
leadership would not listen.
Of course, why would they? They had shut down the Self project some years earlier to focus on Java.
Two years later, they spent an order of magnitude more money than it cost to develop Self, to buy back
essentially the same technology so they could make Java performant.
Interaction with the outside world.
Smalltalk had its unique way of doing things. Often, though not always, these ways were much better
than mainstream practice. Regardless, it was difficult to interact with the surrounding software
environment. Examples:
FFIs. Smalltalk FFIs were awkward, restrictive and inefficient. After all, why would you want to reach
outside the safe, beautiful bubble into the dirty dangerous world outside?
We addressed this back in the mid-90s in Strongtalk, and much later, again, in Newspeak.
Windowing. Smalltalk was the birthplace of windowing. Ironically, Smalltalks continued to run on top
of their own idiosyncratic window systems, locked inside a single OS window.
Strongtalk addressed this too; occasionally, so did others, but the main efforts remained focused on
their own isolated world, graphically as in every other way. Later, we had a native UI for Newspeak as
well.
Source control. The lack of a conventional syntax meant that Smalltalk code could not be managed
with conventional source control systems. Instead, there were custom tools. Some were great - but
they were very expensive.
In general, saving Smalltalk code in something so mundane as a file was problematic. Smalltalk used
something called file-out format, which is charitably described as a series of reflective API calls, along
with meta-data that includes things like times and dates when the code was filed out. This compounded
the source control problem.
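To illustrate, a method filed out in the classic chunk format looks roughly like the sketch below: exclamation marks act as chunk terminators, and the stamp: metadata records who filed the code out and when. (The stamp: keyword follows the Squeak convention; details vary by dialect, and the names here are made up for illustration.)

```smalltalk
"A file-out chunk: the first line is itself a reflective message
 (methodsFor:stamp:) sent to the class; the method body follows,
 terminated by exclamation marks."
!Point methodsFor: 'accessing' stamp: 'gb 5/12/1996 14:03'!
x
	"Answer the receiver's x coordinate."
	^x! !
```

The embedded timestamps alone meant that two file-outs of identical code could differ textually, which defeats line-oriented diff and merge tools.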
Deployment. Smalltalk made it very difficult to deploy an application separate from the programming
environment. The reason for this is that Smalltalk was never a programming language in the traditional
sense. It was a holistically conceived programming system. In particular, the idea is that computation
takes place among communicating objects, which all exist in some universe, a "sea of objects". Some
of these objects know how to create new ones; we call them classes (and that is why there was no
syntax for declaring a class, see above).
What happens when you try to take some objects out of the sea in which they were created (the IDE)?
Well, it's a tricky serialization problem. Untangling the object graph is very problematic.
If you want to deploy an application by separating it from the IDE (to reduce footprint, or protect your IP,
or avoid paying license fees for the IDE on each deployed copy) it turns out to be very hard.
The Self transporter addressed this problem in a clever way. Newspeak addressed it much more
fundamentally and simply, both by recognizing that the traditional linguistic perspective need not
contradict the Smalltalk model, and by making the language strictly modular.
The problem of IP exposure is much less of a concern today. It doesn't matter much for server based
applications, or for open source software. Wasted footprint is still a concern, though in many cases you
can do just fine. Avi Bryant once explained to me how he organized the server for the late, great
Dabble DB. It was so simple you could just cry, and it performed like a charm using Squeak images.
Another example of the often illusory focus on raw performance.
So why didn't Smalltalk take over the world?
With 20/20 hindsight, we can see that from the pointy-headed boss perspective, the Smalltalk value
proposition was:
Pay a lot of money to be locked in to slow software that exposes your IP, looks weird on screen and
cannot interact well with anything else; it is much easier to maintain and develop though!
On top of that, a fair amount of bad luck.
And yet, those who saw past that are still running Smalltalk systems today with great results; efforts to
replace them with modern languages typically fail at huge cost.
All of the problems I've cited have solutions and could have been addressed.
Those of us who have tried to address them have found that the wider world did not want to listen -
even when it was in its own best interest. This was true not only of short sighted corporate leadership,
but of the Smalltalk community itself.
My good friends in the Smalltalk community effectively ignored both Strongtalk and Newspeak.
It required commitment and a willingness to go outside their comfort zone.
I believe the community has been self-selected to consist of those who are not bothered by Smalltalk's
initial limitations, and so are unmotivated to address them or support those who do. In fact, they often
could not even see these limitations staring them in the face, causing them to adopt unrealistic
business policies that hurt them more than anyone else.
Perhaps an even deeper problem with Smalltalk is that it attracts people who are a tad too creative and
imaginative; organizing them into a cohesive movement is like herding cats.
Nevertheless, Smalltalk remains in use, much more so than most people realize. Brave souls continue
to work on Smalltalk systems, both commercial and open source. Some of the issues I cite have been
addressed to a certain degree, even if I feel they haven't been dealt with as thoroughly and effectively
as they might. More power to them. Likewise, we still spend time trying to bring Newspeak back to a
more usable state. Real progress is not made by the pedantic and mundane, but by the dreamers who
realize that we can do so much better.
Eppur si muove ("and yet it moves")
48 comments:
A typo found: s/Newpeak/Newspeak/
"Perhaps an even deeper problem with Smalltalk is that it attracts people who are a tad too creative and imaginative; organizing them into a cohesive movement is like herding cats."
This reminds of The Lisp Curse. It's so easy to make a slightly different and potentially slightly better Smalltalk that many people end up trying. The resulting noise makes it difficult for substantial improvements to be recognized as such.
Eh... it just felt like an incoherent ramble. The equivalent of Trump-speak tech writing.
What about static typing? Personally I'm unwilling to write dynamically typed languages except under duress (and I do write a lot of Javascript under duress :-))
Does any statically typed variant of Smalltalk exist?
@Tom Davies: Strongtalk is a statically typed variant of Smalltalk. The types are optional.
Tom,
As Paul points out, Strongtalk had a static type system. That said, the post focuses on the (lack of) adoption of Smalltalk and I don't think static types mattered at all. Javascript, Python, PHP, Ruby (and in the past, Perl) all saw considerable (in some cases, massive) adoption without static types (or performance, for that matter).
I didn't know the story about the proposal to distribute ParcPlace Smalltalk in Sun workstations.
Do you know any references where I can read more details about that story?
I probably heard it verbally from Dave Thomas. I'll ping him and ask. He and Allen Wirfs-Brock are perhaps the people best informed about the history of the Smalltalk industry.
" Alas, most software developers would rather carve their programs onto stone tablets using flint tools held between their teeth than pay for tools" ← It's partly about dollars, and increasingly so with time, as developers lack discretionary budget from their employers. But really it's about the walls that proprietary software throws up; the dollar cost is just a symptom.
Actually, Sun wanted to deploy Smalltalk on embedded systems and not workstations (which was one of Dave Thomas' specialities, so it would be normal for him to have been involved). When the licensing didn't work out they did their own embedded language, which was eventually named Java.
In the same timeframe ParcPlace merged with Digitalk and that raised the entry cost for Smalltalk from $90 to $3000. I complained about that to one of their representatives and the reply was that this was really cheap for the kinds of clients they were interested in. I asked who would their clients hire if students were kept away from the language and they claimed students could use Little Smalltalk.
The initial fragmentation was something Smalltalk shared with Lisp and Forth. I blame Xerox for not making the licensing terms public. Some people have told me I could have licensed it for $20. If that was the case, then I was stupid to have worked on my own version. Others have mentioned values over $100K, in which case I did the right thing.
A fatal blow to commercial Smalltalk was when the merged Digitalk/ParcPlace hired a CEO from the food industry (since it had worked so well for Apple and IBM). He was dismayed that his new company was developing something called Smalltalk when his golf buddies would talk about nothing but some Java thing. The obvious smart thing to do was to reinvent his company as the premier Java shop in the world! Never mind that Borland, IBM and even Sun were losing a ton of money on Java.
Hi Jecel. I'll quibble about a few things. "same timeframe". Java was out by the time of the merger. The merger was, AFAIK, a disaster, as there was no synergy between the products, nor were they ever integrated (to this day!). The Java CEO thing I never heard of. I know the merged company was sold for a song (well 2-3 million $) to a customer who needed to keep it alive, and owns it to this day. The owners never grasped that they had a world class platform in their hands, and could/would not invest in it properly.
Interesting article, thanks. I'm reasonably familiar with APL history. When you mention JIT techniques in APL, are you referring to the 1970 PhD dissertation of Philip Abrams at Stanford (SLAC)? Thanks, C.
Charles; Honestly, I didn't have anything that specific in mind; I just remember the citations for JITs pointing toward APL. Thanks for the reference - I am a big APL fan and see a great future for its ideas as well.
Konrad; I enjoyed reading the Lisp Curse. That said, I don't think this was the problem for Smalltalk. If anything, I was always puzzled by how little language experimentation was done in Smalltalk; and a good thing too, because most such experiments do more harm than good.
The original APL\3000 compiler, developed by HP in 1977, is documented in this 1979 ACM article. It was in fact already adaptive. The "hard compiler", which was run when a line of APL was first encountered, assumed that the types and dimensions of all the variables referred to would always be the same, and emitted a prologue to check this contract followed by the code for the line; this permitted all iterations over a dimension to use inline constants rather than looking at components of the array shape. (APL scalars are arrays with 0 dimensions and 1 element; types are numbers and characters at the user level, but bits, integers, floats, and characters to the implementation.)
If the prologue's tests failed, the "soft compiler" was run on the line, which assumed only the type and the number of dimensions (called "rank" in APL) of each variable was fixed (this was necessary in order to have nested loops per dimension). If a soft prologue failed, the soft compiler was run again.
Both compilers generated bytecode, which was far more compact on the 16-bit HP 3000, but their principles were the same as those of later JITs. The system also used lazy evaluation (known as "beating" and "dragging") internally in order to minimize memory consumption by intermediate values, though user programs were not able to detect this strategy.
I first heard of the concept of dynamic, as needed, compilation in a 1976 Software: Practice and Experience paper titled "Throw-away Compiling" by P. J. Brown https://doi.org/10.1002/spe.4380060316
From the abstract: "...This combines the merits of compilation and interpretation. If enough storage is available for a program, it can be compiled in the normal way; if not, the program is stored in a concise intermediate form and compiled dynamically at run‐time, making use of whatever storage is available. When this storage runs out, the previously compiled code is thrown away and the storage is re‐used..."
This paper made a strong enough impression upon me when I read it shortly after publication that it is one of the few papers from that time that I distinctly remember. My recollection (damn paywalls) is that he described its use in an implementation of Basic.
the throw-away compiling comment is from Allen Wirfs-Brock
John; thanks for the reference. I'm currently more interested in APL than you might think.
Allen: thanks for that pointer. I'm eagerly awaiting your promised post on Smalltalk history.
I was at ParcPlace when the SUN thing happened - in fact I was the engineering manager at the time. Jecel is correct that the anticipated deal was for SUN to use PPS Smalltalk for some variety of system tools relating to embedded things - though there was also talk of making admin tools for workstations. It was the sales 'team' that screwed it up by demanding stupid amounts of money.
So very many organisational things went wrong at PPS during and after the push to IPO. Hiring an IBM sales dweeb to replace Adele as CEO was just the first insanity. The merger essentially killed both companies. Then the 'suicide note'... sigh.
I do take umbrage at the suggestion that Smalltalk failed by being expensive when other development tools were free - that simply isn't how it was in the early/mid 90's. Windows was expensive; Microsoft compilers were expensive. Green Hills, Lattice ... all expensive basic compiler packages with terrible debugging tools. Higher end tools (and I'm blanking on names - Kee was one?) were *much* more expensive than VW - $50k wasn't unusual. VW was ~$3000 for 'workstations' and ~$700 for Mac/Windows/OS/2.
A significant factor in the 'failure' of Smalltalk was and still is the sheer idiocy of a lot of people in the software world. The lunatics that think writing code with a dumb text editor like vi or nano or emacs, to dump into a dead text file, run through a compiler/linker/etc, try to run, peer at a poorly thought out printf 'debug log' or use a barely capable debugger.. good grief. Just no. That isn't software development, that's abuse.
The good news (and boy do we need some!) these days is that you can currently run fully capable Smalltalk systems (Squeak, Pharo, Cuis, VA Smalltalk, etc, etc) on a $35 Raspberry Pi 4 that will run at around 5 orders of magnitude faster than the best workstations of that era. And if you need more memory than a mere 2Gb, pay $55 for a 4GB or $75 for an 8Gb version.
Tim,
Thanks for the history and insight. Of course, the fact that people simply failed to appreciate how wonderful Smalltalk was (and is) is key. Yet, in the face of human folly, one must adapt. C was free; C++ was free. Many other things were free for academics and students. So I still feel strongly that the business model was a big part of the problem. Of course, as I say, there are multiple factors and in an ideal situation, one might isolate them. Most train wrecks require multiple (i.e > 2) failures. I listed 6. Most important is to look to the future and try to do better.
The biggest problem IMO is analogous to the Lisp curse. Highly mutable environments don't scale in human terms, because they're too flexible.
Giving a lot of expressive power to a developer is great. To a team of developers is fine. To an organization of teams, a lot less excellent, because abstractions evolve differently in different teams, and people can't be swapped between teams as easily, or hired and made productive.
This is why there's a conservatism of frameworks and libraries. Nobody wants to waste time learning some weird thing which is only used in one workplace; and you can't hire for people who understand your weird stuff. So you want inexpressive systems with a lot of lingua franca, so that the amount of innovation is strictly curtailed and the power of individual developers to do a lot of damage (= write code that takes talent to understand) is limited.
It's much easier and more importantly, quicker, to hire 10 mediocre devs than 1 dev who will thrive in such an environment. And actually I think the multiplier is bigger than that, 20x or 50x. So it's a real uphill struggle to promote systems with a lot of power.
Is the Lisp Curse still an issue today? Both Lisp and Smalltalk were developed in a distant past, before the Internet. Building a consensus-oriented community was difficult back then. Clojure, for example, seems to have escaped the Lisp Curse, in spite of sharing all the "required" characteristics with older Lisps.
Well, it's important to remember that C & C++ *weren't* free except by the side route of being included (was C++?) as part of a typical workstation, and workstations were expensive professional machines back then.
Some fascinating examples gleaned from the wayback machine's stash of Byte editions
1990 - Zortech C++ expensive enough that they didn't mention the price in adverts! MS 'Quick C' (which I remember as barely functional) $139. Watcom C $1300! Also important to point out that you had to pay (much) extra for any useful libraries much beyond clib.
1992 - Metaware High C - $795
Anyway, just as a general cool aside - google "wayback machine byte" for hours of nostalgic fun. They have the Aug'81 smalltalk issue for example.
Tim,
If your Unix workstation (however costly) comes with a C compiler (as it must), the marginal cost of C is zero. As I say, that's just part of the picture. We had free C++ on our machines when I was in grad school in the late 80s/early 90s. Smalltalk was nowhere to be found. Tangent: In the summer of 89 I managed to get hold of a copy of PPS, when I had a job at Evans & Sutherland (I was supposed to use Standard ML, but had already seen that at the University and was not a fan). As a rule it was very hard to find, when it should have been easy.
As promised, I've written a response to Gilad's post: http://www.wirfs-brock.com/allen/posts/914
It ended up being longer than I expected but was fun.
Enjoy,
Allen
If we can trust Wikipedia, GCC (GNU C compiler) was released in 1987. By 1990, GCC supported thirteen computer architectures, and was outperforming several vendor compilers.
I may be some months late to the discussion, but is there a way to unify the following statements:
"Perhaps an even deeper problem with Smalltalk is that it attracts people who are a tad too creative and
imaginative; organizing them into a cohesive movement is like herding cats."
"Real progress is not made by the pedantic and mundane, but by the dreamers who realize that we can do so much better."
My observation is that a standard, while a great thing to have which I wouldn't be caught dead without, is a pretty pedantic and mundane thing. In the case of a programming language standard, be it a hypothetical standard for Smalltalk-80 systems or the non-hypothetical Common Lisp standard, it would be an amalgamation of different behaviours provided by old systems which generally make sense, and rarely anything new.
But past the language, I think people look for consensus or cohesion or a standard where it shouldn't exist, and then make up stuff like a "Lisp curse" when it doesn't work out.
For example, I read someone saying that the "curse" affected two array processing libraries, one which was supposed to look exactly like Python's numpy, i.e. eagerly evaluated with lots of hand-coded special cases; and one which was more like the aforementioned APL\3000 in that it was very lazy about computing, rather building up a dataflow graph to decide what it didn't have to compute, and provided only a few primitives (map, reduce, transform, etc) which it could optimize.
I can't see how the two approaches to writing an array processing library would be unified. The examples you provided of inconsistencies in ST-80 implementations are more fundamental and definitely should be fixed, but most incompatibilities have a right to be incompatible. And even in those "damage-limiting" languages like Java, you get to pick IDEs, build tools, frameworks, so on and so forth with far more combinations than one person could learn. So I am not awfully bothered by it.
A major unsolved problem in computing is to find the right level at which to standardize. In my opinion, the language-as-a-platform level is not the best one. It's too big, including many different aspects that are nearly orthogonal, but which you can only get as a pack.
Your array package example is a nice illustration (BTW, I suppose the first one is numcl, but what is the second?). I agree that the two approaches are sufficiently distinct in their trade-offs that they should exist in parallel. Which they can because they are libraries, not language features (as in APL).
But I want more aspects of a language to be pluggable items. A numeric tower, for example. There's a place for high-performance numerical stacks, offering only fixnums and floats. There's also a place for a more elaborate tower with bignums and rationals. And for yet more elaborate ones including arbitrary-precision floats, computable reals, or other more exotic variants. It's a choice I want to make for a specific software system, without at the same time committing to a specific syntax, library ecosystem, and development tools. I want numeric towers to be standardized components with multiple robust implementations optimized for various platforms, and combine such a component with other components such as a syntax/parser (or several), to make up a problem-specific programming system.
Konrad: The latter library is Petalisp for which there are even two GPGPU backends (disclosure: I wrote the first one). But plugging in language components is interesting!
Usually the Lisp way to optimization, be it in numeric operations or even generic dispatch (c.f. sealable-metaobjects) is to declare types and have the compiler figure out that you are constrained to floats or machine integers, or that only these methods could be applicable given inferred types respectively. But that does not cover when you want more stuff which is (possibly) harder to optimize. Well, generic dispatch is handled by a meta-object protocol, but that is a unique occurence in Common Lisp.
One non-problem is that putting things into libraries and not language standards could be leaving the user at the "tyranny of a single implementation" (hey, there's an Alan Kay quote no one has done to death yet!), but the tower that is being programmed against should be enough of a standard that implementations can be swapped out without much of a hassle. Another is that the "language" is now suddenly larger, but few people use all the features of monolithic programming languages, and libraries bottom out at something well defined (which is another reason I don't feel scared of making up abstractions).
Yes, in Lisp there always is a way to tweak things the way you want, which is what we all (?) love about Lisp. And yet, I don't think you can easily get a C-like numeric tower in Common Lisp, because as far as I know, there is no way to remove the automatic promotion of fixnums to bignums. That leaves implementing a distinct numeric tower with modular integer arithmetic and defining a new DSL via macros. Possible but not a pleasure.
More importantly, I'd want my pluggable numeric towers to be usable across languages, because some people just don't want to use Lisp and prefer, say, Python. The programming systems of the 1970s to 1990s, including Common Lisp and Smalltalk, were designed to be foundations of all-encompassing computing universes. That didn't work out. People will never agree in a single language, except perhaps in very narrowly defined application domains. Agreeing on a small number of distinct standardized numeric towers looks much more doable to me.
I know at least SBCL generates code for machine-word-sized arithmetic if you explicitly tell it to compute mod 2^n, and Robert Strandh's call-site optimization would allow for passing unboxed values through function calls. But, yes, that isn't a substitute for an extensible or portable numeric tower.
The mere fact that Python and JavaScript are two of the most popular programming languages on the planet is evidence that Smalltalk can make a comeback, theoretically. Dynamic typing notwithstanding.
And while many languages have borrowed ideas from Smalltalk, Smalltalk remains one of the simplest and easiest programming languages. Its syntax is so simple, you can learn it all within 15 minutes! You cannot say the same about Python, JavaScript, Ruby, Golang, Elixir, Lua, etc.
I mention this because I belong to the school of thought that programming languages ought to be really, really small and simple. And I'm in good company with advocates like the late Per Brinch Hansen and Niklaus Wirth.
Language simplicity offers many benefits. Low cognitive load is among the most important. This lends itself to fewer errors and better reliability.
While Smalltalk developers are not fungible, Smalltalk's renowned productivity should count for a lot. When you can write application software 3-5X faster than in Python, Java, JavaScript, C++, C#, and Ruby, that represents a HUGE cost savings.
Smalltalk's IDE is a major contributor to this productivity. Thus, it makes little sense to employ other, more conventional tools like Visual Studio, IntelliJ, Emacs, etc.
While Smalltalk appears not to be evolving much, this is understandable. Given that the language is syntactically so small and simple, any changes are likely to increase its size and complexity, thus compromising Smalltalk's fundamental qualities.
However, this has not prevented platforms like Pharo and VAST from continuing to innovate and improve.
I like your comment about Smalltalk being a better Docker than Docker. This is absolutely right and should be considered one of Smalltalk's strengths.
Allen Wirfs-Brock wrote, "Technologies that have failed in the marketplace seldom get revived." True, but I'm working hard to revive it nevertheless. I believe Smalltalk can return to the spotlight it enjoyed in the early 1990s. Perhaps I'm like Don Quixote, but I shan't give up so easily.
There are many reasons that contributed to the downfall of Smalltalk, but there's one thing that I don't see many people mentioning: the Smalltalk image - yes, I mean the image-based system. Name me a popular/mainstream language that is image-based - of course, you can't find any. The issue with the image-based system is that it's extremely difficult to deploy, and it interacts with the outside world poorly.
A successful language must be file-based; that was proven many years ago. Newspeak is like an improved and refined version of Smalltalk, avoiding some of its pitfalls while adding many great features. Unfortunately, it made the same mistake Smalltalk made: Newspeak is also image-based. Newspeak could've been a lot more successful had it chosen to be file-based instead. If anything, this is a lesson for future language designers and implementors.
I disagree. Image-based development can be a strength. Smalltalk is renowned for its productivity for this reason.
I mentioned Docker. Smalltalk's image is a lot like a Docker container. Virtualization is a powerful tool for deployment purposes.
Modern Smalltalks like Pharo and VAST are fully capable of interacting with the external world.
File-based development is so antiquated. It's like programming with stone knives and bearskins.
Would Java count as a popular language? A Java native image is an image with a launcher, just saved as a native executable.
Actually, even Gilad Bracha himself identified the pitfalls of Smalltalk images; he wrote a blog post back in 2009 describing what went wrong with Smalltalk's image system:
https://gbracha.blogspot.com/2009/10/image-problem.html
I don't know whether Newspeak can escape the problems that Smalltalk images suffered from. But it's evident that the image-based system is a big reason for Smalltalk's downfall, and a language that relies 'exclusively' on an image mechanism is never going to work. The fact that you can't find a mainstream language that uses an image-based system is enough to show that what I am saying is correct.
Image-based development does have some shortcomings, but so does file-based development. In software development, everything has pros and cons. Nothing is perfect.
So you pick and choose on the basis of what suits you best. I'll take Smalltalk's pros over the cons of Python/JavaScript/Java/etc.
The key to overcoming the cons of image-based development is backup, backup, backup. Including the source code. You should be doing this in file-based development anyway.
That's how and why Smalltalkers have remained loyal to this great language.
Gilad and Tim: I think the big issue was that Smalltalk was expensive for universities. The reason that C syntax is ubiquitous is that C was free to universities as part of Unix from the mid-1970s - 15-20 years before free compilers became a thing anywhere else - and it infected the minds of those students, who in due course steered the companies they ended up at.
Imagine if Smalltalk had been similarly free.
Ordland: SAP's R/3 ABAP system, which runs almost all of their applications, is practically image-based (the image being a relational DB that holds both all code and all data). The programming language on top has a hideous syntax and some questionable semantics, but the fact that you can log into a running server and inspect/edit code and data with the same basic UI concepts that you also use to do your accounting/stock management/invoicing is quite powerful. It also comes with a distributed version control system, so that you can push changes that you make on the qa/staging/prod server back into the development system.
That tech stack isn't popular/mainstream everywhere, but there are a lot of payroll runs every month that it enables and a lot of money is made with that technology.
@Richard Kenneth Eng and Felix Leipold:
The image problem is that it inherently requires your entire ecosystem to be built around it, rather than fitting into or interoperating with the existing ecosystem. It's no surprise that languages like Oberon and Forth, which also had similar full-stack systems, never gained traction - at least, not ones whose full-stack system you cannot opt out of. You can read this post for more insights:
https://news.ycombinator.com/item?id=5607350
On the contrary, we see languages like Kotlin and TypeScript becoming successful and making it to the mainstream due to their seamless interop with the JVM and JS ecosystems. You can use any JVM library in Kotlin; you can gradually migrate your JS code to TS. If you use a Smalltalk image, you give up all the tooling and libraries you have for anything else. In an alternative universe in which the Smalltalk language and programming environment dominated the industry, this might be feasible. But that wasn't the case in our real world, unfortunately.
Despite decades of success from languages such as FORTRAN, COBOL, BASIC, Pascal, and C, Smalltalk managed to carve out a significant niche for itself. Smalltalk reached its zenith in the 1990s to become the second most popular object-oriented language in the world, after C++ and ahead of Objective-C, Object Pascal, CLOS, and Eiffel.
IBM even adopted Smalltalk as the centrepiece of their VisualAge enterprise initiative in 1993.
It was only Java and Sun Microsystems' marketing prowess that brought down Smalltalk towards the end of the 1990s.
Today, Smalltalk's interoperability has vastly improved since the 1990s. Just look at Instantiations' VAST and open source Pharo.
It is not such a headache to use Smalltalk's image-based system, especially when you compare it to using container environments like Docker. Compiler, editor, application, virtual OS platform all rolled into one. This is tremendously convenient.
As a retired software engineer, I've used dozens of editors and IDE tools over the past four decades, and I've always tried to be adaptable and flexible. Adapting to Smalltalk's IDE tools and benefiting from the enormous productivity boost is no hardship at all.
I have to wonder about this current generation of software engineers.
Ordland said: "The fact that you can't find a mainstream language that uses an image-based system is enough to show that what I am saying is correct."
Did you know that writing web-based applications in JavaScript is like using an "image-based" system? The web browser is the virtual OS platform, and the web-based application can be saved and restored just like an image.
Even Java is essentially an image-based system. Like Smalltalk, it executes byte code in a self-contained portable environment ("write once, run everywhere").
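The save-and-restore aspect of image-based systems can be illustrated with a toy analogy in Python (not actual Smalltalk machinery, and a real Smalltalk image also snapshots code and tools, not just data): the live object graph is serialized wholesale and later resumed, instead of being rebuilt from source files on every run.

```python
import pickle

# A toy analogy for image-style persistence: snapshot a live object and
# restore it later, picking up exactly where the program left off.
# (Only data is captured here; a Smalltalk image captures the whole system.)

class Counter:
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1

counter = Counter()
counter.increment()
counter.increment()

snapshot = pickle.dumps(counter)    # "save the image"
restored = pickle.loads(snapshot)   # "resume where we left off"
print(restored.count)               # 2
```

In a real deployment the snapshot would go to disk or over the network, which is precisely where the interop questions in this thread come in: the snapshot is only meaningful to a runtime that understands it.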
Interlisp, which has recently returned from the dead, is a hybrid file/image system. You don't edit files directly, but they remain the One Source of Truth: the image can be rebuilt from them. When you define something in Interlisp, you say what file you want it to be part of, and the system remembers where it belongs after that, so that when you change the definition, the file is rewritten. The Smalltalk image, on the other hand, has bits that were set before 1980 and aren't reconstructible from the Big Log File; worse yet, you can't find anything in that file without a working image.
It isn't the image itself that's the problem - it is in fact a wonderful feature, as long as you don't make it so you cannot easily extract your program from the image (the post mentions this by the way).
Newspeak never had the associated difficulties with deployment because the language and code are well defined and modular, and so easily separated from the image.
Current Newspeak doesn't have an image. The Squeak-based version ran in a Squeak image. Future Newspeaks may have an image again. Regardless, they won't have the same problems one associates with Smalltalk images. It isn't about images or files or databases or however you persist things - it is about separating code (intent) from heap (extent) when desired.
Gilad, you make a valid point. However, separating the program from the image hasn't been a major issue for most Smalltalkers, especially over the past four decades. If it were such a huge pain point, then Cincom, Instantiations, and Pharo would've done something to mitigate the issue.
No programming language is perfect. If this is the biggest complaint about Smalltalk, then surely it's not a deal breaker.