APL is the only language I've ever dreamt about writing (as in: I could see the characters); I'd dreamt about programming in the past, but those dreams were usually what I would categorize as a nightmare: desperately trying to fix a bug that I couldn't figure out.
Due to my affinity for the language, and my wish to have worked in its heyday (would love to have an APL gig someday), I have been exposed to various writings and recordings of Ken Iverson. I've also been exposed to a few of Dijkstra's thoughts on APL.
I have to say that Iverson generally comes across as a very generous and curious individual while Dijkstra seems to have been a miserable ass. Maybe, given the lens, I've not given Dijkstra a proper chance to demonstrate a more positive attitude, so I'm open to any suggestions of writings where he doesn't seem like such a grump.
> Maybe, given the lens, I've not given Dijkstra a proper chance to demonstrate a more positive attitude, so I'm open to any suggestions of writings where he doesn't seem like such a grump.
Kinda hard to find where Dijkstra praised something (except Algol 60).
One funny example: he called FORTRAN "an infantile disorder", though he said this about the team behind it: "At that time this was a project of great temerity and the people responsible for it deserve our great admiration."
On LISP: "LISP has jokingly been described as 'the most intelligent way to misuse a computer'. I think that description is a great compliment because it transmits the full flavor of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts."
Alan Kay on Dijkstra: "Arrogance in computer science is measured in nano-dijkstras."
"Kinda hard to find where Dijkstra praised something (except Algol 60)."
Hamilton Richards, who was one of Dijkstra's colleagues at the University of Texas, told me in an email that Dijkstra was impressed by the work of Richard Bird on functional programming.
Interesting. Bird-Meertens formalism? That was directly influenced by APL. In the broader scope — algebra of programs — Dijkstra heavily disliked Backus’ FP.
He liked to make fun of emerging CS; it was his thing, and he did it pretty well. He wasn't a "miserable ass", he was a man of a different culture. "Positive attitude", as you used it, is a late Americanism, FWIW. In many parts of the world we often deliver our criticisms in a similarly caustic manner, and we enjoy it while remaining resilient and optimistic.
It doesn't mean that he was always right, of course.
I suppose the thing that stuck in my craw is what I perceived as a personal slight:
> Your writings made me wonder in which discipline you got your doctor’s degree.
It’s ambiguous, so I can give the benefit of the doubt. However, if intended as an insult, I believe that goes beyond speaking frankly about one’s feelings about a programming language in a manner consistent with one’s culture.
I see what you mean, and in fact I'm sure it is an insult. However, sensitivity to it is still a matter of the particular culture and time. Some cultures tend to consider such things an attack on someone's social status no matter the context. Others may take them as banter, or emotional coloring if you like.
Another example: Americans have seemingly normalized public swearing using what was originally sex-related slang, while in many other cultures it is a very dangerous minefield. And let's not forget that the whole field at that time was orders of magnitude smaller, which implies a less formal spectrum of accepted styles of communication.
I often see a lot of folks here on HN who are too eager to pass judgement on Dijkstra without having read and understood his writings, or without giving much thought to the context and times which shaped his thinking. All of his opinions should be thought over and appropriately adapted for use in our context. He was a wide-ranging polymath philosopher (technical and non-technical) with a laser-like focus on exactitude through Mathematics. Radical approaches must be pushed, particularly when they are hard to learn and understand. If his language was biting, then so much the better for promulgating his cause, viz. "correctness over easy", "mathematical reasoning underpinning everything", "structure in aid of the previous", "proper language syntax/design in aid of the previous", etc.
As an example, read EWD 1036: On the cruelty of really teaching computing science (https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD103...) in its entirety and carefully. The thesis is that computing is "radically novel" relative to other forms of human endeavour, and hence it cannot be taught by simplifying to known analogues and ad hoc procedural operations. You need the rigor and exactitude of Mathematical Logic to build a "Scientific" basis.
Excerpts:
... What is a program? ... I prefer to describe it the other way round: the program is an abstract symbol manipulator, which can be turned into a concrete one by supplying a computer to it. After all, it is no longer the purpose of programs to instruct our machines; these days, it is the purpose of machines to execute our programs.
So, we have to design abstract symbol manipulators. We all know what they look like: they look like programs or —to use somewhat more general terminology— usually rather elaborate formulae from some formal system. It really helps to view a program as a formula. Firstly, it puts the programmer's task in the proper perspective: he has to derive that formula. Secondly, it explains why the world of mathematics all but ignored the programming challenge: programs were so much longer formulae than it was used to that it did not even recognize them as such. Now back to the programmer's job: he has to derive that formula, he has to derive that program. We know of only one reliable way of doing that, viz. by means of symbol manipulation. And now the circle is closed: we construct our mechanical symbol manipulators by means of human symbol manipulation.
...The point to get across is that if we have to demonstrate something about all the elements of a large set, it is hopelessly inefficient to deal with all the elements of the set individually: the efficient argument does not refer to individual elements at all and is carried out in terms of the set's definition.
...
Back to programming. The statement that a given program meets a certain specification amounts to a statement about all computations that could take place under control of that given program. And since this set of computations is defined by the given program, our recent moral says: deal with all computations possible under control of a given program by ignoring them and working with the program. We must learn to work with program texts while (temporarily) ignoring that they admit the interpretation of executable code.
Another way of saying the same thing is the following one. A programming language, with its formal syntax and with the proof rules that define its semantics, is a formal system for which program execution provides only a model. It is well-known that formal systems should be dealt with in their own right, and not in terms of a specific model. And, again, the corollary is that we should reason about programs without even mentioning their possible "behaviours".
... a program is no more than half a conjecture. The other half of the conjecture is the functional specification the program is supposed to satisfy. The programmer's task is to present such complete conjectures as proven theorems.
He then goes on to describe the importance of teaching Predicate Calculus via a simple imperative language whose semantics are given by proof rules. This language is described in EWD 472: Guarded commands, non-determinacy and formal derivation of programs - https://www.cs.utexas.edu/~EWD/transcriptions/EWD04xx/EWD472... There is no compiler for this language, so the student is forced to write the program and its proof by hand. However, this radical approach has not been accepted by educators or industry.
He concludes:
Teaching to unsuspecting youngsters the effective use of formal methods is one of the joys of life because it is so extremely rewarding. Within a few months, they find their way in a new world with a justified degree of confidence that is radically novel for them; within a few months, their concept of intellectual culture has acquired a radically novel dimension. To my taste and style, that is what education is about. Universities should not be afraid of teaching radical novelties; on the contrary, it is their calling to welcome the opportunity to do so. Their willingness to do so is our main safeguard against dictatorships, be they of the proletariat, of the scientific establishment, or of the corporate elite.
That's a whole lot of text that does absolutely nothing to justify the stuff the parent comment is talking about. Let's be honest here: he was really smart and got away with being kind of an asshole a lot because of that (probably because at least some of the time his arrogance wasn't pointed at anyone and was instead genuinely funny, like his comment about not being able to lie about being "the world's greatest computer scientist" because he was under oath on jury duty).
I personally don't think that being right ever excuses the method of delivery; there's always a way to be just as right and convincing without being an asshole, but it requires conscious choice. Dijkstra probably would have found this sort of insistence tedious and beneath him, and he wouldn't have been wrong about being smarter than me and probably anyone else who would have expressed this to him, but like I said, I don't think that excuses it.
The books you mention deal with "Software Abstractions" i.e. Procedural Abstraction, Data Abstraction, ADT, Patterns etc. They are at a much higher level than Dijkstra's concepts.
What Dijkstra is talking about is the Set Theory/Logic operations underlying all of them. This is why he was sceptical of OO/Functional etc. styles of programming: they were all mere syntactic sugar over the underlying Mathematics, obscuring the essence of Programming. For him, all computation is basically a matter of manipulating values from a Set as a whole using Logical Expressions, i.e. Predicate Calculus.
Again, from EWD 1036:
Before we part, I would like to invite you to consider the following way of doing justice to computing's radical novelty in an introductory programming course.
On the one hand, we teach what looks like the predicate calculus, but we do it very differently from the philosophers. In order to train the novice programmer in the manipulation of uninterpreted formulae, we teach it more as boolean algebra, familiarizing the student with all algebraic properties of the logical connectives. To further sever the links to intuition, we rename the values {true, false} of the boolean domain as {black, white}.
On the other hand, we teach a simple, clean, imperative programming language, with a skip and a multiple assignment as basic statements, with a block structure for local variables, the semicolon as operator for statement composition, a nice alternative construct, a nice repetition and, if so desired, a procedure call. To this we add a minimum of data types, say booleans, integers, characters and strings. The essential thing is that, for whatever we introduce, the corresponding semantics is defined by the proof rules that go with it.
Right from the beginning, and all through the course, we stress that the programmer's task is not just to write down a program, but that his main task is to give a formal proof that the program he proposes meets the equally formal functional specification. While designing proofs and programs hand in hand, the student gets ample opportunity to perfect his manipulative agility with the predicate calculus. Finally, in order to drive home the message that this introductory programming course is primarily a course in formal mathematics, we see to it that the programming language in question has not been implemented on campus so that students are protected from the temptation to test their programs. And this concludes the sketch of my proposal for an introductory programming course for freshmen.
The language mentioned above is described in EWD 472 (also see Wikipedia for a detailed explanation - https://en.wikipedia.org/wiki/Guarded_Command_Language). Notice the total absence of syntactic sugar and fancy computation models, i.e. no complex Types, no mapping to lambda calculus or any other mathematical model (since a computer is fundamentally imperative over a state space), etc. It is just the absolute basic mathematics of Set Theory and Logic.
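For a taste of the notation, here is the classic maximum-of-two-numbers example, sketched after the linked Wikipedia article (not a quote from EWD 472). The [] separates guarded alternatives; when both guards hold (x = y), either branch may be taken, and the nondeterminism is harmless:

```
if x ≥ y → m := x
[] y ≥ x → m := y
fi
```

The point is that the program text is just a logical case analysis: each guard states exactly the condition under which its assignment establishes m = max(x, y).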
Note that Tony Hoare had systematized the algebra for this in his "Hoare logic" via pre/post conditions for an executable statement - https://en.wikipedia.org/wiki/Hoare_logic
Dijkstra then went one step further by giving calculational rules to derive a program by simple logical manipulation of symbols. In this case we reason backwards from the output of the program (known a priori from its functional specification), i.e. from postcondition to precondition, using the idea of a predicate transformer. This is his weakest-precondition (wp) calculus, where each executable program statement defines a total function Pre = wp(Statement, Post), i.e. the programmer's code must satisfy the predicate transformer that transforms the postcondition into the weakest precondition. See the Wikipedia page on Predicate Transformer Semantics for a nice detailed explanation - https://en.wikipedia.org/wiki/Predicate_transformer_semantic...
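A minimal worked example of the wp rules (standard textbook forms, paraphrased rather than quoted from Dijkstra):

```
wp(skip, R)     =  R
wp(x := E, R)   =  R with every free occurrence of x replaced by E
wp(S1; S2, R)   =  wp(S1, wp(S2, R))

e.g.  wp(x := x + 1, x > 0)  =  (x + 1 > 0)  =  (x ≥ 0)
```

Reading backwards: x ≥ 0 is the weakest condition under which x := x + 1 is guaranteed to establish x > 0, so the precondition is calculated from the postcondition rather than discovered by testing.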
I wonder if EWD would have had the same opinion if he were alive today, with every Unicode font having the APL characters immediately available on the screen.
Did he feel the language design was bad, or would having TTF fonts able to show "rho", "iota", and "grade up" have removed one or more of his objections?
Very interesting flip! For much of his life, Dijkstra opposed functional programming. He even more strongly criticised FP from Backus and APL from Iverson, which are both very functional/function-level.
As he said, Java is a mess and any sensible person would oppose the switch from Haskell to Java. I am almost sure he never used either of them. He might have read about them, but I highly doubt he ever ran any of them on a computer.
As for the high-level status of Haskell and APL: both languages are very mathematical. Haskell goes deep into the abstract realm of computation, while APL tackles a very raw form of computation. Semantically, Haskell is far more high-level. In terms of economy of notation and unified concepts, APL has no match.
The Vector package (the one I use most) has boxed and unboxed (efficient, J-like) arrays. With map, fold, scan, fromList/toList, zip and unzip combinators on these arrays, one can have notation as terse (in the operator-count sense) as one wishes.
> He might have read about them, but I highly doubt he ever ran any of them on a computer.
Nice goalpost shifting. Anyway, you also neglected that he was a major proponent of structured programming and the author of the letter "Go To Statement Considered Harmful". The idea that he would oppose high-level languages is not based in reality. Specific languages, yes, but not because they are high-level, as your silly original comment claimed.
Dijkstra was highly influential in theoretical (proofs, algorithms) and practical (the spec and compiler for Algol 60) computer science. But in reality he used his fountain pen disproportionately more than the computer.
APL suffers from the same apparent problems as Perl: friction from an unconventional syntax that's hard to understand without knowing the language beforehand, and when faced with competition, people took the path of least resistance.
* Out of all people, and especially in the newer generations, it is increasingly uncommon to find someone with a desktop or even a laptop.
* Out of them, very few decide to do anything with it besides checking mail, social media, the web, or playing games.
* Out of them, very few decide to learn a programming language.
* Out of them, very few decide to learn anything besides Javascript or maybe Python.
* Out of them, very few decide to learn anything besides Java/C#/C++, learn algorithms, or learn tools like Vim or Emacs.
* Out of them, very few decide to learn anything besides Rust/Go/Haskell/Lisp/Scheme or even Fortran.
* Out of them, very few decide to learn a language with an alien, symbolic notation that resembles a code golfing language, and which may also require them to learn a completely new keyboard layout to type with proficiency.
Not trying to discredit APL's contributions to functional programming and the like, but from the letter it is pretty obvious Dijkstra had little regard for friction. Not saying that he's right to dismiss it outright, though.
Now that I think about it (and I don't know if this exists yet), APL would probably benefit a great deal from having a Scratch-like or Factorio-like visual editor paired with a touch interface. You would drag and drop symbols, and long-pressing a symbol would pop up its definition.
You could also probably do nice things with the symbol "icon blocks" themselves, and provide them with colors or different visualizations to convey different contextual meanings.
That wouldn't help much. People who don't use these languages don't understand that what makes the language different isn't the syntax. There are plenty of dialects that use English words instead of symbols (check out Ivy by Rob Pike, for example).
The difference is much deeper, but the best way to understand it is probably to check out an introduction (there is a lot on youtube).
I'd personally be happy to give an introduction to anyone willing to listen, but this comment field is not the place to do it.
IMO, Perl's downfall was mostly Perl 6, not the language.
Plenty of people wrote plenty of Perl long ago. Yeah, the whole $ business is maybe a bit unintuitive, but it's really the least of the problems. It's easy to get past it.
IMO, the first part of Perl's downfall is that it didn't evolve fast enough. It was good enough that people tried to do big things in it, and then it turned out that wasn't a good idea. Perl's OO, for instance, is kind of a neat hack that turns into a horrible mess in large projects. Large projects also increasingly need verification and safety because debugging costs keep rising, and Perl is, paradoxically, safer at small scales. "use strict" works great in toy scripts and is nigh useless in OO-laden large projects, where "strict" does nothing to check your $this->{foo}->{bar} hash trees. Yes, solutions sort of exist, but they're all ad hoc, you have to plan for them, module writers don't use them, and...
That could have been survivable with the right improvements, but:
The second part is that Perl 6 was terribly planned and took bloody ages to get anywhere. People stopped writing Perl 5 expecting Perl 6 was around the corner, so why invest too much effort when it was clear 6 was going to be incompatible even early on? And it kept not coming out, so Python quickly ate its lunch.
One can appreciate striving for simplicity (a programming language that can be taught and explained with pen and paper), but one must also consider that computers are meta-devices.
Before computers, we could write things only on paper, either with our hands or a typewriter. So, naturally, when computers came about, the way of thinking about programming was very text-driven, with an emphasis on what a typewriter could represent.
But then, code could be written directly with computers, opening up more typesetting possibilities thanks to keyboards not being bound anymore by the mechanical limitations of typewriters. You could add keys and combinations to your heart's desire, and they would be natively digital and unlimited.
Now, with graphics, both 2D and 3D, and a myriad of other HIDs, shouldn't we try to make another cognitive jump?
It's very strange to see handwriting lumped in with typewriting, to be described as limited relative to screens! Iverson notation was a 2D format (both in handwriting and typeset publications) making use of superscripts, subscripts, and vertical stacking like mathematics. It was linearized to allow for computer execution, but the designers described this as making the language more general rather than less:
> The practical objective of linearizing the typography also led to increased uniformity and generality. It led to the present bracketed form of indexing, which removes the rank limitation on arrays imposed by use of superscripts and subscripts.
I think this is more true than they realized at that time. The paper describes the outer product, which in Iverson notation was written as a function with a superscript ∘ and in APL became ∘. followed by the function. In both cases only primitive functions were allowed, that is, single glyphs. However, APL's notation easily extends to any function used in an outer product, no matter how long. But Iverson notation would have you write it in the lower half of the line, which would quickly start to look bad.
I've long been fascinated by this question, probably spurred on by having read Hermann Hesse's _The Glass Bead Game_ (originally published as _Magister Ludi_) when I was impressionably young.
The problem of course is: "What does an algorithm look like?"
Depicting one usually leads into flowchart territory, and interestingly, efforts at that style of programming often strive for simplicity, e.g. the straight-down preference of Raptor or Drakon. Systems which do not enforce that often become a visual metaphor for "spaghetti code".
As a person who uses https://www.blockscad3d.com/editor/ and https://github.com/derkork/openscad-graph-editor a fair bit, and who needs to get Ryven up and running again (or to fix the OpenSCAD layer in his current project, or to try https://www.nodebox.net/ again), this is something I'd really like to see someone be successful with. The most successful exemplar would be Scratch, which I've never seen described as innovatively expressive; I'd love to see such a tool which could make a traditional graphical application.
All those things can be specified in text. Fortress was a language that had the facility to use mathematical notation. It turned out to be not so compelling, IIRC.
We might also consider letting the language semantics invade the editor. Hazel integrates its parser into the text editor, so rather than getting a red squiggly when you break a rule you're just unable to break the rules. It represents code you haven't yet written as a "typed hole" so instead of
1 +
The + would cause the following to appear
1 + <int>
where <int> is the typed hole, reminding you to put an expression there which is an integer. It's perhaps a smaller leap than using shapes and space, but it's one I'd like to feel out a bit sometime.
We do have syntax highlighting these days. And our editors work like hypertext, where I can go to definitions, find usages, get inheritance hierarchies, etc. Quite a ways from your suggestion, but also a few steps removed from a typewriter.
I think any such leap would have to be a really big one to catch on though, due to inertia. Colorforth is not exactly popular, and I can't think of any other examples.
Can Sketchpad do this? (Relatively simple, but it shows what an LLM can do with a sketch and very little prompting; full transcript of further typing included.)
Yes, but there don't seem to be any current implementations which are more than academic exercises (I'd love to be wrong about that and be pointed to something which I could try).
The reason for this is that we've been trying to draw code by hand since 1963, and it doesn't really work out well except in limited domains. Maybe it'll work better with LLMs, though; I guess we'll see.
> Parametric CAD, in my view, is a perfect example of "visual programming", you have variables, iteration/patterning reducing repetition, composability of objects/sketches again reducing repetition, modularity of design though a hierarchy of assemblies. The alignment between programming principles and CAD modelling principles, while not immediately obvious, are very much there. An elegantly designed CAD model is just as beautiful (in its construction) as elegantly written code.
Obviously, it is fitting that a visual product is amenable to a visual approach/solution, so my question is, what programming environment for general purpose is most like to a parametric CAD system?
Yeah I think CAD is a perfect domain for this kind of thing, and IIRC that was one of the original target applications for Sketchpad, where Sutherland demonstrated constraint-based bridge design where the constraints were sketched in.
I agree I don't think LLMs really change the equation much.
For another look at where drawing-based programming has gone, see Dynamicland by Bret Victor. No LLMs required.
It has been a long time since I used APL (in college).
We had APL terminals which had APL keys and would print APL characters. It was significantly more immersive that way.
Looking at this letter, I start to vaguely recall things.
Decades later, I recall the output operator, not shown anywhere here.
⎕←<something>
which would print whatever <something> was. (was I misremembering?)
I do recall using matrix operations in a way similar to the math classes I was taking at the same time. Matrix multiplication, inversion, and dot products seemed more "math oriented" than in other computer languages.
In other computer languages, you had to adapt to the language. For example:
x = x + 1
y = mx + b
In these two statements, one only makes sense in math class, and one only makes sense as incrementing a variable in a computer language.
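The contrast is easy to see today in NumPy, whose array model borrows heavily from APL; a small sketch (nothing APL-specific, just the stock numpy.linalg calls):

```python
import numpy as np

# "x = x + 1" is an assignment, not an equation: it steps the state.
x = 5
x = x + 1  # x is now 6

# Array operations, by contrast, read much like the math-class versions.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.linalg.inv(A)      # matrix inversion
P = A @ B                 # matrix multiplication; A times its inverse is I
v = np.array([1.0, 2.0])
d = v @ v                 # dot product: 1*1 + 2*2 = 5
```

Whole matrices are the values being manipulated, so the code tracks the linear-algebra notation instead of spelling out element-by-element loops.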
I wrote a lot of APL for my undergraduate Senior Project in 1978/1979.
I really enjoyed it because it was fun. You could do an incredible amount of work in a single line of code.
The only problem was, that line would then be almost impossible to read and understand! It could easily be used as a "write-only" language even without a separate obfuscation step.
When I became a professional programmer right after college, I never used it again, and I learned to write code that was readable above all else.
Is this an instance of the maxim that one has to be twice as smart to debug code as to write it?
Are you aware of any APL programs written using Literate Programming?
Apparently there was at least one attempt:
Lee J. Dickey. Literate programming in APL and APLWEB. APL Quote Quad, 23(4):11–??, June 1, 1993. CODEN APLQD9. ISSN 0163-6006.
Perhaps that additional layer of documentation would help? (APL is a language I've always been fascinated by, but never had occasion to more than superficially examine)
As if medieval math notation was not weird enough, people decided to invent APL to be even more bizarre. As a proud Perl5 dev, I totally don't buy it. Neither do I buy into Raku's brave use of all possible Unicode symbols. Perhaps I'm ageing.
What do you mean by "Medieval math notation"? Wasn't it mostly just textual descriptions? (which arguably would not be too far away from the natural language being used to interact with LLMs)
It wasn't until Thomas Harriot's _Artis Analyticae Praxis_ was published that one got anything resembling consistent math notation as folks today would expect, and isn't the symbology used for APL just a formalization and/or extension of prevailing math symbol usage? (which was the point of the whole thing?)
You are correct; most modern mathematical notation emerged during the Renaissance and Early Modern periods (late 15th to 18th centuries). Before that, it was indeed mostly rhetorical.
APL was the first language to have operators for "do this to all that stuff".
They were headed for functional programming. But the syntax was too weird.
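APL's reduction operator / is essentially what later languages call a fold; a rough Python analogy (Python, not APL — the glyph forms are only noted in the comments):

```python
from functools import reduce
import operator

xs = [3, 1, 4, 1, 5]

total = reduce(operator.add, xs)   # APL: +/xs  ("plus over xs") -> 14
prod  = reduce(operator.mul, xs)   # APL: ×/xs  -> 60
big   = reduce(max, xs)            # APL: ⌈/xs  (max-reduce) -> 5
doubled = [2 * x for x in xs]      # APL: 2×xs, applied elementwise
```

The same idea, minus the symbolic notation: one operator applied across "all that stuff" at once, with no explicit loop.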
If you are expecting someone to learn a completely new notational language before you can communicate a basic algorithm, you have gone wrong somewhere.
You could also similarly write down merge sort in pure lambda calculus, which is interesting as an exercise, but not especially useful as working code, or as a way to explain how merge sort works.
Dijkstra's go-to language (pun intended) was Algol 60 (& Pascal) – everything else was shit in his view. Some of his comments:
FORTRAN — "an infantile disorder"
COBOL — "the use of COBOL cripples the mind"
BASIC — students exposed to it are "mentally mutilated beyond hope of regeneration"
PL/I — "the fatal disease"
APL — "a mistake, carried through to perfection"
He liked his languages and programs to be easily traceable with pen and paper. He always wrote programs on paper (and proved their correctness) first, and only then entered them into the computer. REPL-driven development (which APL pioneered) was a foreign concept to him. He would be appalled by LLM code generation.
Read the sources carefully. The Fortran quote is not his; he was quoting it. Also remember that he was talking about the pre-Fortran 77 era. F77 tried to fix some of the criticisms, though it did not fully succeed. Here is a nice dig into the quote: https://limited.systems/articles/dijkstra-fortran/
Another thing to remember is that John Backus, the team lead of the Fortran gang, was on the Algol committee. So these folks knew what they were talking about and spoke to each other periodically. Even Backus said Fortran was not the final interface we should have.
Half-quoted versions of half-baked quotes from the original sources keep spinning around programming circles. These pioneers, even when they disagreed, had pretty precise arguments and very rarely feeling the feelies.
FORTRAN —"the infantile disorder"—, by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.
He told the truth and in turn Fortran corrected its course, but Dijkstra probably didn't change his mind about it.
---
> These pioneers, even when they disagreed, had pretty precise arguments and very rarely feeling the feelies.
The feud between Backus and Dijkstra kinda persuaded me of the opposite.
He liked to be able to reason about programs without running them.
He preferred simpler languages because they contain less irrelevant noise which got in the way of that.
The opening paragraphs, about how people enamoured of a shiny gadget will overlook a terrible interface, immediately bring to mind modern-day LLMs.
I don't find this observation of Dijkstra's to be one of his best. If there is a gadget that does a thing that no other gadget does, what does it even mean for the interface to be "terrible?" How can you even know if the interface is terrible, given that a better one has yet to be invented? Maybe the interface is as good as it can be for the tool in question.
I also don't love your mapping of this observation onto modern LLMs. The interface of an LLM is natural language text, along with some files written in plain text or markdown. Can it be improved? Undoubtedly! But as a baseline, it doesn't seem half bad to me. If it is so terrible, it should not be hard to propose an interface that will be significantly more productive. Can you?
> If there is a gadget that does a thing that no other gadget does, what does it even mean for the interface to be "terrible?" How can you even know if the interface is terrible, given that a better one has yet to be invented? Maybe the interface is as good as it can be for the tool in question.
That's just a taste judgement, and you can decide the interface sucks on a one-of-a-kind item quite easily; people often do.