
Hey. You're completely missing it. The Y combinator shows that full general computation, including recursion and iteration, derives automatically and inevitably from just basic rewrite rules. Obviously it is too raw to be used directly. But if you design/implement any system with rewrite rules, you have provided unbounded power for recursion, and you have opened the Pandora's box of undecidable termination. This means that guaranteed-decidable computation can only occur without rewriting, which is incredibly restricted.

Food for thought.



I do use it directly. For example, in Fexl I define the append function for lists as:

    \append=(@\append\x\y x y \h\t [h; append t y])
Where '@' is the Y combinator.

Note that there's no direct self-reference with the symbol "append" there. I could define it equivalently as:

    \append=(@\loop\x\y x y \h\t [h; loop t y])
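A rough Python analogue of that definition may help (a sketch with illustrative names: Python is eagerly evaluated, so it uses the Z combinator, the eager variant of Y, and it models Fexl lists as ordinary Python lists):

```python
# Z combinator: the eager-evaluation variant of the Y combinator.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# \append=(@\loop\x\y x y \h\t [h; loop t y]) rendered with the
# recursion threaded through the fixpoint parameter "loop":
append = Z(lambda loop: lambda x: lambda y:
           y if not x else [x[0]] + loop(x[1:])(y))

print(append([1, 2])([3, 4]))  # [1, 2, 3, 4]
```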


Why? Is it unreasonably difficult for you to implement recursion via direct self-reference in your language? Or do you just not want to because the Y combinator is there and it's cool?


True, I could implement it in terms of direct self-reference. At one point I used the "==" syntax for just that purpose:

    \append==(\x\y x y \h\t [h; append t y])
Maybe I should resurrect that syntax option. :)

It was trivial to implement. When the parser sees the "==", it just automatically abstracts the symbol "append" from the definition and applies the fixpoint operator (Y combinator).

The only reason I eliminated "==" in the first place was that I was in the throes of using different syntaxes for lazy, eager, and self-referencing definitions. Now I've settled on "=" always meaning eager evaluation, without exception. Then I got hyper-minimalist and said that's it, there's only one syntax for definitions, and it's "=", and if you want self-reference, use "@".

However, the decision to settle in on eager evaluation now frees up "==" once again as an option for self-reference. So I may bring that back.

Thanks for the food for thought.


Postscript: You might well ask: why not just use "=" only, and assume that all definitions are potentially self-referencing?

That's a non-starter because I find that redefinition is the more common intent:

    \x=4
    \x=(* x 5)
    \x=(+ y x)
In those cases I don't want x defined in terms of itself, but rather in terms of the previous definition of x.

That is why I would insist on a special token such as "==" to express the intention of self-reference.


OK, I went ahead and revived the "==" syntax for recursive definitions, so you don't have to use '@' (fixpoint) explicitly.

https://github.com/chkoreff/Fexl/commit/57b841cb6c2347cad147...


>derives automatically and inevitably from just basic rewrite rules.

Sorry, I don't understand what that means.

Also, (lambda x: x(x))(lambda x: x(x)) already gives you an infinite loop, so why do you need a Y combinator to show non-termination? Once you have functions as first-class citizens that you can copy around, you've lost control of termination. Seems pretty intuitive. What's the specific contribution of the Y combinator?
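For what it's worth, you can watch that term diverge in Python itself (a sketch: the interpreter's recursion limit stands in for "runs forever"):

```python
# (lambda x: x(x))(lambda x: x(x)) has no normal form: each rewrite
# step reproduces the original term, so evaluation never terminates.
omega = lambda x: x(x)
try:
    omega(omega)
except RecursionError:
    print("diverged")
```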


">derives automatically and inevitably from just basic rewrite rules."

I said "full general computation, including recursion and iteration, derives automatically and inevitably from just basic rewrite rules". You can't cut a sentence wherever you want and still think you're responding to the original thought.

There is an imprecision in my original comment, but it's not there. If you can stop nitpicking long enough, you may be able to see something that's pretty powerful.


Thanks for the assertion that everybody can understand you when you use terms without defining them. Not everybody here is so lucky as to have the same educational background as you.

What is a rewrite rule? To me, it's something you put in a web server configuration.


My apologies. I was only responding to the previous comment and momentarily forgot about others; sorry about that. Let me try to fix this.

A computation engine based on rewrite rules performs computations, in general, by replacing instances of certain patterns with other patterns. Patterns can be literal or generic. The rules according to which these rewrites are done are the "code" of the computation: they determine what computation the system performs. Different rules can result in a system that computes prime numbers, computes digits of pi, factors large numbers into prime divisors, finds the shortest path visiting all cities exactly once on a given map, renders a 3D scene from a list of objects and coordinates, finds the best potential dating matches for a given user out of a list of many other users, or outputs a Schoenberg-style atonal composition as a waveform with a synthetic piano-like sound.

The lambda calculus is one such computation system. The core element is a lambda expression of the form (\x.y), where "\" is just ASCII for the Greek letter lambda, which is hard for me to type right now. The core thing a lambda-calculus system does is take lambda expressions and apply a simple rewrite rule: wherever there is a sequence of the form (\x.y)z, where x, y, z can be anything, it replaces the whole thing with "y", except that wherever "x" appears inside "y", it writes "z" instead. For example:

  (\x.xx)A
Will be rewritten to:

  AA
Given an initial expression, the engine that computes the lambda expression will apply this rewrite rule repeatedly until no more rewrites can be done. This may or may not reach an end (this is the halting problem: in the general case, it is impossible to know in advance whether the computation will ever terminate).

The rewrite rule is called "beta reduction", by the way, but try to remember the concept, not the name, which is arbitrary and unimportant.
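If it helps, the rule itself fits in a few lines of Python (a toy sketch: terms are nested tuples, and variable capture is ignored, which a real reducer must handle):

```python
# Terms: ("var", name), ("lam", param, body), ("app", func, arg).
def substitute(term, name, value):
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        # A binder with the same name shadows the one being replaced.
        if term[1] == name:
            return term
        return ("lam", term[1], substitute(term[2], name, value))
    return ("app", substitute(term[1], name, value),
                   substitute(term[2], name, value))

def beta_step(term):
    # The rewrite rule: (\x.y) z  ->  y with z written for x.
    if term[0] == "app" and term[1][0] == "lam":
        return substitute(term[1][2], term[1][1], term[2])
    return term

# (\x. x x) A  ->  A A
term = ("app", ("lam", "x", ("app", ("var", "x"), ("var", "x"))), ("var", "A"))
print(beta_step(term))  # ('app', ('var', 'A'), ('var', 'A'))
```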

Using lambda notation you can craft code that will do any of the things above. The code will look like a long lambda expression, which the lambda-calculus engine can work on following the standard rules to end up producing the end result.

The Y combinator is a specific (\x.y) lambda expression which, once applied to some value by the rewriting system, results in a computation being done on that value, plus an extra copy of the Y combinator, thus allowing a new iteration of the same computation. The result is the ability to keep on computing indefinitely, and thus to compute anything requiring arbitrary recursion, such as the factorial of a number, the Ackermann function, the shortest path in an arbitrary graph, or whatever.
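Concretely, in an eager language like Python the directly usable variant is the Z combinator (Y itself diverges under strict evaluation). A sketch with illustrative names:

```python
# Z hands the wrapped function a copy of itself as its first argument,
# so "fact" below never refers to itself by name: all the recursion
# comes from the combinator.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
print(fact(5))  # 120
```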

Although I've never seen Paul Graham explain it, my take is that the accelerator is called Y Combinator because they put some money in, this allows the startups to get to profitability, they get their money back, and they can use it again to fund new startups, thus achieving infinite effective resources from finite initial resources.

PS: the author of the original comment was handwaving and nitpicking because he didn't like being in the position of having missed something (which we all do quite often, incidentally, including myself in the previous comment). I don't like that behavior, and I don't like the time and energy it wastes, hence my dismissive response. Apologies to everyone for not responding constructively.


Exactly! Look at, e.g., primitive recursion for an interesting subset of computation that does not suffer from undecidability.

Most programmers are familiar with regular expressions, too. Those are another interesting subset.
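A sketch of why the primitive-recursive subset terminates: the recursion depth is fixed by the argument before the computation starts, so there is nothing left to be undecidable. In Python (illustrative names, modeling the scheme h(0) = base, h(n+1) = step(n, h(n))):

```python
def prim_rec(base, step):
    # Build h with h(0) = base and h(n + 1) = step(n, h(n)).
    # The loop bound is known up front, so h always terminates.
    def h(n):
        acc = base
        for i in range(n):
            acc = step(i, acc)
        return acc
    return h

add3 = prim_rec(3, lambda i, acc: acc + 1)        # h(n) = n + 3
fact = prim_rec(1, lambda i, acc: (i + 1) * acc)  # h(n) = n!
print(add3(4), fact(5))  # 7 120
```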


What do you mean by "rewrite rules" - are you referring to beta reduction?



