Why are polymorphic messages so much more powerful in practice than the combination of unification and backtracking?
One way of looking at the history of programming language design is that there was a revolution with the introduction of the subroutine. Twenty or thirty years later, two refinements of the subroutine call were seriously considered:
- Polymorphic messages
- Unification and backtracking
I've just been programming in Prolog after a 20-year hiatus, and I'm realizing just how incredibly powerful unification and backtracking are. However, polymorphism won. Why?
My experience with Prolog is that it works excellently when backtracking search is a good fit for your problem domain. However, if that is not the case, much of the programming effort goes into fighting the backtracking search, bending it to one's own needs.
So my take on the situation is that backtracking search is too narrow a language feature to be generally useful. Had unification been paired with a more flexible search strategy, we might have seen a different course of development.
I have done a LOT of programming in Prolog, and while I love the language for its expressive power, I have to agree with svenningsson that as soon as you try to do anything non-declarative it becomes a puzzle to use the ! operator (cut, which discards backtracking options) in the right places, and that is extremely error-prone.
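To make the danger concrete, here is the textbook "red cut" example (my illustration, not code from the thread): max/3 below looks right and answers the usual queries correctly, but the cut silently hides the guard that other calling modes still need.

max(X, Y, X) :- X >= Y, !.
max(X, Y, Y).

% ?- max(3, 5, M).   gives M = 5, as intended
% ?- max(5, 3, 3).   wrongly succeeds: the first head cannot unify (X would have
%                    to be both 5 and 3), so the second clause, which relies on
%                    the cut and has no guard of its own, quietly succeeds

Getting every such cut right across a large program is exactly the kind of puzzle described above.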
Though not perfect, one language that elegantly manages to combine backtracking and non-declarative (imperative/side-effecting) code is Icon. It basically isolates expressions which can backtrack naturally from the general program structure (e.g. statements), so that it is relatively easy to see that backtracking will not lead to unexpected results, as it can in Prolog. I am not sure why more languages are not based on this execution model; my guess is that the majority of programmers are really stuck in sequential thinking, and backtracking is confusing.
I am not sure if backtracking compares with polymorphism directly. To me, it is more of an alternative to closures, as the #1 use for closures in most languages is custom iteration (think map/filter/fold etc). For example, in Icon I can say:
every write(10 < (1 to 10) * 2)
which takes the sequence of numbers 1 to 10, multiplies each by two, keeps only the results greater than 10, and prints them ("every" is a bit like a repeat-fail loop in Prolog). In a list/closure-based language, I have to write:
for (filter (map [1..10] \x.x*2) \x.x>10) \x.(write x)
granted, this is a bit contrived, as list comprehensions and currying can simplify this, and not all Icon code is that terse, but you get the idea. The Icon version is not only obviously more expressive, but also has the advantage that it does not use intermediate lists and is "lazy" in a co-routine sense, i.e. it will write out the first result before getting to do *2 on the next element. This means it allows you to write code that is equally efficient even if you end up not using all of the results generated.
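For comparison, here is a rough Prolog analogue of the Icon one-liner above (my sketch, not the poster's code, assuming a Prolog with between/3 and forall/2 such as SWI-Prolog): between/3 generates the numbers one at a time on backtracking, and forall/2 drives the loop much like Icon's "every", so no intermediate list is ever built.

% generate 1..10 on backtracking, double each, keep only results over 10
doubled_over_ten(X) :-
    between(1, 10, N),
    X is N * 2,
    X > 10.

% ?- forall(doubled_over_ten(X), (write(X), nl)).
% prints 12, 14, 16, 18 and 20, one per line, as each solution is found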
A guess: message-passing was more easily tacked on to the then-popular practices and gradually absorbed. A gradual acceptance of Prolog's ideas would have taken a vehicle like Oz, which was only invented in the 90s, something like 20 years behind Smalltalk. Since Oz claims to support both procedural and logic programming in one clean package, I see no reason in principle the world couldn't have taken that path if it had known how at the right time. Instead the paradigm got tied to a more burn-the-diskpacks attitude and the 5th Generation disappointment.
(I haven't tried Mozart/Oz myself so far. I have played with Prolog.)
The cut operator immediately came to my mind: Prolog is beautiful and concise just as long as you want to, and can, program declaratively. Once you start using the cut operator (i.e. cutting off all backtracking at that position), you have to think through overly complex situations to find a good solution or to understand other people's code (or your own old code).
So the difficulty of controlling backtracking seems to be the consensus here, with 3 out of 4 answers (as of 12th of Aug. 2011) stating it (+1 to both Aardappel and svenningsson).
Backtracking is very hard to debug when all the system will tell you is "No"! There are better Prolog compilers, but most people have had enough of the language by the time they have been forced to use a poor compiler at university.
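As a small illustration of that complaint (my own example, not from the answer): a misspelled atom makes the whole query fail with a bare "no"/"false", and nothing points at the subgoal where matching went wrong.

parent(tom, bob).
parent(bob, ann).

grandparent(X, Z) :- parent(X, Y), parent(Y, Z).

% ?- grandparent(tom, anne).   typo: anne instead of ann
% false.                       the system just says "no", with no further hint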
UI code is where most programmers start, and it is what the user sees; Prolog never seemed like a good fit for writing UI code.
Before polymorphic messages became normal, people were using function pointers to get the same effect, so it was a smaller step.
Prolog code is still hard for most programmers to read, whereas most programmers could understand at least some C++ code given that they know C.