Notation as a tool of thought - how far have we come?
Recently I compared an old Windows DOS command for deleting all the files in a directory with a scripted equivalent, and I noticed the "modernised" version required about 50 times as many keystrokes to achieve the same outcome.
Are these additional keystrokes enhancing productivity? Are they serving a purpose that has been quantified, for example by reducing coding error rates?
The issue as I see it is that a computer language written primarily to accommodate the von Neumann architecture - rather than the way we think - forces us to solve problems by juggling three problem domains in our heads: (a) the original problem, (b) the problem restructured to fit the von Neumann architecture, and (c) the mapping rules needed to translate back and forth between (a) and (b).
As a rule of thumb, the more efficient a computer language's notation - in the sense that it lets you work directly with the problem at hand - the lower the coding overhead. Lower coding overhead makes problem solving more tractable and reduces both the amount of code and the room for error. It should certainly not increase the workload!
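To make the three-domain juggling concrete, here is a deliberately tiny sketch (Python chosen purely for illustration, not as part of the comparison): the same problem solved once in machine-shaped terms and once in something close to its own terms.

```python
# Problem: "sum the squares of the even numbers in xs".
xs = [3, 1, 4, 1, 5, 9, 2, 6]

# (b) Restructured for the machine: indices, mutable state, control flow.
total = 0
for i in range(len(xs)):
    if xs[i] % 2 == 0:
        total += xs[i] * xs[i]

# (a) Stated almost directly in the problem's own terms.
total2 = sum(x * x for x in xs if x % 2 == 0)

assert total == total2  # same outcome, very different cognitive load
```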
Which computer language, in your opinion, makes for the most efficient problem-resolution platform - one that enables you to think directly in terms of the original problem, without having to do cross-domain problem juggling?
For interest, I did a byte count of 38 different solutions to Conway's Game of Life and came up with the following stats (a sketch of how such a count can be reproduced follows the list):
J : 80,
APL : 145,
Mathematica : 182,
Ursala : 374,
JAMES II : 394,
SETL : 559,
ZPL : 652,
PicoLisp : 906,
F# : 1029,
Vedit macro language : 1239,
AutoHotkey : 1344,
E : 1365,
Perl 6 : 1372,
TI-89 BASIC : 1422,
Perl : 1475,
PureBasic : 1526,
OCaml : 1538,
Ruby : 1567,
Forth : 1607,
Python : 1638,
Haskell : 1771,
Clojure : 1837,
Tcl : 1888,
R : 2031,
Common Lisp : 2185,
OZ : 2320,
Scheme : 2414,
Fortran : 2485,
C : 2717,
Ada : 2734,
D : 3040,
C# : 3409,
6502 Assembly : 3496,
Delphi : 3742,
ALGOL 68 : 3830,
VB.NET : 4607,
Java : 5138,
Scala : 5427
(See e.g. http://rosettacode.org/wiki/Conway's_Game_of_Life)
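For anyone wanting to reproduce or extend the count, a minimal sketch, assuming each Rosetta Code solution has been saved to a local file (the filenames below are hypothetical):

```python
import os

# Hypothetical local copies of the Rosetta Code solutions.
solutions = ["life.apl", "life.j", "life.py", "life.java"]

# Print the solutions from smallest to largest, with their sizes in bytes.
for path in sorted(solutions, key=os.path.getsize):
    print(f"{os.path.getsize(path):6d} bytes  {path}")
```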
Comments?
Please be specific about the merits of the notational approach taken by the language you critique, and do so from a reasonably high level - preferably with direct project experience.
You used Conway's Game of Life as an example, and no language can solve it more elegantly or efficiently than APL. The reason is its full array/matrix manipulation through very powerful single- and multi-character operators.
See: Whatever Happened to APL? and my story about my combinatorics assignment that compares APL with PL/I.
If you're talking about "efficient" in terms of keystrokes to solve a problem, APL will be tough to beat.
Your byte count of 145 for the APL solution to Conway's Game of Life is wrong - that was a very inefficient solution you were looking at.
This is one solution:
(APL one-liner, shown as an image; source: catpad.net)
That's 68 bytes and beats the J solution. I think there are other APL solutions that are even better.
Also see this video about it.
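For readers without an APL keyboard, here is a rough sketch of the same array-style formulation in Python/NumPy - not the APL solution itself, just the shape of the approach: the entire next generation is a single expression over rotated copies of the grid, with no explicit loops over cells.

```python
import numpy as np

def life_step(grid):
    # Count the eight neighbours by rolling the grid in every direction.
    # np.roll wraps around, matching the toroidal rotates (⌽ ⊖) used in APL.
    neighbours = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    # A cell is alive next step with exactly 3 neighbours,
    # or with 2 neighbours if it is already alive.
    return (neighbours == 3) | ((neighbours == 2) & (grid == 1))

glider = np.zeros((8, 8), dtype=int)
glider[1, 2] = glider[2, 3] = 1
glider[3, 1:4] = 1
print(life_step(glider).astype(int))
```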
Those seizing on "keystrokes as a measure of efficiency - considered harmful" are missing the point indicated by the title of this discussion.
A well-designed, notationally-dense language like APL or J gives us high-level computational concepts embedded in a simple, consistent framework which allows us to think more easily about complex problems. The small number of keystrokes is a side-effect of this.
Are you really comparing del /s *.* with an implementation of the same? I bet the author of the script could have shelled out and executed the built-in del command. It's impossible to say why he didn't do that, but he could have had a good reason.
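For illustration, here is what shelling out might have looked like - a minimal sketch in Python with a hypothetical target directory; note that del is a cmd.exe built-in, not an executable, so it has to go through cmd /c:

```python
import subprocess

# Delete everything under a (hypothetical) scratch directory, recursively
# and without prompting, by delegating to the built-in command.
subprocess.run(["cmd", "/c", "del", "/s", "/q", r"C:\tmp\scratch\*.*"],
               check=True)
```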
I'm all for less ceremony and as low a cyclomatic complexity as is reasonable, but keystrokes seem like a really bad metric of how easy the code is to read (Perl - why are you looking at me like that?) or how well it maps to the problem domain. Just change all your variable names to one character and you save lots of keystrokes! Or make the code totally unreadable through some advanced code golfing. Not very productive.
Are these additional keystrokes enhancing productivity? Are they serving a purpose that has been quantified, for example reducing coding error rates?
I think in part they are. Even if I typed 10 times faster, a project wouldn't be done even 1% faster in the end. But take a look at bat files - they look like spaghetti. No, more like ramen.
Most of the time, when I code some quick-n'-dirty script, I have to run it just to check whether it actually works. But in a modern language I rarely face stupid surprises at run-time (like deleting the wrong file because the script got invoked from a network drive or via a shortcut).
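As one example of the kind of defensive step a modern language makes cheap (the paths here are hypothetical), you can anchor file operations to the script's own resolved location instead of whatever the current working directory happens to be:

```python
from pathlib import Path

# Resolve the script's real directory - stable even when invoked via a
# shortcut or from a mapped network drive.
script_dir = Path(__file__).resolve().parent
target = script_dir / "build" / "output.log"   # hypothetical file

if target.exists():
    target.unlink()  # delete exactly the file we meant, nowhere else
```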