Not sure how that works. I think Python bites, so I do not use it; I do not labor over detailed treatises to post on comp.lang.python. But it is a great treatise, so I will promote it some here.
We have perhaps yet another measure of how great is Lisp: even its haters are its fans. They come to boo, but come they do. Like Gorgeous George and his understudy, Muhammad Ali, Lisp puts ticket buyers in seats as much to be hated as used.
WJ on comp.lang.lisp wrote:
Worshippers of CL (COBOL-Like) are the most slavish punks that
the world has ever known. The hypertrophied CL hyper-spec is
their Holy Bible.

Dying is easy, comedy is hard. Humor must have within it some truth for it to work. The laugh comes from the accuracy followed by a sharp turn to a put-down, the sharp turn being the key. If Lisp is not like COBOL there is no sharp turn and the put-down becomes mere name-calling. Sticks and stones and all that. Please try not to try to be funny again.

We need the spec when we want to get fancy with format, 'mkay?

They are the worst enemies that Lisp has. (And they never win.
Only the most mediocre of programmers are willing to slave over
the CL hyper-spec.)
And yes, it is actually a Religion to these degenerates.
Don't believe me?
Kenny Tilton:
We are the Priesthood. Offerings of incense or cash will do.
The only thing I do not use is series. Hmmm, I should go look that up.

Here's what the experts say about COBOL Lisp:
"an unwieldy, overweight beast"
For most of you, yes. Lisp is for smart people."intellectual overload"
That explains the smell. Sorry, I forget the author of that one."did kill Lisp"
He went from Lisp to Constraints and butchered that, from there to Java and can take a lot of credit for that steaming pile of turd, and has now abandoned that failure to work on... Fortran?"A monstrosity"
"ignores the basics of language design"
"killed Lisp"
"sucks"
"an aberration"
"the WORST thing that could possibly happen to LISP"
"incomprehensible"
"a nightmare"
"not commercially viable"
"no future"
"hacks"
"unfortunate"
"bad"
In context:
Guy L. Steele, Jr., July 1989:

I think we may usefully compare the approximate number of pages
in the defining standard or draft standard for several
programming languages:

    Common Lisp   1000 or more
    COBOL          810
    ATLAS          790
    Fortran 77     430
    PL/I           420
    BASIC          360
    ADA            340
    Fortran 8x     300
    C              220
    Pascal         120
    DIBOL           90
    Scheme          50

50? Check again.

Yeah, Graham said that, too: a language has to fit in your head. I am thinking he did not program enough to learn it really well*. And he prolly did not have the hyperspec an F1 away either.

* I am pretty sure a constraint on a language I intend to use all the time should not be that I have to be able to remember it all when I am not using it all the time.

-----

Brooks and Gabriel 1984, "A Critique of Common Lisp":

Every decision of the committee can be locally rationalized
as the right thing. We believe that the sum of these
decisions, however, has produced something greater than its
parts; an unwieldy, overweight beast, with significant costs
(especially on other than micro-codable personal Lisp
engines) in compiler size and speed, in runtime performance,
in programmer overhead needed to produce efficient programs,
and in intellectual overload for a programmer wishing to be
a proficient COMMON LISP programmer.

You know Lisp is the perfect language precisely because it lost its momentum (died) and lives on. Other languages need their Next Big Thing momentum.
-----
Bernard Lang:
Common Lisp did kill Lisp. Period. (just languages take a
long time dying ...) It is to Lisp what C++ is to C. A
monstrosity that totally ignores the basics of language
design, simplicity and orthogonality to begin with.

The French are still pissed off about Lance Armstrong!
-----
Gilles Kahn:
To this day I have not forgotten that Common Lisp killed
Lisp, and forced us to abandon a perfectly good system,
LeLisp.
Come on, we have "goto". What else does he want?
-----
Paul Graham, May 2001:
A hacker's language is terse and hackable. Common Lisp is not.
The good news is, it's not Lisp that sucks, but Common Lisp.
Historically, Lisp has been good at letting hackers have their
way. The political correctness of Common Lisp is an aberration.
Early Lisps let you get your hands on everything.
A really good language should be both clean and dirty:
cleanly designed, with a small core of well understood and
highly orthogonal operators, but dirty in the sense that it
lets hackers have their way with it. C is like this. So were
the early Lisps. A real hacker's language will always have a
slightly raffish character.
Can I respond to all these "CL killed Lisp" quotes by pointing out that they are wrong? I am developing The World's Greatest Algebra software on Windows 8 (after 7 and Vista and XP), pushing to GitHub, pulling onto Linux, and away we go*. Standards avoid the disaster of Scheme, which brilliantly recreated the problem CL had to fix: disparate implementations leading to unshareable code.

* Okay, I am using compilers from the same vendor, but I am also using a ton of open source written mostly for SBCL.
Organic growth seems to yield better technology and richer
founders than the big bang method. If you look at the
dominant technologies today, you'll find that most of them
grew organically. This pattern doesn't only apply to
companies. You see it in sponsored research too. Multics and
Common Lisp were big-bang projects, and Unix and MacLisp
were organic growth projects.
-----
Jeffrey M. Jacobs:
I think CL is the WORST thing that could possibly happen to LISP.
In fact, I consider it a language different from "true" LISP.
*****
Common LISP is the PL/I of Lisps. Too big and too
incomprehensible, with no examination of the real world of
software engineering.
... The CL effort resembles a bunch of spoiled children,
each insisting "include my feature or I'll pull out, and
then we'll all go down the tubes". Everybody had vested
interests, both financial and emotional.
CL is a nightmare; it has effectively killed LISP
development in this country. It is not commercially viable
and has virtually no future outside of the traditional
academic/defense/research arena.
Common Lisp worked, but it got the blame for Minsky's (inter alia) over-promising and under-delivering on AI.
Constrained? Where? Who? How? I am writing amazing code and having a blast between bong hits. Where's the problem?
-----
Paul Graham:
Do you really think people in 1000 years want to be
constrained by hacks that got put into the foundations of
Common Lisp because a lot of code at Symbolics depended on
it in 1988?
I think DW meant "looking more like Scheme" as a good thing. Ouch. I used Arc for a while and it had one namespace and it was not fun.
-----
Daniel Weinreb, 24 Feb 2003:
Having separate "value cells" and "function cells" (to use
the "street language" way of saying it) was one of the most
unfortunate issues. We did not want to break pre-existing
programs that had a global variable named "foo" and a global
function named "foo" that were distinct. We at Symbolics
were forced to insist on this, in the face of everyone's
knowing that it was not what we would have done absent
compatibility constraints. It's hard for me to remember all
the specific things like this, but if we had had fewer
compatibility issues, I think it would have come out looking
more like Scheme in general.
I think some of these smart guys are thinking themselves into too many rules and regulations and way too much "should". I am not that bright, so I am free to just get on with the programming, and when one is programming one knows what works and what does not. One namespace does not.
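For anyone who has never felt the two namespaces in action, a small sketch (the function name FIRST-TWO is made up for the example):

```lisp
;; A Lisp-2 gives each symbol separate value and function cells,
;; so a variable named LIST happily coexists with the function LIST.
(defun first-two (list)               ; LIST the variable...
  (list (first list) (second list))) ; ...LIST the function
(first-two '(a b c))                  ; => (A B)

;; #' and FUNCALL cross between the namespaces explicitly:
(funcall #'first-two '(1 2 3))        ; => (1 2)
```

In a Lisp-1 that parameter would shadow the function and the body would blow up; here nobody even notices.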
-----
Daniel Weinreb, 28 Feb 2003:
Lisp2 means that all kinds of language primitives have to
exist in two versions, or be parameterizable as to whether
they are talking about the value cell or function cell. It
makes the language bigger, and that's bad in and of itself.
Except there is no non-Loop code posted to comp.lang.lisp that I cannot make clearer and faster with loop. And it took me a very long time to come around to loop, so I should know. Yes, the syntax can make you weep*. Until you learn it. Then it is an indisputable win.

* Hey, it is a DSL. Ya gotta learn it, like any language. And it is a DSL for iteration, something that comes up more than a little in programming, so the effort (and the DSL itself) are justified.
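The win is in how the facets compose. A sketch (EVENS-AND-TOTAL is a made-up name) using stepping, a filter, and two accumulations side by side, all standard LOOP clauses:

```lisp
;; WHEN gates the body; COLLECT and SUM run in parallel into
;; named accumulators; FINALLY hands both back.
(defun evens-and-total (limit)
  (loop for n from 1 to limit
        when (evenp n)
          collect n into evens
          and sum n into total
        finally (return (values evens total))))

(evens-and-total 10)  ; => (2 4 6 8 10), 30
```

Write that with DO and tell me which one you can still read next week.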
-----
Paul Graham:
I consider Loop one of the worst flaws in CL, and an example
to be borne in mind by both macro writers and language designers.
I wonder if Dan wrote enough code. Loop never surprises me, now that I have learned not to close over loop variables.
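That gotcha deserves a demo. The standard leaves LOOP free to reuse one binding for its iteration variable, so what closures over it return is implementation-dependent; rebinding inside the body is the portable fix (a minimal sketch, variable names mine):

```lisp
;; Surprise version: every lambda may close over the SAME binding
;; of I, so they can all see its final value.
(defparameter *thunks*
  (loop for i from 0 below 3
        collect (lambda () i)))

;; Fixed version: LET makes a fresh binding per iteration before
;; the closure is created, so each lambda keeps its own I.
(defparameter *fixed*
  (loop for i from 0 below 3
        collect (let ((i i)) (lambda () i))))

(mapcar #'funcall *fixed*)  ; => (0 1 2)
```

Once that LET is a reflex, loop stops surprising.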
-----
Dan Weinreb, one of the designers of Common Lisp:
... the problem with LOOP was that it turned out to be hard to
predict what it would do, when you started using a lot of
different facets of LOOP all together. This is a serious problem
since the whole idea of LOOP was to let you use many facets
together; if you're not doing that, LOOP is overkill.
Hey, that's the guy that did IF*, a DSL for conditions!
-----
From: John Foderaro
Newsgroups: comp.lang.lisp
Subject: Re: the "loop" macro
Date: Sun, 26 Aug 2001 10:51:26 -0700
I'm not trying to join a debate on loop. I just wanted to present
the other side of [the issue so that] the intelligent people can
then weigh the arguments on both sides.
I'm not suggesting that loop can be fixed either by adding
parenthesis or coming up with ways of indenting it to make it
understandable. It's a lost cause.
Why does all the above miss its mark? Because it is 2013, all that was written when CL got created back in the eighties, and we are still using Lisp, and when we do we use Common Lisp. All those quotes are by people who saw the politics and the big spec and were unhappy about the compromises and the size of the resulting language. But I use it all (except series), so they were wrong about the size.
Meanwhile the raison d'être of unification was achieved. Yobbos are writing open source on SBCL and I am going to use it to change the way the world learns math, atop AllegroCL. A brave attempt to resume the fragmentation (Scheme) failed, mostly because they were wrong about language design, but also because of -- wait for it -- fragmentation. Perfect.