The Performance of Lisp, or Why Bean Counters Need Bigger Bags of Beans

One critic, echoing the voices of thousands, asks:

"Surely if Lisp makes a programmer N times more efficient, then it would be easy to study scientifically. Have there been any studies comparing productivity?"

I wish I could reply with the immortal words of Babbage.  But alas I cannot.  Sadly, I can indeed "apprehend the kind of confusion of ideas that could provoke such a question." It is the confusion which inevitably follows from a bean-counter mentality applied to subjects which are fundamentally alien to it and the culture it has been shaping since the start of the Industrial Revolution.  For instance, take the very word "productivity."  It is meaningless alone.  One must always ask, productivity at what?  If the productivity in question is in the act of being a pyramid builder following orders and bringing to life the ideas of others, then without question the productivity of many a Lisp programmer is something less than heroic.  This has nothing to do with Lisp, and everything to do with the kind of people who find it appealing.  Now, if we're talking about productivity at undirected invention, the story of Lisp becomes a very different one.  To the now-scorned Lispers we owe garbage collection, all the roots of the modern GUI, dynamic typing, lexical scope, the very idea of a single-user computer workstation, and countless other innovations which so many people believe to have simply dropped from the sky (or worse yet, to have been invented by the sociopathic hucksters who have managed to weld their names to these marvels through Machiavellian business acumen.)

The Lisp Machine (which could just as easily have been, say, a Smalltalk machine) was a computing environment with a coherent, logical design, where the "turtles go all the way down." An environment which enabled stopping, examining the state of, editing, and resuming a running program, including the kernel. An environment which could actually be fully understood by an experienced developer.  One where nearly all source code was not only available but usefully so, at all times, in real time.  An environment to which we owe so many of the innovations we take for granted. It is easy for us now to say that such power could not have existed, or is unnecessary. Yet our favorite digital toys (and who knows what other artifacts of civilization) only exist because it was once possible to buy a computer designed specifically for exploring complex ideas.  Certainly no such beast exists today - but that is not what saddens me most.  Rather, it is the fact that so few are aware that anything has been lost.

It is indeed possible to measure the productivity of Lisp - just as it is possible to measure the productivity of, say, the scientific method.  You would not attempt to weigh the latter by setting up a gladiatorial match between stereotypical African shamans and stereotypical American physicists (randomly plucked from Los Alamos, say.)  Yet there is no end to similar suggestions for measuring the value of programming systems.  I am no mystic, and believe that productivity could in principle be measured (once you define the word.) However, you must measure it on the time scale where it is relevant. The productivity of inventors and the tools of invention cannot be measured in the same manner as the efficiency of two competing pieces of construction equipment - especially not in a society which routinely deprives inventors of the fruits of their labor and awards everything to the slick and hypersocialized.

Yes, my dear bean counters, you can measure productivity.  I would no more ask you to stop in your attempts at its measurement than I would ask mosquitoes to stop sucking blood.   You can measure productivity - even of Lisp; even of political philosophies.  You will simply need to secure a very large bag of beans - one deep enough to hold a bean for every twist and turn of a century of tinkering, politicking, and everything else associated with the messy business of successfully thinking new thoughts.

This entry was written by Stanislav, posted on Sunday, December 27, 2009, filed under Hot Air, Lisp, NonLoper, Philosophy.

3 Responses to “The Performance of Lisp, or Why Bean Counters Need Bigger Bags of Beans”

  • Justin Grant says:

    We live in a world where many think that the only useful definition of computability is the Turing machine. No doubt Turing himself would have found this completely baffling and disturbing were he alive today, not to mention all the other Computer Science greats.

    This scalar mindset, and the active denigration of free imagination and creativity that goes with it, seem based on a deep-rooted fear. That fear is likely artfully cultivated by the bean counters' masters and fully embraced by the bean counters themselves. This kind of mental predisposition has been echoed accurately in much of Eric Hoffer's prose.

    I really enjoy your writing; keep it coming!

  • Sociopathic Huckster says:

    I am sick of this notion among the technorati that Xerox "invented" the GUI, that Apple simply "stole" the idea from them, and that the only reason Apple won was Steve Jobs's magical business acumen.

    Xerox's GUI was not merely more expensive and running on worse hardware than the Macintosh's QuickDraw; the software itself was just objectively worse. The PARC researchers had not solved the problem of overlapping regions, which is what lets windows stack and occlude one another as if they occupied a three-dimensional environment.

    The whole point of the GUI is to simulate the real world more accurately. The real world happens to be in 3D. The real world does not care that graphical frame buffers are only two-dimensional matrices. The real world demands accurate simulation. That was the genius behind Jobs's sociopathy: in the face of engineering that seemed to reject his every move, he had an almost religious adherence to simulating reality. I think this is the whole point behind "building great products".

    More info here: http://www.folklore.org/StoryView.py?story=I_Still_Remember_Regions.txt

    The Macintosh also had the first windowing system that supported marching ants, the double click, and the menu bar.

    "All incremental improvements," Mr. LoperOS says. "Xerox had the roots."

    False. Xerox did not have the roots. The roots of the modern GUI can be traced back to Douglas Engelbart, one of the most innovative geniuses in human-computer interaction.

    So Xerox's Lisp machines had neither the idea nor the execution. What is their purpose in the modern narrative of personal computing history?

    I hate the "Steve Jobs didn't even write code" model of thinking amongst modern engineers that enjoy the ability to earn potential salary increases in the process of learning code. This is an opportunity granted to them only by the various markets that Steve Jobs's company helped create! Modern personal computing? The market was not born by, but rather EXPLODED thanks to the Apple II and the Macintosh. Perhaps, for a short decade as Apple languished in misery thanks to the supreme idiocy of a man named John Sculley, Wintel machines created more software engineering jobs than Apple machines did. But, as soon as Jobs came back, the Internet took off by storm. Most modern web browsers really come down to a fork of WebKit. If you think that most modern web apps rely on fast JavaScript invented by Google, think again. Fast JavaScript is really contingent on the ability to make inline polymorphic assumptions thanks to JIT compilation of JavaScript to native machine code - a genius idea that was born within Apple for SquirrelFish Extreme, which was marketed as Nitro for Safari. And, after a decade, web apps seem to be being replaced by mobile apps, a market that was started, and in large part is still dominated by (at least when you block for high-end markets), Apple.

    So of course Steve Jobs didn't even write code. If you grew up in Steve's time, you probably wouldn't have written code either. You'd be an accountant (if you're part of the 90% of cogs) or a mathematician/formal linguist (if you're in the 10% of people who actually enjoy writing code).

    Perhaps Jobs himself lacked the literal engineering genius necessary to create the machines, but he sure as Hell was capable of hiring that genius and of using his sociopathy to unlock the genius locked away in his engineers.

    I like that you are so faithful to the Great Man theory of history, even though it is so harsh when it comes to doing away with nice-sounding populist theories: "Every innovative work of mankind has been the product of one – sometimes two, rarely three – minds." In Apple's case, we have Steve Wozniak (the genius behind the engineering superiority of the Apple II), Bill Atkinson (the genius behind the design dominance of the Macintosh), and Jony Ive (the genius behind the forward-thinking philosophy of the iPhone). Then we have Jobs, who possessed the meta-genius of somehow attracting all this genius to one place, which is the most important skill set to have as a business leader in the information age. Whether that skill set self-selects for sociopathy is irrelevant. It was an important skill set.

    In any case, this elucidates why programmers are so obsessed with ALGOL-like programming languages. Culturally speaking, we're still stuck in the Stone Age of PL theory: a day and age when we were forced to use assembly language to describe our programs. Lisp just wasn't fast enough to power QuickDraw's regions; the raw Motorola instruction set was.

    In some fields, we are literally still stuck in this technical Stone Age. Video games have to refresh a scene graph that consists of millions of polygons with thousands of textures at varying levels of detail, deal with dynamic lighting that can bounce off an arbitrary number of surfaces, and do it all 60 times a second, on client hardware that they can't even choose. They can't deal with the niceties of "single user workstations" or "REST-like interfaces". They most certainly cannot deal with garbage collection, or with the sorts of assumptions compilers lose when you rid yourself of static typing. And while they can stand to adopt some modular, functional, Lisp-y practices here and there, they cannot take it to its logical extent - at a certain level, you just have to mutate the frame buffer.
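
    To make that last point concrete, here is a minimal Common Lisp sketch (all names are hypothetical, not taken from any real engine): the shading rule is a pure function, but the innermost loop still writes into a preallocated frame buffer in place, allocating nothing per pixel and so leaving nothing for a garbage collector to pause on.

    (defconstant +width+ 320)
    (defconstant +height+ 200)

    ;; One preallocated buffer, reused every frame; never consed afresh.
    (defparameter *frame-buffer*
      (make-array (list +height+ +width+) :element-type '(unsigned-byte 8)))

    (defun render-frame (shade)
      "Fill *FRAME-BUFFER* by calling the pure function SHADE on each (x y)."
      (dotimes (y +height+)
        (dotimes (x +width+)
          (setf (aref *frame-buffer* y x) (funcall shade x y)))))

    ;; Usage: the "scene" is described functionally; mutation stays inside RENDER-FRAME.
    (render-frame (lambda (x y) (mod (+ x y) 256)))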

    In other fields, I agree with your frustrations. The obsession with ALGOL is merely cultural. I think the biggest failure of programming is JavaScript. The original goal of JavaScript was to provide a dynamic language that allowed for quick scripting of interactive elements in the web browser. There is no reason that couldn't have been a Lisp. Object prototyping is basically atom/attribute anyway; there's very little in JavaScript's incredibly flexible object system that could not have been represented in cons cells. Fortunately, this culture quickly corrects itself. For most web apps, you can write ClojureScript -> JavaScript -> faster JavaScript (Closure compiler) -> native machine code (V8, Nitro). For the significantly more performance-intensive apps, we will have WebAssembly, which brings us back to ALGOL: it just so happens that von Neumann machines speak ALGOL better than Lisp, so when talking von Neumann happens to be more important than talking human-readable math, ALGOL wins out.
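
    As a rough illustration of the cons-cell claim (a sketch under my own naming, not any real JavaScript engine's representation), here is prototype-style property lookup built from nothing but conses in Common Lisp: each "object" is an association list of slots plus a link to its prototype, and lookup walks that chain the way JavaScript property lookup does.

    ;; An "object" is just a cons: (slot-alist . prototype-or-NIL).
    (defun make-obj (&optional prototype)
      (cons nil prototype))

    ;; Add or shadow SLOT on OBJ itself, like `obj.slot = value`.
    (defun obj-set (obj slot value)
      (setf (car obj) (acons slot value (car obj)))
      value)

    ;; Look SLOT up on OBJ, then delegate up the prototype chain.
    (defun obj-get (obj slot)
      (cond ((null obj) nil)
            ((assoc slot (car obj)) (cdr (assoc slot (car obj))))
            (t (obj-get (cdr obj) slot))))

    ;; Usage: a "point" prototype, and an instance that shadows one slot.
    (let* ((proto (make-obj))
           (p     (make-obj proto)))
      (obj-set proto 'x 0)
      (obj-set proto 'y 0)
      (obj-set p 'x 42)
      (list (obj-get p 'x)    ; => 42, found on the instance itself
            (obj-get p 'y)))  ; => 0, delegated to the prototype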

    Some cultural problems persist, frustratingly so. The only reason that Python is considered "easy to learn" is that programmers are so obsessed with C-like syntax, for historical reasons, that Lisp looks weird to them ... even though the math behind Lisp is far more natural, intuitive, and powerful than the logic behind C (in the famous words of Jamie Zawinski, a "glorified PDP-11 preprocessor"). But when you consider the functionalities of Python that make it so prevalent in academic programming - dynamic typing, garbage collection, first-class functions, etc. - you find them rooted in Lisp. In fact, the biggest problems with Python are the things that it DIDN'T take from Lisp: object orientation is famously hampered by Python's historically limited lexical scoping, a lack of macros leaves operator overloading restricted to strange underscore syntax, there is an arbitrary difference between expressions and statements, and a focus on iteration over recursion results in a lack of tail-call optimization.
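
    For a concrete sense of what the missing macros cost, here is a minimal Common Lisp sketch (the macro is my own invention, not part of the standard): a WHILE loop defined in a few lines as an ordinary user macro, the sort of syntactic extension that in Python only the language implementers get to make.

    ;; New control syntax as a plain user definition, expanding into DO.
    (defmacro while (test &body body)
      `(do () ((not ,test)) ,@body))

    ;; Usage: prints 3, 2, 1.
    (let ((n 3))
      (while (> n 0)
        (print n)
        (decf n)))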

    --

    P.S. I think the Great Man theory is, in any formal interpretation, literally wrong, because the subjective qualia of "innovation" can only be perceived relative to a surrounding population. In other words, the only reason some men are Great is that many, many others perceive them to be Great. Ergo, their Greatness depends on many people, which is a fundamental contradiction of the Great Man theory.

    But as soon as you allow yourself to escape this formal literalism, I think the Great Man theory holds. In fact, I think in any system that supports fiat currency, your net worth is your numerically measured Greatness. In this sense, Steve Jobs is Great by definition.

  • Sociopathic Huckster says:

    (P.P.S.: I should mention that PARC's machines were famously some of the last Lisp machines sold commercially.)
