Among the advertised features of Apple’s latest OS update, three in particular caught my attention: “auto-save”, which claims to wipe out the abomination of volatile-by-default documents; “versioning”, which claims to introduce document version-control into the Mac’s normal operations; and “resume”, which promises to re-load a user’s work-state whenever an application is re-started.
On the surface, these new features appear to bring in two of what I believe to be the defining attributes of a non-user-hostile computer system:
The First: Information which entered the machine through deliberate operator action shall never be destroyed or otherwise rendered inaccessible except as a result of deliberate operator action to that end. No operator action shall lead to the destruction of information unless said destruction is the explicit and sole purpose of the action. If all non-volatile storage space that could hold full undo-information for operator-initiated actions is exhausted, the operator shall be informed immediately and given the opportunity to explicitly erase unwanted data or connect additional storage devices, thus preventing unintentional information loss.
and the Third: Volatile storage devices (i.e. RAM) shall serve exclusively as read/write cache for non-volatile storage devices. From the perspective of all software except for the operating system, the machine must present a single address space which can be considered non-volatile. No computer system obeys this law which takes longer to fully recover its state from a disruption of its power source than an electric lamp would.
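The Third Law's demand can be made concrete with a small sketch of my own (not anything Apple or any shipping OS provides). It uses `mmap` to back a region of a process's address space with a file, so that an ordinary in-memory write survives process death and reboot; true orthogonal persistence would make the entire address space behave this way, with no explicit mapping step and no named backing file.

```python
# A minimal sketch of the Third Law's premise: a region of the process
# address space backed directly by a file, so that ordinary writes to
# "memory" persist across restarts. The file name is an arbitrary
# illustration. Real orthogonal persistence would make ALL memory
# behave this way, transparently.
import mmap
import os

PATH = "persistent.heap"   # illustrative backing file
SIZE = 4096

# Create the backing store on first run.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)    # writes go through to the file
    counter = int.from_bytes(mem[0:8], "little")
    counter += 1                         # an ordinary in-"RAM" mutation
    mem[0:8] = counter.to_bytes(8, "little")
    mem.flush()                          # write-through to non-volatile store
    mem.close()
    print("run number", counter)         # state recovered across restarts
```

Run the script twice and the counter advances: the "RAM" it mutates is, from the program's point of view, simply non-volatile.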
But… not so fast. By all means, open your checkbooks, buy the updates. But before you also buy an icon of Mr. Jobs to kiss, take a moment to think.
As expected, Apple’s promotional material breathes not one word about how these features are implemented. But the fact that they are advertised as separate features is just about a dead giveaway of how they aren’t implemented. That is to say: correctly, as a unified whole; in some way which is not, at its heart, a sham, a cheap trick, a fuffle. But how can we be so certain that we are being fooled?
It is because a system architecture having orthogonal persistence would give you “auto-save” and “resume” for free. Auto-versioning would follow just as readily from a relatively uncomplicated mechanism laid on top of an orthogonally-persistent address space. Apple’s OS update clearly has not removed and replaced the system’s UNIX foundation with something sane, and therefore orthogonal persistence is not to be found in Mac OS 10.7. It follows trivially that Apple’s auto-save and all related features are implemented by means of demanding ever more pervasive use of proprietary document-specific API calls from programmers. There is ample precedent: consider Apple’s much-hyped “reinvention” of parallel programming. It might seem like manna from heaven to a thread-addled C/C++/Java programmer, but compared to even the lamest proposals for dataflow-based architectures it is the innermost circle of hell.
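To see why versioning is an “uncomplicated mechanism” once persistence is solved, consider this toy of my own devising (the name `VersionedCell` is invented for illustration): when all of a document’s state lives behind a single mutation choke point in a persistent space, recording every prior state takes a few lines and demands nothing of the application programmer. No per-application API calls, no per-document ceremony.

```python
# A toy sketch: versioning as a thin layer over a single mutation
# choke point. In an orthogonally-persistent system, the history
# list would itself survive reboots for free.
class VersionedCell:
    """A value cell that transparently records every past state."""
    def __init__(self, value):
        self._history = [value]

    @property
    def value(self):
        return self._history[-1]

    @value.setter
    def value(self, new):
        # Auto-versioning: the application just assigns; the
        # snapshot happens at the choke point, not in app code.
        self._history.append(new)

    def rollback(self, n):
        """Return the state as of version n."""
        return self._history[n]

doc = VersionedCell("draft 1")
doc.value = "draft 2"
doc.value = "final"
assert doc.rollback(0) == "draft 1"
assert doc.value == "final"
```

The point is not that this toy is production machinery, but that the hard part of “versioning” is the persistent substrate, which is exactly the part Apple did not build.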
Persistence implemented correctly is an architectural simplification, rather than yet another proprietary knob dumped on top of a tall, stinking heap of the same. With true persistence, user and programmer alike are completely relieved of the burden of thinking about RAM volatility. It simply vanishes as a concern. Why, then, won’t Mr. Jobs sell you a computer which has this marvelous property? Is it out of malice, out of a desire to watch you squirm? No: it is because he simply cannot. While Apple remains the undisputed champion of seamless architecture-hopping, moving between competing instruction sets is nothing compared to even the smallest movement between paradigms. And that is precisely what a move to orthogonal persistence would be. The UNIX loader and just about everything connected with the act of manually shuttling data between different forms of storage would have to vanish. The difference between asking developers to port code (what happened after each of Apple’s three CPU swap-outs) and asking developers to junk all code ever written by anyone is, well, a serious one. Don’t expect this kind of suicidal courage from Apple or from any other commercial entity. Or from any mainstream organization led by respectable people, for that matter.
All you will ever get from Apple is a “Worse Is Better” taxidermic imitation of orthogonal persistence. The same goes for the First Law. As for the others, just forget it. Apple’s products shit on the Fourth, Fifth, Sixth, and Seventh Laws enthusiastically and malignantly. And if you think that this is merely a story about the antisocial behavior of a large American company, you are not seeing the big picture. Apple’s notions of how to build a personal computing environment are already finding their way into university classrooms. Not only Tetris-playing accountants, but now so-called academics are eagerly sucking them up. In the classrooms they will be taught as the best, perhaps the only reasonable notions. This is when the sun will truly set on the personal computer’s potential as a civilization-level game changer.
Foundations matter. Always and forever. Regardless of domain. Even if you meticulously plug all abstraction leaks, the lowest-level concepts on which a system is built will mercilessly limit the heights to which its high-level “payload” can rise. For it is the bedrock abstractions of a system which create its overall flavor. They are the ultimate constraints on the range of thinkable thoughts for designer and user alike. Ideas which flow naturally out of the bedrock abstractions will be thought of as trivial, and will be deemed useful and necessary. Those which do not will be dismissed as impractical frills — or will vanish from the intellectual landscape entirely. Line by line, the electronic shanty town grows. Mere difficulties harden into hard limits. The merely arduous turns into the impossible, and then finally into the unthinkable.
The typical MS Windows user may never read or write a single line of C++, C#, or any other programming language. Nevertheless, Windows is ultimately a product of the way C++ builds environments; Unix, of the way C does; the Lisp Machines, of Lisp’s way. This holds true not because of some yet-undiscovered law of mathematics, but rather due to the limitations of the human mind. No matter who you are, regardless of your intelligence, endurance, motivation, or other qualities, your capabilities are still finite. Thus, a “mental CPU cycle” spent on manual memory management (or on decisions regarding static types, and other drudge work) is one which can no longer be spent on something more interesting. Any given conceptual foundation sets up a kind of current, against which those who build on that foundation swim at their peril. To continue with this analogy in reference to modern computing, what was once a fast-moving stream has now become a high-pressure steel pipe beneath city streets. A sewer main, to be exact.
Bedrock abstraction is destiny. This cruel law of nature applies to all aspects of computing, both above and below the level of the programming language. The Von Neumann Bottleneck is known to many, and is sure to become a dinner-talk phrase among ever less intellectually-inclined programmers as Moore’s Law breathes its last. But it is not the only conceptual flaw in the foundations of the modern computer. The pervasive use of synchronous logic circuits is another, far more serious one. The basic principles of asynchronous digital logic design have been known for more than half a century. Yet engineers continue to be taught only synchronous design, popular CAD tools remain incapable of asynchronous synthesis, and through the familiar vicious chicken-and-egg circle the tradition persists. On the rare occasion when someone bothers to design an asynchronous CPU, it invariably turns out to be a mindless adaptation of the tired old crippled Von Neumann paradigm, sans clock. We can do better than that. But that is a story for another time.
There are those who tell us that any choice from among theoretically-equivalent alternatives is merely a question of taste. These are the people who bring up the Strong Church-Turing Thesis in discussions of programming languages meant for use by humans. They are malicious idiots. The only punishment which could stand a chance at reforming these miscreants into decent people would be a year or two at hard labor. And not just any kind of hard labor: specifically, carrying out long division using Roman numerals. A merciful tyrant would give these wretches the option of a firing squad. Those among these criminals against mathematics who prove unrepentant in their final hours would be asked to prove the Turing-equivalence of a spoon to a shovel as they dig their graves.
The ancient Romans could not know that their number system got in the way of developing reasonably efficient methods of arithmetic calculation, and they knew nothing of the kind of technological paths (e.g. deep-water navigation) which were thus closed to them. Who knows what marvels we are denied, lacking the true computer, the true intelligence amplifier, the true labor-saver? But unlike the Romans, we have some clues regarding the ways in which our foundational concepts are lacking. Let’s consider alternatives. Just… consider them. I ask here for neither respect nor money, but merely that people think. Pretend that you are doing it only for sport, if you must do so to save face. But do think.
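The Roman-numeral jibe can be made concrete. In purely additive notation (IIII rather than IV; a simplification of mine, not a faithful reconstruction of Roman practice), addition reduces to concatenating symbols and then repeatedly contracting groups. There is no positional carry to do the work for you, which is precisely why long division in this notation is torture while in positional notation it is a schoolchild’s exercise.

```python
# A toy of the "bedrock abstraction" argument: arithmetic in additive
# Roman notation is pure symbol-shuffling. Illustration only; real
# Roman notation also had subtractive forms (IV, IX), omitted here.
ORDER = "MDCLXVI"  # symbols from largest to smallest
CONTRACTIONS = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
                ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]

def roman_add(a: str, b: str) -> str:
    # Step 1: "addition" is mere concatenation, sorted largest-first.
    result = "".join(sorted(a + b, key=ORDER.index))
    # Step 2: repeatedly contract runs of symbols into larger ones
    # (five I's make a V, two V's make an X, and so on).
    changed = True
    while changed:
        changed = False
        for group, contraction in CONTRACTIONS:
            if group in result:
                result = result.replace(group, contraction)
                changed = True
    return result

assert roman_add("XIIII", "XVII") == "XXXI"  # 14 + 17 = 31
assert roman_add("VIII", "VII") == "XV"      # 8 + 7 = 15
```

Addition survives the notation; division does not, because the notation encodes no place value for a divisor to walk across. The algorithm you can think of is bounded by the representation you think in.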
If you try to research the meaning of the phrase “orthogonal persistence”, you will drown in an ocean of pseudo-intellectual babble. The original, true idea behind it was that of creating the equivalent of non-volatile RAM, and restricting all of the machine’s user-accessible storage to said RAM. This is something that once stretched the bounds of the technologically-plausible, but is trivially accomplished today — if you enthusiastically piss on backwards-compatibility.
“…that this margin is too narrow to contain.”
Comments re: the Seven Laws which sum up to “who do you think you are, what, a Euclid, you wanker” will be silently deleted unless they contain an honest attempt at deriving a different, more perfect set of such Laws.
For the idiots who would like to paint me as a paranoid lunatic, a reminder: Apple shits on the principles of sane personal computer design not out of a sadistic desire to torment users, or from a lack of awareness of said principles, but simply because doing so is immensely, irresistibly lucrative.
This is not the place to summarize exactly why, but suffice it to say that Apple is:
- In bed with copyright racketeers
- About as interested in building — or even permitting to exist — the cheap, legal, easily-user-programmable personal computer as Boeing or Airbus are in the cheap, legal, and easy-to-fly private airplane. The destruction of HyperCard alone is proof of this.